A yawning chasm: Patterns of neighborhood distress in US metros

There’s a yawning chasm of neighborhood-level economic distress across US metro areas.  While about 1 in 6 US neighborhoods is classed as distressed, some metro areas have large concentrations of distress, while others have almost no distressed neighborhoods at all.  Focusing on groups of contiguous zip codes classified as “distressed” shows that in some metros 20 percent or more of the population lives in a distressed zip code that is part of a cluster of distress, while in other metros less than 1 percent of the region’s population lives in such distressed clusters.  Here are the six large metro areas with the largest and smallest shares of their metro area population living in a cluster of distressed zip codes:

In short, the problem of concentrated neighborhood distress appears to be an order of magnitude greater in some metro areas than others.  The problem of persistent, concentrated poverty continues to plague some large metropolitan areas, while others have a geography of much more evenly distributed economic well-being.

These insights are based on data from a new report from the Economic Innovation Group which illustrates the widely varying patterns of neighborhood distress across the nation.  EIG’s Distressed Communities Index uses a series of seven indicators, including job growth, educational attainment, poverty, housing, income, unemployment and business growth, to rank states, counties, cities and zip codes as either distressed, at-risk, comfortable or prosperous.  Viewed at its finest resolution, zip codes, the index provides a fascinating mosaic of varied economic conditions.

Overall, 52 million Americans—about 16 percent of the population—live in a zip code that is classified as distressed.

About a quarter of the people living in a distressed zip code in the United States, some 12.8 million, live in a zip code that is contiguous to other distressed zip codes in one of the nation’s largest metro areas.

Concentrations of distressed zip codes are disproportionately located in relatively few large metropolitan areas.  Many large metropolitan areas have less than 2 percent of their population living in contiguous distressed zip codes.

Looking past the bewildering jigsaw puzzle.

Viewed at a national level, zip code data looks like a challenging jigsaw puzzle.  The one pattern that emerges consistently is the profoundly rural and regional character of zip code level distress.  Most distressed places are rural, rather than metropolitan, and the entire southern part of the country, from New Mexico to the Atlantic, is dominated by rural distressed zip codes.  Metro areas, in contrast, tend to be prosperous or comfortable, and the metropolitan, bi-coastal character of prosperity is apparent if you look closely.

Where are distressed neighborhoods concentrated?

A critical question is how prosperity is distributed within metro areas.  The great value of the EIG website is that you can quickly zoom into a map of a particular metropolitan area and see the geography of distress and poverty.  Looking closely at the pattern of zip codes, some striking patterns emerge.  In most large metros, distressed zip codes tend to be found in the center of the region, and most distressed zip codes are contiguous—that is, there are clusters of distressed neighborhoods.  Some metro areas have large numbers of distressed neighborhoods clustered in their cores; others have far fewer.  Here are maps of several metropolitan areas, with zip codes classified as “distressed” colored dark red.  First, Cleveland, a long-suffering rust-belt city, has one of the largest clusters of distressed neighborhoods, which dominates most of its central zip codes.  Nearly 500,000 people live in this contiguous group of 21 zip codes, about 23 percent of the metropolitan area population.

Source: Economic Innovation Group, Distressed Community Index website

Distressed neighborhoods occur in growing sunbelt metros as well, as this map of Houston makes clear.  Much of the region’s east side is classified as distressed.  About 12 percent of Houston’s metro population, roughly 865,000 people, live in this group of 29 contiguous “distressed” zip codes.  (Note that we don’t count a second, smaller, separate cluster of distressed zip codes on the city’s southwest side.)

Source: Economic Innovation Group, Distressed Community Index website

Some metro areas have only a few, isolated pockets of distressed zip codes, according to the EIG metrics.  The Denver and Salt Lake City metro areas have no distressed zip codes, per EIG.  The Portland metropolitan area has two distressed zip codes, with most of the rest of the region classified as comfortable or prosperous.

Source: Economic Innovation Group, Distressed Community Index website

 

Which metro areas have the most concentrated neighborhood distress?

To look at this more systematically, we counted the number of contiguous, distressed zip codes in each of the nation’s largest metro areas.  For each metro, we computed the number of people living in these distressed zip codes, and the total share of the metro area population living in one of those contiguous distressed zip codes.  Overall, 12.8 million people, about a quarter of the people living in a distressed zip code in the United States live in a zip code that is contiguous to other distressed zip codes in one of the nation’s largest metro areas.
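The cluster tallies described above amount to finding connected components in a zip-code adjacency graph. Here is a minimal sketch of that idea using breadth-first search; the zip codes, adjacency list, and populations below are entirely hypothetical, and this is not EIG’s or City Observatory’s actual code.

```python
from collections import deque

# Hypothetical inputs: which zip codes are distressed, and which zips border each other.
distressed = {"97201", "97202", "97203", "97301"}
adjacent = {
    "97201": ["97202", "97210"],
    "97202": ["97201", "97203"],
    "97203": ["97202"],
    "97210": ["97201"],
    "97301": [],  # distressed, but isolated from the others
}
population = {"97201": 18000, "97202": 22000, "97203": 15000, "97301": 30000}

def distressed_clusters(distressed, adjacent):
    """Group distressed zips into clusters of contiguous distressed zips (BFS)."""
    seen, clusters = set(), []
    for zip_code in distressed:
        if zip_code in seen:
            continue
        cluster, queue = set(), deque([zip_code])
        seen.add(zip_code)
        while queue:
            z = queue.popleft()
            cluster.add(z)
            for nbr in adjacent.get(z, []):
                if nbr in distressed and nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        clusters.append(cluster)
    return clusters

clusters = distressed_clusters(distressed, adjacent)
# The largest cluster (by population) is the unit tallied in the text above.
largest = max(clusters, key=lambda c: sum(population[z] for z in c))
print(len(clusters), sorted(largest))  # 2 ['97201', '97202', '97203']
```

With real data, the share reported in the text would then be the largest cluster’s population divided by the metro total.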

Among metropolitan areas with a million or more population, the median metropolitan area has about 5 percent of its population living in its largest cluster of distressed zip codes.  But there’s very wide variation across metropolitan areas.  Memphis and Cleveland, for example, have more than five times as large a share of their metro populations living in a cluster of concentrated distress.  In contrast, the bottom quartile of all metro areas have fewer than 2 percent of their regional populations living in clusters of zip codes experiencing distress.

 

A national picture of concentrated neighborhood distress.  We’ve taken the same data shown in the figure above, and presented it in mapped form to show the pattern of variation in concentrated neighborhood distress across U.S. metropolitan areas.  In this map, the size of each circle corresponds to the number of persons living in contiguous distressed zip codes in each metropolitan area; the color of each circle corresponds to the share of the region’s population living in contiguous distressed zip codes (with green being fewest, and yellow/red most).  Mousing over the map symbols shows the number of zip codes in each contiguous cluster, and the number of persons and share of the metropolitan population living in each cluster.

There are clear geographic patterns here:  Rustbelt cities tend to have much larger shares of their population living in contiguous distressed zip codes.  Most Western metros have relatively small shares of their population living in such neighborhoods.  The pattern in the US South is varied.  And many fast-growing metros have higher proportions of concentrated distressed zip codes:  Oklahoma City, Houston, Las Vegas and San Antonio all have two to three times as large a share of their population living in contiguous distressed zip codes as the median large metropolitan area.

There are strong reasons to focus on these clusters:  there’s a powerful body of evidence that concentrated poverty (living in a neighborhood where a high fraction of your neighbors are poor) has more devastating effects than poverty alone.  A single, isolated zip code that gets classified as “distressed” is a far less intractable problem than a contiguous cluster of 10 or 20 such zip codes.

About the EIG Distressed Communities Index

The methodology of the EIG Index is explained on its website.  The index consists of seven different indicators, which represent a mix of socio-economic and housing indicators.

Our thanks to EIG for publishing their zip code level maps in a publicly accessible format.  The conclusions presented in this commentary are those of City Observatory, and not necessarily EIG.

 

Three big flaws in ODOT’s Highway Cost Allocation Study

There are good reasons to be dubious of claims that trucks are being over-charged for the use of Oregon roads.

The imbalance between cars and trucks seems to stem largely from the Oregon Department of Transportation’s decision to slash maintenance and preservation, and spend more widening highways.

ODOT could largely fix this “imbalance” by spending more fixing roads, and less on widening them.

ODOT has illegally included federal funds in its cost allocation study; the state’s law and constitution apply only to state funds.

ODOT has gone out of its way to scapegoat bike and pedestrian projects, which are mostly paid for with federal funds that aren’t even properly included in the allocation study.

And the highway cost allocation study leaves out huge social, environmental and fiscal costs that cars and trucks impose on society and on the state.

In January, the Oregon Trucking Association filed suit against the State of Oregon, alleging that trucks were being illegally over-charged for their use of Oregon roads. The lawsuit is based on a provision of Oregon law that requires a “fair” division of road costs between trucks and cars, and an arcane report called the “Highway Cost Allocation Study (HCAS).”

The truckers point to the latest version of this biennial study, which in June concluded that trucks were responsible for 26 percent of highway costs, but contributed 36 percent of state highway revenue.  The difference, about $200 million per year, they argue, represents the amount they are being overcharged.
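The truckers’ arithmetic is simple to reproduce. In this sketch, the revenue total is a hypothetical round number chosen only so the two HCAS percentages yield the reported $200 million figure; it is not an official ODOT total.

```python
# Hypothetical illustration of the truckers' overcharge claim.
total_revenue = 2_000_000_000   # assumed annual state highway revenue (round number, for illustration)
truck_cost_pct = 26             # HCAS: trucks' share of highway costs
truck_revenue_pct = 36          # HCAS: trucks' share of revenue paid

# Overcharge = (revenue share - cost share) applied to the revenue base.
overcharge = (truck_revenue_pct - truck_cost_pct) * total_revenue // 100
print(f"${overcharge:,} per year")  # $200,000,000 per year
```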

There are, however, several huge problems with this study, which neither the truckers nor the Oregon Department of Transportation are willing to acknowledge.

The imbalance is due to spending less to fix roads and more widening them

The first is that the “imbalance” is largely due to ODOT’s decision to spend less on preservation and maintenance (re-paving, filling potholes), which are costs attributable primarily to trucks, and to spend more on expanding Portland area highways (costs attributable mostly to cars).  As we’ve pointed out at City Observatory, ODOT has systematically shortchanged preservation and maintenance to cobble together money for highway expansion projects.  And recently, it threatened to slash snow plowing, even as it racked up massive cost overruns on highway expansion projects.  All this is fueling the “imbalance.”

According to HCAS, almost half of the imbalance is due to a “change in project mix.” The HCAS study finds that different kinds of expenditure are differently attributable to cars and to trucks. Expanding peak hour capacity is attributed mostly to peak hour VMT (overwhelmingly cars).

ODOT has cut spending on pavement maintenance and preservation from 24% of expenditures to 15%.  At the same time, it has doubled the share of its spending on Preliminary Engineering (i.e., planning for a series of freeway widening projects) from 3% to 6%. (ODOT presentation, Slide 17).

What this means is that ODOT could reduce or resolve the cost-responsibility problem by spending more on the highway costs that are attributable to trucks, i.e., repaving roads.

Illegally counting federal funds

A second problem is that the HCAS illegally mingles federal funds with state funds in its calculations.  The HCAS law applies ONLY to state revenues, but ODOT has chosen to count federal funds as well.  There is no legal reason why federal funds spent on bike/ped projects, for example, should affect the HCAS allocation, but ODOT pretends they do.  And for every single project, ODOT knows to the penny how much federal money it spent.

ODOT illegally includes federal funds in the HCAS.  Federal funds spent on bike, transit and pedestrian projects get counted as influencing the cost allocation of state funds, which is not consistent with the Oregon Constitution or Oregon law.  ODOT should exclude all federal funds from the HCAS.

Nothing in the law or constitution directs or authorizes including federal funds in the HCAS calculations. The HCAS requirement applies exclusively to state authorized road user fees. The Oregon Constitution is clear that this applies only to state taxes and fees:

(3) Revenues described in subsection (1) of this section that are generated by taxes or excises imposed by the state shall be generated in a manner that ensures that the share of revenues paid for the use of light vehicles, including cars, and the share of revenues paid for the use of heavy vehicles, including trucks, is fair and proportionate to the costs incurred for the highway system because of each class of vehicle. The Legislative Assembly shall provide for a biennial review and, if necessary, adjustment, of revenue sources to ensure fairness and proportionality.

Oregon Constitution, Article IX(3a) (Emphasis added)

ODOT is strictly accountable to show that it spends federal funds and state funds only on eligible uses. It legally tracks the use of federal funds for every project separately. There’s no technical reason these can’t be excluded. ODOT includes these funds based on the false claim that federal and state funds are “interchangeable,” something they maintain is impossible in every other context.

State Expenditures. All state expenditures of highway user fee revenues are allocated to vehicle weight classes, as are all state expenditures of federal highway funds (e.g., matching funds). Federal funds are included because they are interchangeable with state user fee revenues. Any differences in the way they are spent are arbitrary and subject to change.

ECONW, 2021-23 Highway Cost Allocation Study, Page 16 (emphasis added)

Any court that takes a close look at the Highway Cost Allocation Study will have to conclude that ODOT has committed a serious error by co-mingling federal funds (which aren’t subject to the HCAS requirement) with state funds (which are).  If the study were revised to include only state funds, which are all the law applies to, there may be no imbalance at all.

ODOT singles out an increase in spending on bike/ped projects as a reason why the mix of projects it funds are shifting the revenue allocation away from trucks.

A second ODOT slide emphasizes spending on bike/ped projects; there is no comparable slide showing the $622 million allocated for the I-205 Abernethy Bridge project, which is overwhelmingly state funding, and which dwarfs the total amount spent on bike and pedestrian projects.  ODOT has made a conscious decision to scapegoat bikes, while it says nothing about the billions of dollars it is spending on highway expansion projects, which are experiencing vast cost overruns.


ODOT is including several almost entirely federally funded bike/ped projects to influence the HCAS. There’s no legal or logical reason why amounts of federal funds spent on bikes, pedestrian facilities or transit should affect the allocation of state funding between cars and trucks.

Plenty of other problems with highway cost allocation

The illegal inclusion of federal funds in the cost allocation study is just the tip of the iceberg of questionable assumptions behind the study. Here are three other issues that should be considered.

First, the study doesn’t fully address the social and environmental costs of roads.  The HCAS looks only at road expenditures, and doesn’t consider all the other costs that cars and trucks impose on society, and on state and local government.  The costs of pollution, crashes, policing, stormwater runoff, climate change and other externalities are not reflected in the study.  The US Congressional Budget Office estimates that heavy trucks cause $57 billion to $128 billion in social and environmental costs per year, several times what they pay in all state and federal user fees.  If we include all the direct and indirect costs associated with roads, both cars and trucks are massively subsidized, and pay far less than their “fair share” toward the cost of roads.

Second, there are many essentially arbitrary assumptions in the HCAS.  Chief among them: the allocation of administration costs.  More than a fifth of all spending is treated as “administration.”  The study chooses to allocate these costs solely on the basis of vehicle miles traveled, which disproportionately attributes them to cars rather than trucks.  Instead, administrative costs could just as reasonably be allocated in proportion to all other costs in the study (this would be essentially neutral between cars and trucks: it would say that administration doesn’t influence the cost allocation).
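To see how much this modeling choice matters, here is a minimal sketch comparing the two allocation rules. The shares used are hypothetical round numbers, not figures from the HCAS; the point is only that the rule, not the underlying costs, drives the split.

```python
# Hypothetical figures to illustrate the two allocation rules.
admin_cost = 100.0                                 # administration spending to be allocated
vmt_share = {"cars": 0.90, "trucks": 0.10}         # assumed shares of vehicle miles traveled
other_cost_share = {"cars": 0.70, "trucks": 0.30}  # assumed shares of all non-admin costs

# Rule 1 (the HCAS choice): allocate admin by VMT -- which assigns it mostly to cars.
by_vmt = {k: admin_cost * s for k, s in vmt_share.items()}

# Rule 2 (the alternative): allocate admin in proportion to every other cost,
# so administration is neutral between the vehicle classes.
by_other = {k: admin_cost * s for k, s in other_cost_share.items()}

print(round(by_vmt["trucks"]), round(by_other["trucks"]))  # 10 30
```

Under these assumed shares, the same $100 of administration puts three times as much cost responsibility on trucks under the proportional rule as under the VMT rule.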

Third, the HCAS isn’t really a cost allocation study; it’s an expenditure allocation study.  The highway system incurs costs, particularly for deferred maintenance, whether it pays them in any given year or not.  Arguably, the imbalance in the current study reflects ODOT’s failure to deal with its substantial maintenance backlog.  It has cut spending on fixing roads, and spent more on expanding them—but these costs have not been avoided; they’ve merely been postponed, and likely increased, as a result.  A true “cost allocation” study would make an allowance for the annual depreciation and maintenance costs that the system incurs, whether or not ODOT spends money on these items in any particular year.  Ironically, this is exactly the approach that ODOT is employing to set its budget for tolled roadways: setting aside funds from tolls to cover future capital replacement, repair, and operating costs.  ODOT has shown it can easily apply this basic accounting concept to highways.

Bike Portland Interviews Joe Cortright on the Highway Cost Allocation Study

Jonathan Maus of Bike Portland interviewed Joe Cortright about the highway cost allocation study here.

Bad data: Not a decline in travel

An imagined decline in trip-making is the result of bad data analysis

USDOT counted fewer trips in 2022, because it used a different, and less reliable survey method

USDOT’s social media created a false perception that 2022 data were comparable with earlier years

For all the time we spend talking about transportation, it’s surprising how little good data we have.  The Census Bureau asks a narrow set of questions about our daily journey to work as part of its annual American Community Survey, but this misses the vast majority of travel that is neither to nor from a workplace.

The US Department of Transportation commissions a periodic “National Household Travel Survey” to ask a richer and more detailed set of questions about all kinds of travel.  The NHTS serves as a vital benchmark for understanding travel patterns.

Completed in 2022, this latest survey seems well positioned to help us answer the question of how the Covid-19 pandemic changed our travel patterns. Last week, US DOT tweeted out an infographic offering a key finding from the latest NHTS.

WARNING: You cannot compare the 2017 and 2022 data.

The graphic seems to show a dramatic—even catastrophic—decline in trip making.  The headline is “Decrease in DAILY TRIPS”.  The explanatory text offers “In 2022, household travel decreased compared to 2017.”  The table reports trips fell from 3.4 per person per day to barely 2 per day.

Big, if true.  The message “decrease in trips” is  clearly the take that active transportation blogger Angie Schmitt got from the data, repeating the top-line number from the graphic, and then musing over our growing isolation, in a Substack essay entitled “Nobody’s leaving the house anymore.”

In 2017, Americans used to average more than three trips a day. Going to work and back and then going somewhere else — that’s three trips. And also maybe they went to the store every other day, or a friend’s house. Now, people are taking just a little over 2 trips a day. . . .

As I said, I find this stunning. And let me explain a little more why. This is a HUGE HUGE change in the way people live their lives and their daily activity. I have been involved in sustainable transportation advocacy for over a decade, and the kinds of changes we try to implement — I cannot imagine it having an impact on this scale this quickly.

But Angie was right to describe this kind of change as unimaginable.  Upon closer inspection, the supposed decline in trip-making is likely an illusion from bad data (or, more precisely, bad data inferences).

There’s a problem with the data.  Between 2017 and 2022, the US DOT made some major changes to its data gathering methodology for the NHTS.  First, it shifted to a primarily on-line data gathering method.  Earlier surveys relied more on in-person interviews.  Second, and more importantly, the survey eliminated the “travel diary.”  Earlier versions of the NHTS sent participants a travel diary, in which to contemporaneously record their travel.  The 2022 online survey doesn’t include a diary, and relies on the recollection of participants.

As the technical notes to the study reveal (they appear on the website of the Oak Ridge National Laboratory, which conducted the survey for USDOT), this change almost certainly accounts for much (and perhaps all) of the decline in reported trip-making.  Examining the results of earlier surveys, USDOT concluded that relying solely on recollection (as opposed to contemporaneous diaries) left out about 20 percent of all trips.
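A back-of-the-envelope adjustment shows how far that roughly 20 percent undercount goes toward explaining the apparent decline. This is an illustration, not an official correction: the 2022 per-day figure below is approximate, and the true recall shortfall may differ.

```python
trips_2017 = 3.4           # diary-based trips/person/day (NHTS 2017)
trips_2022_reported = 2.1  # recall-based figure (approximate, for illustration)
recall_undercount = 0.20   # share of trips a recall-only survey misses (~20%, per USDOT)

# If recall captures only 80% of actual trips, gross the 2022 number back up:
trips_2022_adjusted = trips_2022_reported / (1 - recall_undercount)
print(round(trips_2022_adjusted, 2))  # 2.62 -- still below 2017, but a far smaller gap
```

In other words, much of the headline “decline” can be absorbed by the methodology change alone before any real behavioral shift is invoked.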

To be sure, it may well be that trip-making now is different (and lower) than it was pre-pandemic.  It’s just that you can’t use this data to make any reliable statements about the direction or size of the change, because the methodologies are substantially different from year to year.

In Raiders of the Lost Ark, Sallah snatched a poisoned date from the air just before Indiana Jones swallowed it, exclaiming “Bad dates.”  Unfortunately, we weren’t able to snatch this bad data before it got swallowed.

The critical takeaway for any user of this NHTS data has to be that you simply can’t compare the 2017 trip-making data with the 2022 trip-making data.  This shouldn’t be an obscure footnote:  it should be a cigarette-pack warning.  But with its cute infographic, the USDOT did exactly the opposite.  Its tweet approvingly presents the two data sets as directly comparable.  Thanks to this misinformation, we are running into the “BS asymmetry” problem: it takes almost no effort to use social media to create a false (but interesting) finding, and vastly more effort to correct the error.  We hope that US DOT will take steps to correct this mistaken impression.

If it were true that overall trip-making had fallen by about a third, that would have profound implications for the transportation system.  It would, for example, call into question almost every expenditure to expand highway capacity.  It’s a shame that the NHTS data are presented in such a sloppy and inaccurate way by, of all entities, the US Department of Transportation.

None of this is to say that the National Household Travel Survey is useless.  The key point here is that 2022 data can’t be directly compared with earlier data to make plausible assertions about changing travel patterns.  It turns out that’s one of the big issues that we want to know about–because it definitely appears that the pandemic and the advent of work-from-home has produced some fundamental changes in travel patterns.

Down is not up: The truth about traffic, congestion and trucking

A central message of the highway building sales pitch is that traffic is ever-growing and ever worsening, and that we have no choice but to throw more money at expanded capacity.

The Oregon Department of Transportation (ODOT) claims that traffic is ever-rising, congestion is ever-worsening, and we’re always moving more and more trucks.

The reality, as revealed by ODOT’s own statistics is very different:  Post-pandemic, traffic levels are lower than before, time lost to traffic congestion is down almost 40 percent, and fewer trucks are on Oregon’s roads.

This lower level of demand means we don’t need to squander billions on added capacity, as ODOT is proposing.  Instead, measures to reduce or manage demand, like congestion pricing, could give us much faster travel times, at far lower cost.

For decades, highway advocates have described traffic as an ever-worsening menace. That messaging was very much on display at the November 6, 2023 meeting of the Oregon Legislature’s Joint Subcommittee on Transportation Planning, which heard from OregonDOT’s Brendan Finn, who presented a set of slides and made a number of key claims about future trends.  Chief among them, the following:

  • Traffic has already rebounded to pre-pandemic levels
  • Congestion will only get worse
  • Trucking will always increase

None of these claims is, in fact, true, according to data collected and reported by both ODOT and the federal government.

  • Travel is consistently below pre-pandemic levels, and it’s flat even as the economy expands
    • Traffic on I-5 is down 7% below 2019 levels in 2023, and is lower than in 2021
    • Traffic on I-84 is down 3% below 2019 levels in 2023, and is lower than in 2021
  • Time lost to traffic congestion has declined by nearly 40 percent in Portland
    • Congested lane miles are down from 400 in 2019 to 246 in 2021
    • Clark County vehicle hours of delay on SR 14, I-5 and I-205 were down more than 75% in 2022 from 2019 levels
  • Average commute times are down 10 percent
    • Portland area commute times were 26.6 minutes in 2019
    • They are now 24.4 minutes
  • Trucking is lower than 20 years ago, and is declining
    • Truck movements across the Columbia River are down nearly 20% from 2005
    • Truck miles in Oregon are down 2.6% from 2019 levels, and down 3.9% for large trucks
    • Truck miles are expected to decline further (HCAS) and are trending below pre-Covid levels, something the ODOT economist admits he can’t explain.

The reality is, now nearly three years after the peak of the pandemic’s effect on daily travel, travel has not returned to pre-Covid patterns or levels.  Work at home has persisted.  Even though the Oregon economy has more than fully recovered the jobs and income lost in the pandemic recession, travel levels and patterns are different and lower than prior to the pandemic.  Work-at-home, at least a few days a week has become a “new normal” for a significant fraction of the Oregon workforce. Likewise, the Oregon economy (and national and global economies) have worked through the supply chain disruptions that plagued the transportation sector during, and just after the pandemic.

In short, claims that traffic is increasingly inexorably, that congestion is steadily getting worse, and we need more room for ever more trucks are flat out wrong.

ODOT’s fear-mongering predictions don’t really inform the policy debate about transportation.  They don’t explain why we have transportation issues, and what can be done to solve them.  They don’t shed any light, they only aim to generate heat.  The reason for these dire forecasts about traffic, congestion and trucking is to serve as a sales pitch for giving ODOT billions of dollars for road building.

ODOT is proposing to undertake a series of incredibly expensive highway widening projects, even as there appears to be a fundamental shift in travel patterns that undercuts the rationale for these projects.  ODOT is living in a world where Covid never happened, where work-from-home never happened, and where the long-term decline in per-capita driving in Oregon never happened.  It is proposing three giant projects, each of which costs more than $1 billion per mile: the IBR (5 miles, $7.5 billion); the Rose Quarter (1.5 miles, $1.9 billion); and the Abernethy Bridge (0.5 miles, $622 million).  And it’s proposing to go deeply into debt to pay for each of these projects.

Traffic Levels are down, and staying down

What’s inarguable is that traffic levels are down in Oregon.  The key question is whether these declines will persist in the post-pandemic era.  ODOT’s traffic counting staff have prepared a special report on exactly that subject, which shows that traffic is not rebounding to pre-pandemic levels even three years after the height of the Covid shutdowns.  ODOT’s own traffic counting expert, Rebecca Knudsen, reported in July 2023 that traffic levels on I-5 and I-84 in the Portland area were still below pre-pandemic levels in 2023, and were not increasing above 2022 or 2021 levels.  In a document entitled Pandemic Impacts on Future Transportation Planning: Implications for Long Range Travel Forecasts, Knudsen reported:

Source: OregonDOT, July 2023

This pattern holds for Portland’s busiest highways.  Traffic volumes on I-84 in the Portland region are currently about 5% lower than in 2019 overall, and weekday volumes about 3% lower; traffic volumes on I-5 in the Portland region are currently about 6% lower than in 2019 overall, and weekday volumes about 7% lower.

Traffic congestion is down, and staying down.

ODOT periodically prepares a statewide “congestion report.”  Its latest report, released earlier this year, confirms that congestion in Oregon isn’t growing.  In fact, it’s declined significantly, even as the state economy has grown rapidly following the pandemic.

 

Table 9 reports 2021 data presented in Figures 17 and 18, but also includes 2019 data to illustrate the difference pre- and post-pandemic. Statewide congested lane miles were 35% lower in 2021 than 2019. The largest decline was in the Willamette Valley. Portland Metro had a 38% decrease in congested lane miles, shifting from 400 congested lane miles in 2019 to 246 in 2021. (Page 27, emphasis added.)

 

ODOT has a “key performance measure” for statewide traffic performance that it provides routinely to the Oregon Transportation Commission, and which is supposed to guide policy.  This measure shows that statewide traffic congestion was down 35 percent in 2021 compared to pre-Covid (2019) levels.  ODOT is doing dramatically better than its goal:  the number of congested lane miles statewide declined from nearly 500 in 2019 to 322 in 2021.

Oregon’s Portland area data is confirmed by similar data gathered by the Washington State Department of Transportation.  WSDOT’s Mobility Dashboard reports that traffic congestion is down sharply in Clark County with a persistent and sustained decline in congestion-related travel delays.  According to WSDOT data, total vehicle hours of delay in Clark County’s three principal roadways  are down more than 75 percent from pre-Covid (2019) levels.

Vehicle Delay, 1000s of hours (Clark County)

          I-5        I-205      SR 14      All
2019      104,350    107,276    11,135     222,761
2022      21,542     28,170     2,409      52,121
Change    -79%       -74%       -78%       -77%

Source:  WSDOT, Mobility Dashboard
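The percentage declines in the WSDOT table can be verified directly from the hour totals, as this short check shows:

```python
# Clark County vehicle hours of delay, from WSDOT's Mobility Dashboard (table above).
delay_2019 = {"I-5": 104_350, "I-205": 107_276, "SR 14": 11_135}
delay_2022 = {"I-5": 21_542, "I-205": 28_170, "SR 14": 2_409}

def pct_change(before, after):
    """Percent change from before to after, rounded to the nearest whole percent."""
    return round(100 * (after - before) / before)

changes = {road: pct_change(delay_2019[road], delay_2022[road]) for road in delay_2019}
total_change = pct_change(sum(delay_2019.values()), sum(delay_2022.values()))
print(changes, total_change)  # {'I-5': -79, 'I-205': -74, 'SR 14': -78} -77
```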

Commute times are down 10 percent from pre-Pandemic Levels

Lower levels of traffic and less congestion are showing up in reduced commute times for all workers.  Census data show that the average commute in the Portland metropolitan area is about 2.2 minutes faster each way than it was before the pandemic:  in 2019, the average resident spent 26.6 minutes traveling to work (one way); in 2021, 24.4 minutes. (Census Bureau, American Community Survey).  These data actually understate total time savings, because they represent travel times only for workers who still regularly work outside the home; they do not include the time savings for those who work at home.

Truck freight is lower than twenty years ago and is declining

Highway boosters love to assert that unless we expand highways, our economy will somehow grind to a halt.  But the truth is that in the Internet era, with an economy shifting to smaller and lighter products and more services, and with ever greater efficiency in production and distribution, truck freight movements have peaked and are declining.  Oregon’s economic growth, in particular, has de-coupled from goods movement.  ODOT’s testimony claims that truck freight in the Portland area will increase 57 percent in the next two decades, and implies that we need to expand highway capacity to match.

ODOT actually has detailed data on truck freight movement in Oregon—data that wasn’t presented at the November 6 hearing.  Those data show that statewide, truck freight is down 2.6 percent from pre-pandemic levels and is expected to decrease further.  ODOT’s corridor-level data show the number of trucks crossing the Columbia River today is down more than 20 percent from 2007 levels.  ODOT’s just-released revenue forecast shows a decline in freight movement, contrary to its earlier forecasts that truck freight would continually increase.  One reason the recent Highway Cost Allocation Study (HCAS) shows trucks “overpaid” their share of highway expenses is that truck freight has grown much more slowly than ODOT projected.

ODOT’s statewide congestion report tracks truck mileage by weight class.  The latest report concludes that truck VMT in Oregon is down below 2019 levels, and is down more sharply for large trucks.

Truck Vehicle Miles, 2019 and 2022
Class       2019     2022    Change
Medium       958      961     +0.3%
Large      2,213    2,126     -3.9%
Total      3,171    3,087     -2.6%
Source:  ODOT, 2022 Statewide Congestion Overview
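The percentage changes in ODOT’s table can be recomputed from the mileage figures (as transcribed above; the report does not state the units alongside these totals):

```python
# Recompute the percent changes in ODOT's truck VMT table, 2019 vs 2022.
vmt = {"Medium": (958, 961), "Large": (2_213, 2_126)}
t2019 = sum(v[0] for v in vmt.values())   # 3,171
t2022 = sum(v[1] for v in vmt.values())   # 3,087
for cls, (a, b) in vmt.items():
    print(f"{cls}: {(b - a) / a * 100:+.1f}%")
print(f"Total: {(t2022 - t2019) / t2019 * 100:+.1f}%")
```

The result matches the table: medium trucks up 0.3 percent, large trucks down 3.9 percent, and total truck VMT down 2.6 percent.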

What is true statewide is also true in major corridors.  One key corridor is I-5 and I-205 across the Columbia River.  ODOT’s vehicle count data show that the number of trucks crossing the Columbia River is down almost 20 percent since 2006.

ODOT’s forecasts of truck freight have been consistently and wildly overstated

ODOT has long predicted steady increases in trucking.  And it has long been wrong. In 2011, ODOT adopted the “Oregon Freight Plan”. It called for the volume of truck freight to increase 96 percent in 25 years, between 2010 and 2035.  This amounts to an annual rate of increase of about 2.7 percent per year.  We are now more than halfway through that period, and truck freight has gone down; between 2007 and 2022, truck freight volumes declined at an average annual rate of 0.2 percent per year.
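The compound growth rates implied by these figures can be checked directly:

```python
# Compound annual growth rate implied by cumulative growth over a period.
def cagr(total_growth: float, years: int) -> float:
    return (1 + total_growth) ** (1 / years) - 1

# The 2011 Oregon Freight Plan's 96 percent growth over 25 years:
print(f"Freight Plan forecast: {cagr(0.96, 25) * 100:.1f}%/yr")   # about 2.7%/yr

# At the observed -0.2%/yr, 15 years (2007-2022) compounds to roughly
# a 3 percent total decline:
actual_total = (1 - 0.002) ** 15 - 1
print(f"Observed 2007-2022 cumulative: {actual_total * 100:.1f}%")
```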

ODOT’s latest financial reports concede that truck traffic is both below expectations and expected to decline further.  Both ODOT’s biennial Highway Cost Allocation Study (HCAS) and its October 2023 Transportation Revenue Forecast point to a decline in truck travel in Oregon.  ODOT’s own economist admits its modeling failed to accurately predict the decline in trucking, which is now below its pre-Covid forecast.

ODOT’s latest transportation revenue forecast reports a dramatic decline in weight-mile transactions.  ODOT’s April 2023 forecast predicted that weight mile transactions would exceed 4.6 million (orange line), instead they’ve fallen to 4.3 million (blue line), and are now below the department’s pre-pandemic (2019) forecast (green line).

ODOT’s economist is at a loss to explain this decline:

Unfortunately, the prior forecast and our current forecast model do not adequately predict the drop we are currently seeing in the weight-mile transactions. This is concerning, as the forecast model has traditionally done a good job of predicting trucking activity. The last two quarters since the prior forecast have shown a significant drop, bringing us back in line with our pre-pandemic forecast.
(emphasis added)

This clear trend puts the lie to inflated forecasts claiming truck traffic would always increase.  In 2010, Metro predicted that truck freight in the Portland area would double by 2035.  In fact, as we’ve seen, truck freight has actually declined.

Why this matters

The alarmist warnings that an inexorable tide of traffic will condemn us to permanent traffic congestion are standard fare from highway boosters.  The “predict and provide” approach is a way of rationalizing capacity increases as a way to avoid congestion.  But added capacity has invariably simply generated more travel, more sprawling development patterns and more costly, congested roadways.  The process is so well documented by study after study that it’s called the “Fundamental Law of Road Congestion.”

State highway departments invariably overstate future travel growth and congestion to sell highways.  The State Smart Transportation Institute has cataloged this behavior, which has been going on for decades.  No matter what the actual trend is in transportation, highway agencies predict “hockey stick” growth trends.

What recent Oregon data shows is that traffic increases aren’t inexorable or unavoidable.  Since the pandemic, we’ve fundamentally changed our travel patterns.  If ODOT officials were serious about reducing congestion, they’d be working to understand what about the past few years has enabled the reduction in traffic and congestion.  Clearly, it has not been because of expanded capacity.  Instead, the experience of the past few years shows the efficacy of managing demand.  We saw big reductions in traffic congestion when more people started working at home—and those reductions and travel time savings have persisted.  While initially it was due to the pandemic restrictions, workers and firms have embraced greater flexibility, and work from home is the new normal for a significant segment of the workforce.  The underlying point is that measures that reduce or manage travel demand are the most effective means of reducing congestion.

What that should signal to ODOT and Oregon policy makers is the urgency of adopting a comprehensive congestion pricing system for the Portland area.  Congestion pricing would directly manage demand.  And critically, congestion pricing would enable us to greatly reduce congestion and improve travel times without spending billions on expanding highway capacity.  ODOT’s own analysis of the I-5 Rose Quarter project showed that congestion pricing would be more effective in eliminating congestion and reducing traffic and pollution than spending $1.9 billion widening the roadway.  In addition, congestion pricing makes urban freeways work better and carry more traffic:  by keeping traffic flowing smoothly, pricing avoids traffic jams that actually reduce capacity.

Highway departments like ODOT view congestion data as a gimmick to sell roadway expansions, nothing more.  When states actually chalk up big reductions in congestion (invariably, not because they’ve built more highways), there are crickets.  OregonDOT and its peers are uninterested in learning from places that actually reduce congestion.  When data provider Tom-Tom reported that Portland recorded the biggest decrease in congestion of any US metro area in 2019, ODOT said . . . nothing.  At the time, we observed:

There’s a calculated asymmetry here: You can bet that if Portland had the biggest increase in congestion per Tom-Tom, it would be a front page story on the Oregonian, and a regularly repeated talking point by the Oregon Department of Transportation. If you’re a highway engineer, or traffic reporter, drawing attention to the terrible (and worsening) nature of congestion is a big part of the way you justify your existence.  But good news, it seems, is no news.  If there were any science or objectivity here, you’d think that the media would be celebrating this success (and praising the policies that led to it), and that the transportation agency would be looking to do more of whatever it was that made the congestion numbers improve.

The reason for this asymmetry, as we’ve suggested before at City Observatory, is that for all their bloviating to the contrary, highway departments really don’t care about reducing traffic congestion. Traffic congestion statistics and rankings are simply convenient public relations fodder for selling the next big highway construction project. If they were serious about reducing traffic congestion, these highway engineers would have looked seriously at the big declines in traffic congestion in the early part of this decade (thanks to higher gas prices), and the decline in traffic generated by tolling congested roads, like I-65 in Louisville, and moved aggressively to implement congestion pricing, which is the only strategy that’s been shown to be effective.  But building things, not solving traffic problems, is really their priority.

 

Lying about climate: A 5 million mile a day discrepancy

Metro’s Regional Transportation Plan (RTP) claims it will meet state and regional climate objectives by slashing vehicle travel more than 30 percent per person between now and 2045.

Meanwhile, its transportation plan actually calls for a decrease in average travel of less than 1 percent per person.  Because population is expected to increase, so too will driving.

Rather than reducing driving, and associated greenhouse gas emissions, Metro’s RTP calls for accommodating more than 5 million additional miles of driving a day—a 20 percent increase from current levels.

The RTP climate strategy asserts the Portland area will drive 20 million miles a day and meet our greenhouse gas reduction goals.  But Metro’s transportation modeling shows the RTP is planning for a system that will lead to 25 million miles per day of driving.

This disconnect between Metro’s climate modeling and the modeling it uses to size the transportation system and make investments violates state climate rules.

The Portland region is a self-styled environmental leader.  Oregon has a legislatively enacted goal to reduce greenhouse gases 75 percent from 1990 levels by 2050.  Metro, the regional government, adopted a “Climate Smart Strategy” in 2014, calling for taking steps to achieve that goal by reducing driving.  A new, federally required (and state regulated) “Regional Transportation Plan” is supposed to spell out how the region will manage its transportation system and spend its limited resources over the next couple of decades to stay on a path to achieve that goal and other regional priorities.

Unfortunately, the Metro region is nowhere close to achieving its climate goal, is actually headed in the wrong direction, and the new Regional Transportation Plan will likely make things worse.  As we previously documented at City Observatory, the RTP’s climate analysis left out the inconvenient fact that Portland area transportation greenhouse gas emissions are actually increasing, rather than decreasing as the plan assumed—indicating that our efforts are failing. In addition, the climate policies in the plan give a pass to a ten-billion-dollar-plus program of freeway expansion that will lead to more driving and more pollution.  That’s bad enough.

But there’s more:  A close look at the technical analysis that is the foundation for the RTP shows that Metro has two completely different sets of “books” for assessing transportation.  When it comes to demonstrating compliance with state climate laws and regulations, Metro has produced a set of projections showing we’ll hold total driving in the Portland area to its current level—in spite of increasing population—by reducing per capita driving by almost a third.  But when it comes to sizing the transportation system—and in particular, justifying investments in added highway capacity—Metro has a second set of books that assumes per capita driving doesn’t change at all, and that as a result we end up driving about 5 million miles more per day in the Portland area than assumed in the climate analysis.  These two estimates are completely contradictory.  They mean that the Regional Transportation Plan doesn’t comply with state climate laws, and that if we actually followed through on our stated climate strategy of holding driving to its current level of about 20 million miles per day, we wouldn’t need to spend anything more on expanding highway capacity.

Under state law and regulations, Metro has an affirmative legal obligation to monitor and report its performance—something it simply hasn’t done.  The state’s land use regulator, the Land Conservation and Development Commission (LCDC), is required to review and approve Metro’s climate work and policy.  LCDC should reject the Metro climate plan and RTP as out of compliance with these state regulations, and send Metro back to the drawing board to produce a transportation plan that is consistent with its professed climate goals and state law.

The key problem here is two sets of books:  An ambitious climate plan that would dramatically reduce average driving (and comply with state regulations), and a second set of books that is a “driving as usual” projection, that’s being used to fuel a highway spending spree.  The difference is 5 million miles a day—and vastly more carbon pollution.

Ambitious climate rhetoric:  We’ll reduce per capita driving 31 percent compared to 2020 levels

Metro’s current RTP purports to put the region on a path to reducing greenhouse gas emissions by making investments in the transportation system that reduce driving.   And when it comes to its climate analysis, the RTP makes a bold claim that the region will cut driving by more than 30 percent from current levels.  The Climate Analysis (Appendix J, page 9) makes this claim:

 

But that’s the climate portion of the plan.

Reality:  We’re going to drive 20 percent more, and per capita driving will decline less than one percent

A separate portion of the report offers Metro’s “system performance measures” for judging the overall operation and success of the region’s transportation system.  Here, the RTP uses its transportation demand model to estimate how much we’ll drive in the future under various scenarios.  These are the numbers that are used to select projects, estimate traffic delays, and guide investments.  And the picture here is very different.  According to this modeling, per capita driving in the Oregon portion of the metropolitan area will decline by just two-tenths of one percent from current levels.  And these performance measures indicate that the RTP investments make almost no difference in reducing driving:  the RTP “constrained” scenario, representing billions of dollars in spending, reduces driving by only one-tenth of a percent more than doing nothing.  Either way, the Metro performance measures suggest almost no change in per capita driving, and as a result, total travel in the region will increase by more than 5 million miles per day—making it that much harder to reach the region’s and the state’s climate objectives.

These data are contained in Appendix I:  Performance Evaluation Documentation

This duplicity is important, because in Appendix J, Metro has concocted an almost entirely fictitious scenario in which the state government imposes very high per-mile fees on driving.  Metro’s climate analysis uses these assumptions to pretend per capita driving will decline sharply.  But the rest of the RTP makes no such assumptions; it plans for a world where we won’t charge drivers much more than they pay today, aside from some tolls, and where we’ll invest in big capacity expansion projects, like the Interstate Bridge and the I-5 Rose Quarter freeway widening.  In reality, as Metro’s performance measures report shows, the region has no intention or expectation of meeting state climate goals, and is going to continue building car infrastructure as if it were 1950, rather than heading off a devastating climate crisis by 2050.

As we pointed out, Metro uses its climate analysis, with its dubious assumptions, to assert that it doesn’t need to worry about the polluting effects of spending billions of dollars expanding highways.  It claims that because we’ll only drive 20 million miles a day, we’ll meet state climate targets, and therefore there’s no need to even examine how much widening roads will increase driving.  But the agency’s own transportation modeling—which it uses to justify these expenditures and select investments—is planning for a world where we drive 25 million miles a day, with arguably 25 percent more pollution, no matter how “green” vehicles are in 2045.
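The arithmetic behind the two sets of books is straightforward; a sketch using the round figures cited above:

```python
# The gap between the RTP's two travel forecasts, in round numbers.
climate_vmt = 20_000_000   # miles/day assumed in the climate analysis (Appendix J)
model_vmt = 25_000_000     # miles/day in the system-performance modeling (Appendix I)
gap = model_vmt - climate_vmt
print(f"Gap: {gap:,} miles/day "
      f"({gap / climate_vmt:.0%} more driving than the climate analysis assumes)")
```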

Make no mistake, Metro planners are really counting on their 25 million mile a day forecast.  They only include the 20 million mile projection as a fig leaf, to be able to assert that they’ll meet climate objectives.

If Metro really believed its climate forecasts, and planned accordingly, it would create a plan that provided for no increase in total driving in the region above today’s levels.  But they clearly have no intention of planning for such an outcome.  They—and the Oregon Department of Transportation—are pushing forecasts claiming we’ll drive vastly more miles and that congestion will only get worse, unless we do something—in this case, spend billions on expanded highways.

Having two completely inconsistent travel forecasts–really two sets of books–is effectively perpetrating a climate fraud.

Metro is failing to comply with state law requiring it to show it is making progress

Metro has had a climate plan for nearly a decade.  It adopted its Climate Smart Strategy in 2014, and at the time, as an integral part of that plan, pledged to monitor progress—i.e., whether its efforts were leading to the needed reduction in greenhouse gases.  Since then, the Land Conservation and Development Commission has adopted further rules that direct Metro to plan to achieve statewide climate goals and, again, to periodically report on its progress.

OAR 660-044-0060

Monitoring
(1) Metro shall prepare a report monitoring progress in implementing the preferred scenario including status of performance measures and performance targets adopted as part of the preferred scenario as part of regular updates to the Regional Transportation Plan and preparation of Urban Growth Reports.
(2) Metro’s report shall assess whether the region is making satisfactory progress in implementing the preferred scenario; identify reasons for lack of progress, and identify possible corrective actions to make satisfactory progress. Metro may update and revise the preferred scenario as necessary to ensure that performance targets are being met.
(3) The commission shall review the report and shall either find Metro is making satisfactory progress or provide recommendations for corrective actions to be considered or implemented by Metro prior to or as part of the next update of the preferred scenario.

Metro’s Regional Transportation Plan fails to demonstrate whether the region is making progress, and makes no effort to say that it is making “satisfactory progress.”  In fact, emissions inventories show that actual greenhouse gas emissions from transportation have increased by between 1.4 percent and 5 percent per year since 2014.
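To see how quickly those annual increases compound, here is a small illustration (the eight-year horizon is our assumption for illustration, not a figure from the inventories):

```python
# Cumulative effect of the 1.4 to 5 percent per year emissions growth
# cited above, compounded over an illustrative eight-year span.
for rate in (0.014, 0.05):
    cumulative = (1 + rate) ** 8 - 1
    print(f"{rate:.1%}/yr over 8 years -> {cumulative:.0%} total increase")
```

Even the low end of the range implies a double-digit cumulative increase, when the plan calls for emissions to be falling.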

When presented with these facts, Metro’s only response is kicking the can down the road—saying it will revisit this entire subject in its next Regional Transportation Plan (to be adopted in 2028).  That fails to comply with OAR 660-044-0060, which requires the progress report to gauge progress as of now.

Instead of acknowledging the failure of current actions, and proposing stronger and more effective policies, Metro has simply chosen to embrace a new set of assumptions that we’ll make even faster progress by the adoption or enforcement of as yet un-enacted policies in future years.

Metro acknowledges that it is wrong about current GHG trends, but isn’t making any substantive changes to the current Regional Transportation Plan.  Instead, it says it will use the updated assumptions as the basis of “future climate analysis.”  In its response to comments made on the RTP dated October 18, 2023, Metro staff says it will:

2. Update RTP climate assumptions in Chapter 7 and Appendix J to:
a. Describe which state assumptions are required to be used in the RTP climate analysis and why.
b. Document state assumptions in more detail, including a table describing key state assumptions (e.g., vehicle fleet turnover rate, share of SUV/light truck vs. passenger vehicles, share of electric vehicles), as well as current trends with respect to these assumptions and discussions of state policies, programs or other actions the state is taking to support the state assumptions used in the RTP climate analysis.
c. Describe that the region will not meet its targets if the state assumptions used in the analysis are not met, along with the results of the RTP 23+AP scenario, which quantifies how much the region falls short of its targets if the Statewide Transportation Strategy (STS) assumptions are not included in the analysis.
d. Describe current trends in GHG emissions, both in the region and state, and nationally, based on DARTE and other inventory sources.
e. Use the updated assumptions as the basis of future climate analysis.

Part 1 to Exhibit C to Ordinance No. 23-1496
MTAC Recommendation to MPAC on Key Policy Topics, October 18, 2023
(Emphasis added)

These changes to the RTP do not put the document in compliance with OAR 660-044-0060:  They do not include the required status of performance measures; they do not identify whether the region is making “satisfactory progress”—it isn’t: transportation greenhouse gases are increasing when the plan said they would be decreasing—and they do not explain why we’re not making progress or identify actions that would be corrective.  Instead, Metro has, in effect, deferred all of these obligations until the next update of the RTP (scheduled for 2028).  And, notably, Metro is not proposing to do anything to reconcile the conflicting assumptions about future vehicle travel in its environmental analysis (Appendix J) with the 25 percent increase in vehicle travel it says it is planning for in its transportation plan (Appendix I).  As we’ve said:  This is a “Don’t Look Up!” climate plan.

As a result of these failings, the Metro RTP isn’t in compliance with OAR 660-044-0060, nor is it in compliance with Metro’s own adopted Climate Smart Strategy (which similarly pledged to report progress in reducing emissions, and to take additional steps as needed).  As shown above, the RTP has two separate sets of books and actually contemplates a future where total vehicle miles traveled in the Portland area expands by 20 percent—completely inconsistent with achieving climate goals, and exactly the opposite of what Metro asserts in its claims that it is complying with state law.

Rose Quarter’s Killer Ramps

The proposed re-design of the I-5 Rose Quarter Project now includes two deadly hairpin freeway off-ramps.

Just last week, Brandon Coleman was killed at a similar hairpin highway ramp in downtown Portland 

The Oregon Department of Transportation doesn’t really care about safety.

The plan to widen I-5 through the Rose Quarter, at the staggering cost of $1.9 billion, has a newly added safety problem:  a complicated new freeway off-ramp, of the kind that often leads to serious or fatal crashes.

Earlier, we reported how ODOT’s plan for the so-called “Hybrid 3” re-design of the Rose Quarter project called for moving the I-5 southbound off-ramps about half a mile south to N. Williams and Wheeler, and in the process creating a dangerous 210-degree hairpin off-ramp from I-5.  Even ODOT’s own safety analysis noted the off-ramps would cause big trucks to veer across marked traffic lanes, and would increase the number of crashes.

In part because of these safety concerns—and objections from the Portland Trail Blazers (who own the Moda Center arena abutting the proposed off-ramp location)—ODOT has developed yet another re-design of the project.  This one calls for constructing another off-ramp which would make a second hairpin turn, up and over both the Southbound and Northbound lanes of I-5, and joining the existing I-5 Northbound off-ramp at NE Weidler.

As a result, the latest proposed re-design of the I-5 Rose Quarter project proposes not just one, but two hairpin off-ramps.  Rather than improve safety, this new ramp arrangement would likely be even deadlier for those traveling in and through the Rose Quarter.

ODOT’s initial description called this the “anchor” design, because the freeway off-ramp splits in two, with hairpin turns to both the left and right.  Here is an illustration of ODOT’s proposed “Anchor+Wheeler” design.  The two hairpin off-ramps are shown in red.  One hairpin off-ramp turns right and pours traffic exiting the freeway onto N. Wheeler Avenue.  The second hairpin off-ramp turns left, vaults up and over the I-5 freeway mainline, and then circles back north to merge with the existing I-5 northbound off-ramp as it meets N.E. Weidler Street.  (The circular inset picture with the anchor logo shows the exit ramps emerging from under ODOT’s freeway overpass/cover.)

In an earlier commentary, we pointed out the inherent risks of forcing freeway traffic to make a 180-degree (or greater) turn as it exits from a highway (with a design speed of 70 miles per hour) onto local streets with high levels of bicycle and pedestrian use.

We know the combination of sharply curving freeway on- and off-ramps feeding into busy arterial streets is deadly to vulnerable road users.  Just this month, Brandon Coleman was killed in a hit-and-run crash where the Morrison Bridge ramps intersect with S.W. Morrison Street and Naito Parkway.  Here’s the police report:

A pedestrian has died in a Downtown Portland hit and run crash.

Brandon Coleman (Portland Police Bureau)

On Saturday, October 21, 2023 at 4:30a.m., Central Precinct officers responded to a crash at Southwest Naito Parkway and Southwest Morrison Street. When officers and EMS arrived, they found a person, believed to be an adult male, laying on Southwest Naito Parkway at the ramp connected to the Morrison Bridge. He was confirmed deceased at the scene. The involved driver left the scene of the crash and was not immediately located.

Just like the proposed Rose Quarter configuration, this intersection combines a curling, high-speed, low-visibility ramp with local arterial streets and a dangerous pedestrian crossing.  Traffic turning left or right from Naito Parkway makes a tight 180-degree turn onto the Morrison Bridge.

Here’s a Google Streetview image of the intersection where Brandon Coleman was killed.

 

Just like the proposed Rose Quarter project, the Morrison Bridge has two hairpin ramps intersecting with busy city streets.

 

As we’ve pointed out, ODOT has cynically and falsely portrayed the I-5 Rose Quarter freeway widening as a “safety” project, claiming (again falsely) that it’s the “#1 crash location in Oregon.”  Its latest proposed re-designs actually make the area much more dangerous, both for those traveling in vehicles, and especially for people traveling on foot and by bike.  The pair of 180-degree hairpin off-ramps proposed for I-5 southbound funnel high-speed traffic exiting the freeway right into arterial streets that carry high volumes of people walking and cycling.  They’re recreating exactly the same fatal design error at the Morrison Bridge ramps that led to the death of Brandon Coleman.

Doubling down on climate fraud in Metro’s RTP

Earlier, we branded Portland Metro’s proposed Regional Transportation Plan (RTP) a climate fraud because it falsely claimed the region was reducing greenhouse gases, and falsely claimed its transportation investments were on track to meet adopted state climate goals. Metro’s staff has responded to these critiques, but proposes only to fix these mistakes at some vague future time and, more importantly, makes absolutely no substantive policy or investment changes to the RTP.
In essence, the staff response puts the lie to the claim that climate/GHG reductions are the “controlling measure” in RTP system planning.  Whether Metro is on track to achieve its committed GHG reductions or not has no bearing on any of the substantive policy and spending decisions in the RTP.
Moreover, this is a straightforward violation of the policies enacted in Metro’s 2014 Climate Smart Strategy (and reiterated in the 2018 RTP and the current RTP draft) to continuously monitor progress in GHG reduction and undertake additional measures if the region is not making adequate progress.
Metro staff treats GHG estimates as a perfunctory and irrelevant codicil to the RTP, and continues to rely on fanciful and speculative assumptions about the future vehicle fleet and an extraordinarily aggressive state pricing policy, to allow it to simply ignore taking any further steps to reduce GHG, or stop widening highways.

City Observatory recently published two detailed commentaries examining the climate analysis and policies contained in Portland Metro’s proposed Regional Transportation Plan (RTP).
The RTP claims that the region is “on track” to reduce greenhouse gas emissions as required by state law, and that it can afford to spend more than ten billion dollars on highway widening projects and meet its stated climate objectives.
In reality, transportation greenhouse gases in Portland are increasing, not decreasing, illustrating the failure of existing climate policies; the models Metro uses to estimate future GHG reductions are flawed (based on demonstrably incorrect vehicle age and fleet composition assumptions), and the policies in the RTP do nothing to prioritize policies and investments that would actually reduce greenhouse gas emissions.

As part of its review process, Metro staff has prepared its rejoinder to these comments.  In this commentary, we show that Metro’s response ignores many of City Observatory’s comments, mis-states others, and proposes changes that do nothing to correct the deficiencies we identified.

For clarity:  both state law and adopted regional plans call for a significant reduction in greenhouse gas emissions.  Oregon law (ORS 468A.205) and Metro’s adopted 2014 Climate Smart Strategy call for a 75 percent reduction in greenhouse gas emissions from 1990 levels by 2050.  The RTP fails to comply with these policies.

Metro:  “Yes, our climate analysis is wrong, but we’re not going to change our policy or spending”

The analysis contained in the RTP mis-states actual trends in greenhouse gas emissions from transportation (they are increasing, not decreasing), and falsely claims that the region is on track to meet the legislatively adopted goal.  Metro acknowledges City Observatory’s comment that it has failed to include accurate GHG inventory information in the RTP, and that transportation GHGs in Portland are increasing, rather than decreasing (as shown by DARTE, DEQ and Multnomah County inventories). It proposes, at some unspecified future time, to include more accurate information on transportation GHGs.
Specifically Metro says it will “amend” its analysis at an unspecified future date and “discuss the potential impact of these trends on RTP achieving climate targets.” (Metro Staff Response to Comment #210).

Metro’s proposed changes to the RTP, labeled “Climate Tools and Analysis,” make it clear that this will have no effect on the current RTP, and that corrected inventories and trend analysis will be deferred to an unspecified later date:

“use the updated assumptions as the basis of future climate analysis.”

Elsewhere, the RTP concedes that the region is not on track to meet its Vision Zero safety goals.  The RTP needs to have a definitive statement that the region is not on track to meet its climate goals, either.  And while that’s a necessary first step, the RTP must go further.

Admitting error, but doing nothing to fix it

The issue here is not simply whether the RTP contains correct emissions inventories and trend analyses; the actual issue is that Metro has falsely portrayed progress (actually backsliding) on transportation emissions. The fact that emissions are increasing demonstrates that the measures adopted since the 2014 Climate Smart Strategy are failing, and that much more powerful and effective steps need to be taken to achieve stated Metro and state climate goals. Metro needs to both correct its inventory data and modify its policies and investments to achieve these greenhouse gas goals. It is not sufficient to simply admit we are going rapidly in the wrong direction; Metro needs to change course. Acknowledging that the inventories and trends are wrong, and doing nothing, violates the policy commitment in the CSS to periodically adjust the strategy to reflect actual progress. If the RTP is not on a path to achieving climate targets, then the policies and investments contained in the plan need to be changed.

The proposed changes in the “Climate Tools and Analysis” include no substantive changes to further reduce greenhouse gas emissions from transportation to compensate for the errors and false assumptions in the RTP’s current climate analysis.


As City Observatory pointed out, the climate analysis of the RTP can be summarized briefly as claiming that because the overall RTP is “on track” to meet the 2050 goals, there is no need or obligation either to prioritize projects and investments that reduce GHGs, or to analyze the GHG-increasing impacts of projects, particularly highway expansions. The climate analysis contained in the RTP represents a fraud on the public. Despite labeling greenhouse gas reductions “a controlling measure” in system planning, the RTP fails to achieve adopted state and regional greenhouse gas reduction goals, fails to prioritize expenditures and policies that would put us on a path to reduce greenhouse gas emissions as it has pledged, and fails to strengthen its policies or change its investments in light of its demonstrated failure to achieve promised progress.  The climate analysis is not a “controlling measure” if major flaws in it don’t immediately necessitate a revision to RTP policies and investments.

A double standard for pricing

As City Observatory has pointed out, road pricing (tolls, road user fees and congestion charges) is an essential component of achieving the RTP’s purported GHG reductions.  But Metro has a blatant double standard for pricing: It assumes that the state will achieve dramatic GHG reductions by enacting widespread pricing (a carbon fee, a very high road user fee, nearly universal congestion pricing on throughways), and yet it fails to incorporate the effects of those pricing measures on the need for, or justification of, billions in highway widening projects. Pricing roadways will reduce or eliminate the need for capacity expansion.  For example, Metro ignores City Observatory’s comment noting that ODOT’s own analysis of the Regional Mobility Pricing Project (RMPP) shows pricing would obviate the need for additional lanes in the $1.9 billion I-5 Rose Quarter project.

Metro also fails to analyze the negative greenhouse gas effects of major RTP projects. The blatant double standard in the RTP is obvious in the treatment of major capacity projects (the IBR, Rose Quarter and I-205 widening projects). As Metro has acknowledged, adding lane capacity will induce additional travel, and additional greenhouse gas emissions. The only aspects of these projects which moderate or reduce greenhouse gas emissions are the potential tolls that may be used to pay for them. Yet Metro’s RTP fails to acknowledge that tolling/pricing alone would be more effective in both moderating the growth of traffic and reducing GHGs, and would obviate the need for additional capacity. In that same vein, Metro ignores City Observatory’s comment that ODOT’s own analysis procedures manual denies the existence of induced travel and bars use of the scientifically based induced travel calculator.

Whether the state will actually impose high and widespread road pricing as assumed in the STS and Metro’s climate analysis, or will even impose tolls on, for example, I-205, is still uncertain and speculative. But the RTP commits to building this additional capacity, regardless of whether these GHG- (and traffic-) reducing measures are actually implemented. An honest and accurate GHG analysis would also show what would happen to regional VMT, GHG and congestion if these speculative and uncertain pricing measures aren’t enacted.

In essence, the RTP pretends it will achieve state GHG goals because of the imaginary and speculative pricing policies described in the State Transportation Strategy (STS). In contrast, the RTP provides a regional commitment to spend billions on freeway widening, which—absent pricing—will certainly make our already bad transportation GHG trajectory much worse.

Hiding behind ODOT’s flawed policies and modeling

City Observatory’s analysis documented the significant deficiencies in the ODOT modeling used to predict future transportation greenhouse gas emissions, and also showed the speculative nature of assumed pricing policies. Metro asserts that the RTP complies with state law and the Climate Smart Strategy because it is consistent with the state’s Climate Friendly and Equitable Communities (CFEC) regulations. But:

  1. CFEC compliance does not assure that either the Climate Smart Strategy goals or ORS goals are met.
  2. CFEC rules are the minimum required to comply with state land use laws. The state’s GHG reduction law and Metro’s own Climate Smart Strategy predate and supersede the CFEC rules.
  3. Nothing in CFEC precludes Metro from doing more to reduce transportation GHG emissions; in fact, Metro’s own Climate Smart Strategy independently commits the region to a larger reduction in VMT and greenhouse gases.

It’s clear that Metro staff are trying to shift the blame for their flawed greenhouse gas analysis to their colleagues at the Oregon Department of Transportation, who developed the State Transportation Strategy (STS), and have prepared their own greenhouse gas estimates.

Failure to correct policies and investments is climate fraud

Metro’s Climate Smart Strategy, adopted nearly a decade ago, committed to monitoring progress and taking additional steps if the region fell behind.  Metro pledged to monitor transportation greenhouse gas trends, and:

If the assessment finds the region is deviating significantly from the Climate Smart Strategy performance monitoring target, then Metro will work with local, regional and state partners to consider the revision or replacement of policies, strategies and actions to ensure the region remains on track with meeting adopted targets for reducing greenhouse gas emissions.

Metro has acknowledged that it has overstated progress, but it is proposing no additional regional actions, no changes in policy, and no different investment strategy, despite the demonstrable failure of its current efforts.

Metro staff’s response to comments confirms the toothlessness and irrelevance of the climate commitments in the RTP. The staff admits that the GHG analysis in the RTP is simply wrong—that it ignores trends of increasing transportation greenhouse gases, and that its modeling is based on demonstrably flawed assumptions about fleet turnover and composition—and that this means the region will definitely not meet its climate commitments. But then, in the staff’s view, these errors and failures necessitate no substantive change to the RTP: no imposition of new policies, no shift in investments. The climate analysis does not, in reality, “control” the RTP in any way. Whether the RTP is on track to meet greenhouse gas reduction targets or not matters not at all to the policy substance or investment choices in the RTP. This is simply climate fraud.

ODOT Snow Job: Give us more money, or we’ll stop plowing your roads

Oregon’s Department of Transportation (ODOT) says it doesn’t have enough money to maintain roads, fix potholes or even plow snow.

This is a Big Lie: Mega-projects and their cost-overruns, not maintenance, are the cause of ODOT’s budget woes

ODOT has chosen to slash operations, while funneling hundreds of millions to billion-dollar-a-mile mega-projects and consultants

Plowing is a trivial part of the $3 billion ODOT budget; ODOT has voluntarily chosen to sacrifice plowing and other safety operations

ODOT’s gambit is a cynical and deadly version of the “Washington Monument” strategy:  Give us money or we won’t plow your roads.

ODOT has aggravated this problem by repeatedly diverting operations and maintenance funds to road-widening projects

ODOT is choosing to make roads even more dangerous as Oregon road fatalities have increased 71 percent; it’s violating its own “Vision Zero” and “Safety First” policies.

The Snow Job:  “Budget cuts are forcing us to reduce snow plowing”

Winter is nearly upon us, and the Oregon Department of Transportation has launched a new seasonal budget campaign: it claims it’s too broke to plow state roads this winter, with the not-at-all-subtle message that people need to give ODOT more money.  The agency’s PR machine has generated a raft of media stories uncritically repeating this story line:

ODOT says highways ‘may not be safe’ this winter due to budget cut

Expect less snowplowing of road to Mt. Bachelor, other roads this winter

Fortunately, one media outlet didn’t fall for this contrived message.  KGW-TV’s Pat Dooris has a long-form analysis that asks some basic questions and debunks ODOT’s claims:

In October, the Oregon Department of Transportation began getting the word out that it will not have enough funding to plow or sand roadways over the coming winter to the extent that it has in previous years, blaming a combination of inflation and declining fuel tax revenue. But there is a distinction between the agency’s messaging and the facts. . . .

But the idea that fuel tax revenues have declined is not factually accurate. The Story looked at the numbers behind ODOT’s budget and could not verify that claim.

The Story’s Pat Dooris reached out to ODOT Director Kris Strickler to request an interview, but was told he was not available.

The Big Lie:  Megaproject Cost-overruns, not maintenance, are the cause of ODOT’s budget woes

ODOT has chosen to slash operations, while funneling hundreds of millions to megaprojects and consultants

The trouble is, as Dooris reported, the ODOT message is false:  Snow removal (and other operations) are a minor, nearly trivial part of the ODOT budget, which instead is dominated by giant construction projects that have been so badly mismanaged that their cost-overruns run to billions of dollars.  ODOT’s strategy is to threaten to slash snow plowing and other vital, visible maintenance to build public pressure for greater funding.  And in addition, ODOT’s budget is going up, not down.  As KGW’s Pat Dooris has pointed out:

“it’s not accurate to say that fuel tax revenues have gone down — they are still going up”

As Dooris pointed out, the agency’s own revenue numbers show it has more money for the current fiscal year than previous fiscal years.

What this means is that ODOT is choosing to cut spending on operations and maintenance–and the reason is that it is devoting huge sums to a handful of expensive highway projects in the Portland area.  ODOT knows these projects aren’t popular, and it can’t defend its persistent cost overruns and expensive consultants, so instead it’s threatening to cut vital and popular services like snowplowing, in order to gin up popular support for more funding.

It’s a cynical and deceptive ploy, one that endangers road users.  ODOT is planning to reduce snow plowing on some roads, and not repaint fog lines on the sides of many rural highways.  Cutting these modest expenditures won’t save much money, but it will directly endanger road users.

ODOT’s Budget Problems Are from Squandering Billions on Megaprojects

To be absolutely clear:  the problem with the ODOT budget is not a lack of funds to fix potholes and plow snow, but rather the exploding cost of highway widening megaprojects in the Portland Metropolitan Area.  The maintenance “crisis” is purely a product of ODOT choices to slash funding for re-paving and regular operations, and instead dedicate hundreds of millions of dollars to a handful of expensive highway expansion projects–that are all experiencing dramatic cost-overruns.

  • Item:  The cumulative cost of three Portland mega-projects is nearly $10 billion.  ODOT is prioritizing projects costing more than $1 billion per mile of roadway–the Rose Quarter is $1.9 billion for 1.5 miles; the I-5 Bridge is $7.5 billion for 5 miles; and the I-205 Abernethy Bridge is $622 million for barely a half-mile.
  • Item:  Each of the three largest projects has experienced 100 percent or more cost-overruns.  The cost increases announced in the past year amount to a total of more than $3 billion ($600 million increase for the Rose Quarter, $2.5 billion increase for the IBR, and $370 million increase for the I-205 Abernethy Bridge).
  • Item:  ODOT won’t even say how much will be saved by plowing less—it is at best a few million dollars, and will come mostly from laying off or not hiring ODOT front-line workers.
  • Item:  ODOT says it needs to cut its overall budget by 5 percent, but it has chosen to slash operations by four times as much: 20 percent, while holding mega-project construction harmless (in fact, funding continuing cost-overruns).
  • Item:  The Oregon Legislature gave ODOT $500 million in short-term borrowing authority in 2021.  ODOT has used none of this authority to maintain operations.  Instead, it has used all of it for highway widening projects–and the debt service on these short-term bonds cuts into revenue that could be used for operations.
  • Item:  The Highway Cost Allocation Study revealed that over the past several years, ODOT has systematically slashed spending for pavement preservation (repaving) and operations, and diverted more money to highway widening projects.
  • Item:  ODOT proposes to plow fewer roads, stop painting fog lines on many rural roads, and not fix as many potholes, just as the number of persons dying on Oregon roads has skyrocketed, with road deaths up 71 percent since 2010.
  • Item:  ODOT routinely juggles its books to “find” revenue for highway widening projects.  It diverted $32 million in maintenance funds to Interstate Bridge Replacement project consultants and planning.  It routinely finds “savings” and “unanticipated revenue” and uses them to launch expensive expansion projects, which then experience cost-overruns, instead of using those funds to maintain and fix existing roads.
  • Item:  ODOT proposes to spend $40 to 60 million over the next two years, largely on consultants, to advance the planning for the I-5 Rose Quarter project to the “30 percent” level of design—even though it lacks committed funds to pay for the full $1.9 billion project.
  • Item:  ODOT has spent more money on consultants for its highway widenings—over $100 million each for the I-5 Rose Quarter project and the Interstate Bridge Replacement project—than it will ever save by slashing snow plowing.  ODOT has spent more than $16 million on public relations and communications consultants for these two projects (see below for details).
  • Item:  ODOT’s overall budget is more than $3 billion per year and was cut less than 2 percent from the previous biennium, yet the agency is cutting operations (like snow plowing) by ten times as much (20 percent).
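The per-mile figures and overrun totals in the list above can be checked with simple arithmetic. Here is a minimal sketch using only the dollar and mileage figures cited in this post:

```python
# Per-mile costs of the three Portland mega-projects, from the figures cited above.
projects = {
    "I-5 Rose Quarter":        (1.9e9, 1.5),   # (total cost in $, length in miles)
    "Interstate Bridge (IBR)": (7.5e9, 5.0),
    "I-205 Abernethy Bridge":  (622e6, 0.5),
}
for name, (cost, miles) in projects.items():
    # Every project clears the $1 billion-per-mile threshold.
    print(f"{name}: ${cost / miles / 1e9:.2f} billion per mile")

# Cost increases announced in the past year, per the itemized figures above:
overruns = [600e6, 2.5e9, 370e6]   # Rose Quarter, IBR, Abernethy
print(f"Total announced increases: ${sum(overruns)/1e9:.2f} billion")  # $3.47 billion
```

Running the sketch gives roughly $1.27, $1.50 and $1.24 billion per mile respectively, and a combined $3.47 billion in announced overruns.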

In short, ODOT’s PR push to slash snow plowing is a cynical ploy to get Oregonians to give more money to an agency that has been reckless and irresponsible.  The reason ODOT doesn’t have enough money for roads isn’t electric vehicle adoption or faltering revenues; it’s that a spendthrift agency has chosen consultants and big contractors over the safety of road users and taxpayers.

Plowing is a trivial part of the ODOT budget; ODOT has voluntarily chosen to sacrifice plowing and other safety operations

A close look at ODOT’s explanation shows a strong bias against basic safety operations.  The agency has a $3 billion annual budget, and is seeing revenue increase—plus implementing a 2-cent-a-gallon gas tax increase in January.  Yet it’s choosing to slash operations like snow plowing ten times as deeply as other parts of its operating budget, like administrative expenses.

While much of the agency is being asked to make a 5 percent reduction, ODOT has chosen to impose a four-fold higher reduction on basic operations, cutting them by 20 percent.  In the agency’s regional “fact” sheets justifying the cuts, it says:

For our next budget, we implemented a 5% cut across all programs funded with state dollars. Within maintenance, we cut our services and materials an additional 15% to account for inflation and our reduced buying power.

ODOT’s region 4 report notes it cut its maintenance budget by 20 percent

Implementing our 2023-2025 budget: For our next budget, we implemented a 5% cut across all programs funded with state dollars. Within maintenance, we cut our services and materials an additional 15% to account for inflation and our reduced buying power. We are reducing service in three primary areas:

• Low-volume road maintenance.

• Roadside maintenance.

• Winter maintenance

Despite its emphasis on cutting snow plowing, none of ODOT’s explanations show how much money the state will save by cutting these services.  It’s not likely to be much.  Overall, ODOT spends about $288 million on all “emergency services”–a broad category that includes everything from dealing with crashes, to plowing roads, to cleaning graffiti and helping disabled motorists.  A 20 percent cut in that amount is about $58 million, or roughly $29 million per year.  That’s about one percent of ODOT’s annual spending.
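The arithmetic behind these figures is straightforward. A rough sketch, assuming (as the per-year split implies) that the $288 million emergency-services figure is a biennial total:

```python
# Rough arithmetic behind the emergency-services figures cited above.
# Assumption: the $288 million "emergency services" figure is a biennial total.
emergency_services_biennial = 288e6
cut_share = 0.05 + 0.15          # 5% across-the-board plus an additional 15% = 20%

biennial_savings = emergency_services_biennial * cut_share
annual_savings = biennial_savings / 2

odot_annual_budget = 3e9         # "more than $3 billion per year"
print(f"Savings: ~${annual_savings/1e6:.0f}M per year")              # ~$29M
print(f"Share of budget: ~{annual_savings/odot_annual_budget:.1%}")  # ~1.0%
```

Even at 20 percent, the cut to this category frees up only about $29 million a year, around one percent of what ODOT spends annually.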

Meanwhile, the agency is not imposing these same cuts on its plans for bloated freeway widening projects.  Projects like the $7.5 billion IBR, the $1.9 billion Rose Quarter project, and the $622 million Abernethy Bridge–all of which have experienced 100 percent or more cost-overruns–are held harmless from ODOT’s proposed budget cuts.  In fact, ODOT is doing just the opposite: promising to spend money it doesn’t have on these projects, and likely on further cost overruns.

While ODOT has been mum about how much not plowing roads will save, it’s clear that it’s not a major amount of money for a $3 billion agency.  How much does snow removal cost?  The Pennsylvania highway department spends about $200 million per year on snow removal.  The agency reports plowing, sanding and salting about 94,000 lane miles of highway, for a rough annual cost of about $2,000 per lane-mile.  The Klamath County road department reports spending about $1 million per year to plow about 100 miles of roadway in the county (about $5,000 per lane-mile per year).  If ODOT were serious about its budget, it would tell us how much cutting back snow plowing will save–instead, it simply menaces us with more dangerous roads and asks for more money, which will mostly be used for highway widening.
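The unit costs in these comparisons are simple division. A sketch (the two-lanes-per-mile assumption for Klamath County is ours, inferred from the $5,000 figure):

```python
# Unit costs implied by the snow-removal comparisons cited above.
penndot_annual = 200e6        # ~$200M/year on snow removal
penndot_lane_miles = 94_000   # lane-miles plowed, sanded and salted
print(f"PennDOT: ~${penndot_annual / penndot_lane_miles:,.0f} per lane-mile per year")

# Assumption: Klamath County's ~100 road miles are two-lane, i.e. ~200 lane-miles.
klamath_annual = 1e6
klamath_lane_miles = 100 * 2
print(f"Klamath County: ~${klamath_annual / klamath_lane_miles:,.0f} per lane-mile per year")
```

That works out to roughly $2,100 per lane-mile for Pennsylvania and $5,000 per lane-mile for Klamath County, consistent with the rounded figures in the text.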

Shorter ODOT:  “Your money or your life.”

ODOT’s PR strategy boils down to:  “Your money or your life.”  We’ve squandered the gas tax increases you approved just six years ago on expensive boondoggle highway widening projects, and unless you give us more money, we’ll stop fixing potholes and plowing snow, and your roads will be more dangerous.  And to be clear, plowing less comes at a cost in human life and limb.  ODOT may not be adequately plowing roads to protect traveler safety.  In 2021, a car plunged off the I-205 bridge, killing the vehicle’s driver; his family is suing ODOT for improperly plowing the bridge, creating a snow ramp that caused the vehicle to jump the guard rail.

ODOT’s plans to reduce plowing come after a decade in which statewide road deaths have spiked by 71 percent.  In spite of the rising death toll, ODOT is choosing to slash its budget for basic safety operations, like plowing snow-covered roadways, and repainting fog lines on many roads.  And ODOT admits its choice to slash plowing and other safety expenditures will likely injure and kill more Oregonians.

Glenn, the ODOT spokesperson, said the state transportation agency is troubled by the trend of increasing traffic deaths, both in Oregon and nationwide.

But he said those findings won’t preclude major budget cuts that would eat into the agency’s operations and maintenance budget. The agency is facing a budget shortfall largely due to declining gas tax revenue and inflation.
“We cannot commit that these service level reductions won’t impact safety,” Glenn wrote in an email. “However, we are working to prioritize safety for as many travelers as we can and data like this is helpful in that effort. We are working with our policymaking partners to identify solutions to this structural revenue issue so that we can better invest in building and maintaining a safe system for all users.”

ODOT diverts maintenance funds to highway expansion projects

ODOT routinely diverts funds allocated to and available for maintenance to fund capital construction projects.  ODOT used interstate maintenance discretionary funds to pay for the planning of the failed Columbia River Crossing project.  It diverted funds that could otherwise be used for maintenance to pay for the Interstate Bridge Replacement project.  It routinely prioritizes capital construction in the use of “unanticipated federal funds” and “project savings.”  It cobbled together just these funding sources to pay for the initial work on the I-205 Abernethy Bridge before the Legislature authorized any funding for the project.  Each year it gets a tranche of what it calls “unexpected” federal funds (federal money that is unspent from nationally competitive programs that is allocated to the states).  At its July, 2022 meeting ODOT recommended (and the OTC approved) using this money, which could be applied to the maintenance backlog, to fund $10 million towards the Interstate Bridge Replacement project.

In 2021, ODOT diverted $36 million in funds dedicated to maintenance to pay for consultants for the Interstate Bridge Replacement project.  ODOT’s own memo makes this clear.

This project change requires adjustment to the fiscally constrained RTP. Funds from the fiscally constrained Fix-It buckets in the RTP will be reduced to allow for the $36M ODOT funds to be advanced on this project. Memo with details was sent to Metro 9/17/21 by Chris Ford. We find the analysis is still applicable with the addition of WDOT funds since RTP focuses on Oregon revenue only.

Chris Ford, Memo to Metro TPAC, “I-5:Columbia River (Interstate) Bridge: Requested Amendment to the 2021-24 Metropolitan Transportation Improvement Program.” Oregon Department of Transportation. September 24, 2021, aka ODOT/Ford Memo. Page 6. Emphasis added.

This is still going on today:  At its November 9, 2023 meeting, the Oregon Transportation Commission is being asked to approve using $7.6 million in “savings” from a construction project to pay for further overruns on the I-205 bridge.  If it wanted to, the Commission could use these savings to pay for snow plowing—but it’s choosing not to.

ODOT excels at playing three-card monte with its budget, “finding” money for projects it wants to build, while slashing spending on basic operations.  In 2018, after the Legislature provided no funding for the I-205 Abernethy Bridge project, ODOT suddenly “found” tens of millions of dollars in “savings,” “unanticipated revenues” and “unexpended funds” with which to launch the unfunded bridge project.  Here’s a slide from ODOT’s December 2018 briefing on the project:

Most of these funds (regional flexible funds, “reallocated savings,” “unanticipated federal revenue” and especially the “operation program funds,”) could all otherwise be used to pay for ODOT operations and maintenance—but instead they’re being used here to fund a capital construction project.

ODOT routinely pleads “pothole poverty” when asking for tax increases–then diverts the money to megaprojects

This is nothing new.  Back when the Legislature was considering more funding for transportation in 2017, ODOT swore up and down it would use additional money to keep up roads, not build new ones.  ODOT’s sales pitch for gas tax increases consisted of telling the public how much it cared about maintenance.  Here’s the agency’s current deputy director, Travis Brouwer, speaking to OPB in April 2017, as the Legislature was considering a giant road finance bill:

Of course, patching potholes are far from the only thing ODOT has to spend money on. So how does the agency decide what to prioritize? According to ODOT assistant director Travis Brouwer, basic maintenance and preservation are a top priority. “Oregonians have invested billions of dollars in the transportation system over generations and we need to keep that system in good working order,” he said. “Generally, we prioritize the basic fixing the system above the expansion of that system.”

Back in 2017, the Oregon Department of Transportation put out a two-page “Fact Sheet” on the new transportation legislation.  Its first paragraph stressed that most of ODOT’s money would be for maintaining the existing system:

“Generally,” meaning, unless we decide to build shiny new projects—which they do.  Make no mistake:  When it comes to one of the agency’s pet mega-projects, there’s always money lying around, and if there isn’t, they’ll pretend like there is and charge full speed ahead, maxing out the credit cards to generate the cash.

A deadly take on the “Washington Monument” strategy

Budget wonks talk about a bureaucratic ploy known as the “Washington Monument Strategy.”  Asked to cut its budget by a few percent, an agency chooses to cut its most visible and valued service.  The National Park Service says that if its budget is cut, it will have to close the Washington Monument (the nation’s most visited and visible national monument).  The object is to rally public support for the agency’s budget, not to promote efficiency or focus on priorities.  ODOT’s “we won’t plow” because of budget cuts is the same idea, with a lethal twist.  Closing the Washington Monument doesn’t endanger tourists; it merely inconveniences them.  Reducing plowing and not painting fog lines will likely lead to more crashes, injuries and deaths.

Mocking ODOT’s supposed “Safety First” and “Vision Zero” Policies

ODOT plans to slash these basic safety expenditures even as the state is experiencing increasing levels of traffic crashes, deaths and injuries.  Just this month, the Oregon Health Division released a new dashboard showing the increasing death toll on the state’s roads and highways.  Fatal injuries on Oregon roadways are up 71 percent since 2010, with more than 600 Oregonians killed.


The dashboard:  Highway deaths up 71 percent since 2010

ODOT’s own stated goal is zero fatalities and serious injuries–something it is utterly failing to achieve.  The state’s Transportation Safety Action Plan says the long-term goal is zero fatalities and serious injuries.  The state’s target for 2022 was 444 deaths (TSAP, page 9); the actual number was over 600.

Oregon is committed to zero transportation-related fatalities and serious injuries. To make progress and improve traffic safety, stakeholders and partners are tasked with coordinating priorities, leveraging joint resources where possible, and using quantitative data-driven tools (e.g., benefit-cost analysis). Funds are limited; therefore projects, programs, and policies will need to be prioritized to focus on those treatments which will have the greatest benefit toward achieving the vision of zero fatalities and serious injuries. (TSAP, page 72, emphasis added)

ODOT’s own plans call for making safety a priority, even when there are tradeoffs with other objectives.  Its adopted Transportation Safety Action Plan explicitly calls for “Safety First” prioritization.

For those who address transportation and/ or safety in their jobs, including the . . .  ODOT,. . . cultural shifts will be seen when safety is prioritized as a core value. A strong safety culture means that agency leadership and employees, at all levels, are encouraged, and rewarded for prioritizing safety, and identifying safety issues and solutions while carrying out their agency’s missions and their individual job responsibilities.
TSAP, page 60.

ODOT’s decision to slash maintenance expenditures by 20 percent, while cutting its overall budget by 5 percent (and holding harmless a handful of megaprojects and consultant spending) flies in the face of its professed “Vision Zero” policies, and clear direction to prioritize safety first.

Megaprojects and ODOT Cost Overruns

ODOT is pursuing three massive highway expansion megaprojects in the Portland metropolitan area with a total price tag of about $10 billion.  Each of these projects costs more than $1 billion per mile of highway:  The five-mile IBR is $7.5 billion (about $1.5 billion per mile), the one-and-a-half-mile Rose Quarter project is $1.9 billion (about $1.3 billion per mile) and the half-mile-long I-205 Abernethy Bridge is $622 million (again, more than $1 billion per mile).  Each of these projects has experienced enormous cost increases in the past three years, totaling more than $3 billion.  ODOT has shown no ability to accurately predict or control project costs, so further cost increases on all these projects are possible.  These costs dwarf the cost of snow plowing and the revenue impacts of electric vehicles, yet ODOT says nothing about these expensive projects or their cost overruns in its explanation of its budget problems.


Prioritizing Funding for Consultants

These mega-projects involve hundreds of millions of dollars for consultants.  Oregon DOT and Washington DOT spent more than $200 million on the failed effort to plan the Columbia River Crossing (the earlier version of the IBR).  They have already spent more than $100 million on the new IBR.  Likewise, Oregon DOT has spent about $110 million on consultants and staff for the I-5 Rose Quarter project.

At its June 2023 meeting, the Oregon Transportation Commission approved $40 to $60 million in funding to do more design work on the Rose Quarter project, mostly for consultants, while acknowledging that it simply doesn’t have the roughly $1.9 billion it would cost to actually build the project.
For each of these projects, ODOT has spent millions on public relations and communications consultants.  Here is a listing of the amounts paid to such consultants for the I-5 Bridge Replacement and the Rose Quarter.  The total is more than $16 million, so far.


Exaggerated Benefits, Omitted Costs: The Interstate Bridge Boondoggle

A $7.5 billion highway boondoggle doesn’t meet the basic test of cost-effectiveness

The Interstate Bridge Project is a value-destroying proposition:  it costs more to build than it provides in economic benefits

Federal law requires that highway projects be demonstrated to be “cost-effective” in order to qualify for funding.  The US Department of Transportation requires applicants to submit a “benefit-cost” analysis, that shows that the economic benefits of a project exceed its costs. We take a close, critical look at the benefit-cost analysis prepared for the proposed $7.5 billion Interstate Bridge Replacement project between Portland and Vancouver.

City Observatory’s analysis of the Interstate Bridge Replacement Benefit-Cost Analysis (IBR BCA) shows that it is riddled with errors and unsubstantiated claims, and that it systematically overstates potential benefits and understates actual costs.

  • It dramatically understates the actual cost of the project, both by mis-stating initial capital costs and by entirely omitting operation and maintenance and periodic capital costs.
  • The construction period is under-estimated, which likely understates capital costs and overstates benefits.
  • The study omits the toll charges paid by road users from its definition of project costs, in clear violation of federal benefit-cost guidelines.
  • The IBR BCA also dramatically inflates estimated benefits.
  • It uses an incorrect occupancy estimate to inflate the number of travelers benefiting from the project.
  • It presents inflated estimates of safety benefits, based on an incomplete and undocumented crash analysis.
  • ODOT’s study fails to separately present the benefits and costs of the project’s tolling and capacity expansion components, and omits an analysis of the distribution of benefits and costs among different demographic groups.

A correct evaluation of this project shows that its costs exceed its benefits by a wide margin.  What this means is that the proposed freeway widening is not cost-effective; not only is it not something that qualifies for federal funding, it also is a demonstrably wasteful, value-destroying expenditure of public funds.  The amount of money that the federal government, the States of Oregon and Washington, and highway users would pay in tolls, exceeds by a factor of more than two the actual economic benefits that would accrue to a subset of highway users.  This is a project that would make us worse off economically—exactly the kind of project that the cost-effectiveness standard is established to prevent.

Benefits are overstated

ODOT and WSDOT claim that the present value of benefits from the IBR project amount to more than $4 billion; nearly all of these benefits are attributed to travel time savings, congestion cost reductions and seismic resilience, and reduced crash losses.  ODOT’s estimates of both travel related savings and crash reductions lack documentation.

Travel Benefits:  The IBR BCA claims that the project will produce $2.4 billion in travel time benefits.  ODOT’s estimates are plagued with errors and a lack of documentation.

  • Travel benefits are minuscule to individual travelers—averaging about 20 seconds in a typical five-mile trip, according to the BCA.  These savings are imperceptible to individual travelers and are likely to be of no significant economic value.
  • The estimates use the wrong value for peak hour vehicle occupancy, exaggerating peak travelers by 13 percent.  The BCA assumes 1.67 passengers per vehicle while USDOT guidelines prescribe a figure of 1.48 passengers per vehicle.
  • The project fails to document the diversion of traffic to the parallel I-205 bridge as a result of charging tolls on I-5; this will cause longer trips for 33,000 diverted vehicles per day, and will increase congestion and travel times for the 220,000 persons crossing the I-205 bridge.  These costs will largely offset the travel time savings purported to accrue to travelers in the project area.
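The occupancy overstatement is simple to verify with the two figures cited above; this is a back-of-the-envelope check, not a calculation from the BCA itself:

```python
# Back-of-the-envelope check of the occupancy inflation described above.
BCA_OCCUPANCY = 1.67    # persons per vehicle assumed in the IBR BCA
USDOT_OCCUPANCY = 1.48  # persons per vehicle per USDOT BCA guidance

# Using the higher figure scales up every person-hour of claimed time savings.
inflation = BCA_OCCUPANCY / USDOT_OCCUPANCY - 1
print(f"Travelers (and person-hour benefits) overstated by {inflation:.1%}")
```

Because claimed travel-time benefits are person-hours multiplied by a dollar value of time, an inflated occupancy figure inflates the benefit total by the same roughly 13 percent.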

The Benefit Cost Analysis concedes that tolling the I-5 bridges will divert traffic to the I-205 bridge, but it models the effects of the project only within the study area.  The added cost, pollution and other effects on the I-205 corridor are not included in the benefit cost analysis.

 

The Benefit Cost Analysis admits:

The Build scenario assumes tolling for the highway river crossing. The added cost from inclusion of tolls causes a reduction in I-5 auto trips as people shift to transit, use the alternative I-205 crossing, or change their destination to avoid the crossing

As described, this benefit-cost analysis is highly selective:  it counts beneficial time savings in the project’s “study area” but ignores the costs in added travel distances, travel times and congestion that will occur outside the study area when traffic diverts to avoid tolls.

Resiliency Benefits:  The IBR BCA claims savings for lives lost in a potential earthquake, savings on the cost of a replacement bridge, and added savings in traveler delay in the event that the bridges collapse in an earthquake.  All these estimates are exaggerated, including probability of a major seismic event, likelihood of collapse, fatality rate in the event of a seismic event, number of persons on the bridge at the time of an event, the cost of replacing the bridge, and the scale of added travel that would result from traffic disruption if the bridge collapses.  

Safety Benefits:  The IBR BCA claims that the project will reduce crashes on I-5 and will produce benefits with a present value of approximately $53 million.  The IBR BCA asserts that it used the ISATe model to predict a 17 percent decline in crashes in the project area.  But it has not documented which features of the project produce the supposed ISATe benefits, it has failed to calibrate the ISATe model for I-5, and the ISATe methodology can’t accurately compute crash reductions on highways with ramp metering, which I-5 has.

Costs are understated

The IBR BCA claims that the present value of the initial capital costs of this project is $2.7 billion.  That is a significant understatement.  The project’s construction cost, according to other IBR documents, is as much as $7.5 billion.  The IBR BCA’s failure to comprehensively account for project costs violates federal benefit-cost guidance, which requires that costs include “the full cost of the project . . . regardless of who bears the burden . . . including state, local and private partners . . .”  This should include tolls paid by users.

Costs Exceed Benefits by a Wide Margin

After we correct the IBR BCA for under-counted costs and unsubstantiated benefit claims, the project’s benefit-cost ratio falls dramatically below one, the minimum standard for meeting the statutory requirement that the project be cost-effective.  Our corrected estimates show that the actual cost of the project ranges as high as $5 billion, while the actual benefits of the project are roughly $2 billion.  This means that the project has a benefit-cost ratio of between 0.3 and 0.4, well below the minimum threshold of 1.0.  The correct analysis shows that the I-5 Bridge Replacement project is a value-destroying endeavor:  it costs users and taxpayers far more than it provides to the public in benefits.  It is not cost-effective, and should not be approved by FHWA.
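The corrected arithmetic is straightforward; the figures below are the corrected estimates discussed above (costs of roughly $5 billion in present value, benefits of roughly $2 billion), not new data:

```python
# Corrected benefit-cost ratio, using the present-value figures discussed above.
corrected_benefits = 2.0  # $ billions, corrected benefit estimate
corrected_costs = 5.0     # $ billions, corrected cost estimate (upper end)

bcr = corrected_benefits / corrected_costs
print(f"Benefit-cost ratio: {bcr:.1f}")  # 0.4, far below the 1.0 threshold
```

Any ratio below 1.0 means the project destroys value: every dollar spent returns less than a dollar in benefits.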

Failing to disaggregate benefits and ignoring distributional impacts

Federal regulations require that a benefit-cost analysis separately report the benefits and costs of independent elements of a project.  This is to prevent a prospective applicant from combining an ineligible project (with costs that exceed benefits) with an eligible project (with a positive benefit-cost ratio) in order to get a larger amount of federal funds.  The IBR project consists of at least two elements with independent utility:  a plan to toll I-5, and the proposed widening of the highway, intersections and approaches.  Nearly all of the travel time benefits associated with the project result from tolling, according to IBR BCA’s own analysis.  Appraised separately, the tolling would have a far more favorable benefit-cost ratio than the highway expansion. To comply with federal requirements, IBR BCA should produce separate benefit cost estimates for each component of the project.

Federal regulations strongly encourage applicants to examine the distribution of benefits and costs among different segments of the population.  IBR BCA included no distributional analysis in its benefit-cost study.  Nearly all of the travel time and congestion reduction benefits accrue to peak hour travelers.  Yet a majority of the cost of tolls is likely to be paid by travelers who use I-5 during off-peak hours; these off-peak travelers get no travel time benefits.  In effect, they are made worse off:  they have to pay a toll even though they get no better service than under the no-build scenario.

Conflict of interest and risk of fraud

The benefit-cost analysis is more than a mere formality:  it is a legal requirement for the $7.5 billion project to qualify for federal aid.  False representations made in the IBR BCA could represent fraud. It is concerning that the benefit-cost analysis is prepared by a private sector contractor with a direct financial interest in the construction of the IBR. The Benefit-Cost Narrative report indicates that the report was “Prepared by WSP.” Financial records obtained from the IBR project pursuant to a public records request show that WSP has current contracts to perform paid work on the Interstate Bridge Replacement Project valued at $76,282,807.03. Indeed, WSP is the single largest contractor for the project. In the event that federal funding is not forthcoming, it is unlikely that the project will proceed, and WSP will lose this lucrative source of income. WSP is not, and cannot be, an independent and objective evaluator of the benefits and costs of this project. It has a blatant conflict of interest, which is not disclosed.

City Observatory Analysis of Interstate Bridge Project Benefit-Cost Analysis

Cortright_IBR_BCA_Critique_Nov2023

Britain’s Caste System of Transportation

UK Prime Minister Rishi Sunak proclaims the primacy of drivers

“We are a nation of drivers”

Those who don’t own cars, or can’t, or choose not to drive, are second class citizens

The transportation culture war is flaring up in Britain.  Conservative Prime Minister Rishi Sunak has cancelled the nation’s big high speed rail initiative (HS2), even plowing salt into the ground by pledging to aggressively sell off property acquired for rights-of-way.  But that’s just part of his posturing, like that of the old “76” gasoline brand, to show he’s “on the driver’s side.”

Hot on the heels of the rail cancellation, Sunak prominently issued a Driver’s manifesto, proclaiming that the United Kingdom is “a nation of drivers.”

On Sunday, I slammed the brakes on anti-motorist measures. For many, our car is a lifeline. We use them to get to work or see our family. But too often drivers feel under attack. Our new plan will put drivers back in the driving seat and improve their experience on the road.

A caste system for transportation, and those not in cars are the untouchables

In a “nation of drivers” people who walk, cycle and take transit are non-citizens.

Sunak’s comments lay bare the caste system of transportation in Britain.  Those wealthy enough and able enough to own and drive cars are in the favored caste.  Those who can’t or choose not to drive are in the lower caste: untouchable and unimportant.

The “war on cars” strategy is a blatantly political gambit by Sunak, whose Conservative Party trails badly in the polls and will confront a general election in the next year or so.  It’s evident that campaign advisers have made a cynical calculation that many voters will identify themselves as oppressed victims.  According to UK Census data, though, 17 million Britons live in households that don’t own a car, and such households represent about 22 percent of all UK households.  And even households that own a car frequently include residents who don’t or can’t drive, and who may choose or want to walk or cycle near their homes.

Phony claims of a “war” on drivers

In addition to cancelling a major rail project, Sunak has attacked 20 mile per hour speed limits in dense urban neighborhoods, calling them “against British values.”  He has also spoken out against “low traffic zones”—local policies that have been shown to reduce traffic and improve safety.  Sunak proposes stripping local councils of the ability to implement such measures on local streets.

The claim that drivers feel “under attack” met with appropriately graphic replies on social media.

More substantively, The Economist wrote that Sunak’s claims of a “war on drivers” were simply hogwash.

Prominent claims of a “war on drivers” will likely inflame passions, but as the media coverage of Sunak’s obviously political gambit shows, it may help generate some objective scrutiny of these claims.  If we look closely at the data, the aggrieved victims of our manifestly unfair transportation system are not drivers, but those who cannot or choose not to drive.  The success of local initiatives to reduce driving, lower traffic speeds, make walking and cycling safer, and make transit more available and more convenient, all signal that we can make our transportation system better and fairer by moving away from a world where a car is effectively a prerequisite to full citizenship.

Gentrification and Housing Supply

New York lost more than 100,000 homes due to the combination of smaller, more affordable apartments into larger, more luxurious homes

When rich people can’t buy new luxury housing, they buy up, and combine small apartments to create larger homes.

This is a negative sum game:  the number of housing units gained by high income households is smaller than the number of housing units lost by middle and lower income households.

If you’re worried about gentrification and displacement, this is a vastly larger problem than new construction–which has been repeatedly shown to lower rents and create more housing opportunities for lower income households.

The obsession with fighting new development reflects a profound cognitive bias in thinking about housing:  we equate new units with unwanted change, while ignoring the effectively invisible destruction of existing units by upscaled combinations.

New York Lost 100,000 Homes to Consolidation

A new study reported in The City finds that over the past several decades, the number of homes in New York has declined by more than 100,000 as smaller apartments have been consolidated into larger homes.  The data come from a thesis prepared by Adam Brodheim of Columbia University.

The effect of unit consolidation has been to partially or totally offset the positive supply effects of new construction.  In some neighborhoods, the number of housing units lost to these combinations dwarfs new construction.  In New York, the largest number of units have been lost in Manhattan and Brooklyn.

Combinations and Gentrification

The demand for consolidation comes from higher income families who want to live in the city but can’t find units that are large enough to accommodate their needs and income.  In a very real sense, the failure to build enough new luxury housing means these higher income households don’t go away, they outbid multiple middle and lower income households for these units.

Do each of these brownstones have four apartments, or only one? (Flickr: Sharona Gott)

We still hear the argument that building more high end housing does nothing for affordability.  But building more high end housing keeps those with high incomes from moving down market and out-bidding those with less income for the existing housing stock. For remaining doubters, have a look at Noah Smith’s thought experiment, asking what we think would happen to housing prices if we suddenly demolished 10,000 units of expensive housing.

This study confirms exactly the thought experiment posed by economist Noah Smith some years ago: the households don’t disappear; they outbid people with less income for the housing units that remain.  Limiting supply doesn’t reduce demand, especially by high income households.  The demand is there whether you supply new larger, luxury units or not, and with no other place to go, it spills over into other parts of the housing market, to the detriment of everyone else lower on the ladder.

And in New York, with these high end-remodeling combinations, the result is actually a negative sum game.  High income households don’t simply displace lower income households one for one:  each new combined unit to house one higher income household displaces multiple households with lower incomes.

If you are concerned about gentrification, you ought to be deeply concerned about these conversions, rather than new construction.  While the knee jerk solution might be to try and block combinations, that misses the fact that the underlying problem is that there are simply too few units and too little space compared to the number of people who want to live in cities.

(We see the same thing happening in Silicon Valley, where an otherwise unremarkable ranch house from the 1950s commands a multi-million dollar price tag–because it’s so difficult to build new housing there.)

An invisible process produces cognitive bias

The process is largely invisible:  Unlike new buildings, which are obvious, public and highly regulated, the combination of apartments occurs out of public sight, behind closed doors and with minimal regulatory scrutiny:

. . . three previously multi-unit brownstones have been converted into single-family homes over the years — but you’d never know it unless you spotted a construction permit, or noticed multiple buzzers replaced by one doorbell, he said.

“From the perspective of most people on the street, they’re not noticing that seven fewer families are able to live on this block … and this happens all the time,” Brodheim told THE CITY. “Unlike new buildings, which have to go through this huge gauntlet of, often, public opposition to create new units, here you’re able to get rid of apartments without anyone noticing.”

As we’ve noted in our analysis of gentrification, there’s a profound cognitive bias in understanding neighborhood change.  Our research shows that there’s been a dramatic increase in concentrated poverty in US cities, and that poor neighborhoods tend to hemorrhage population.  Fewer than one in twenty high poverty US neighborhoods gentrified over four decades; far more commonly high poverty neighborhoods spread and lost population.  But these processes occur slowly, over decades, and are imperceptible or simply unperceived by most residents.  In contrast, new construction is obvious, and people understandably associate it with neighborhood change.  Our attention is naturally drawn to those places where an urban transformation is happening the most rapidly; new investment and construction are much more noticeable than the imperceptible processes of neighborhood decline.

Adam Brodheim, 2023. “Bigger Houses, Fewer Homes: Dwelling Unit Consolidation in New York City.” M.S. Historic Preservation Thesis, Columbia University.
https://www.thecity.nyc/housing/2023/8/24/23843686/100k-apartments-lost-to-house-conversions?s=09

The ten lane freeway hiding in Rose Quarter Plans

Secret ODOT plans obtained by City Observatory show ODOT is planning a ten-lane freeway through the Rose Quarter

Though the agency claims it’s “just adding one auxiliary lane” in each direction, the I-5 Rose Quarter project is engineered with a 160-foot wide footprint, enough for 10 full travel lanes and extra wide shoulders.

In places the I-5 Rose Quarter project would be as much as 250 feet wide.  ODOT’s plans are to double or triple the width of the roadway from its current 82 feet.

ODOT plans are drawn to conceal the massive width, with cartoons and misleading “Not to scale” drawings.  The project’s 2019 Environmental Assessment implied the roadway would be only 126 feet wide, but these newly obtained plans confirm it will be vastly wider.

Once built, ODOT could re-stripe this massive roadway for 10 lanes in an afternoon.

The agency has not disclosed the true size of the project, and its Environmental Analysis doesn’t consider the traffic, pollution and safety effects of a ten lane structure.

The project’s massive width—not its covers—is the real reason for the project’s huge expense, which has exploded from $450 million in 2017 to an estimated $1.9 billion today.

ODOT has ignored its own international expert consulting engineers who called for a much narrower roadway, and alternating refuges in the tunnel section to minimize expense.

For years, we’ve been pointing out the lengths that the Oregon Department of Transportation (ODOT) has gone to in concealing the width of its I-5 Rose Quarter “Improvement” Project.  ODOT claims that it is merely adding one “auxiliary lane” to the freeway.

But documents—newly obtained by City Observatory from a public records request—show that the I-5 Rose Quarter Freeway will be 160 feet wide, and in places as much as 250 feet wide, as it slashes through Northeast Portland.  That’s about two to three times wider than the existing 82 foot roadway.

The proposed Rose Quarter I-5 mainline is 160 feet wide, enough for a ten-lane freeway

As it passes under Broadway and Weidler streets, the main stem of the widened I-5 freeway (excluding the separate on- and off ramps) is a full 160-feet wide according to the previously unreleased ODOT drawings.  The ODOT diagram purports to show the number of lanes and the width of shoulders, but if you look closely, there’s something fishy going on here.  First, the fine print at the bottom of the drawing says “No Scale” which is an admission that despite the measurements shown, this is not a true scaled drawing of the project. That becomes clear when you start comparing the measurements indicated for the lane and shoulder markings with the overall width of the project.  Let’s take a closer look at the plans.

Both the Northbound and Southbound clear-span areas are about 81 feet wide.  According to ODOT’s labeling, there are just three 12-foot wide travel lanes in each of these 81 foot openings, meaning that 45 feet of the width under each span isn’t actually being used for travel lanes and is available for shoulders.  We’ve added the red annotations to the ODOT diagram below.

ODOT drawing obtained via public records request (red annotations by City Observatory).

Despite ODOT’s misleading and incomplete labeling of this diagram, it’s apparent that the project will easily allow construction of a 10-lane freeway (with five through travel lanes in each direction) at Broadway-Weidler.  We’ve further annotated the ODOT drawing to show full 12-foot travel lanes and ten-foot inside and outside shoulders.
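The restriping arithmetic is straightforward. Using the clear-span width and the lane and shoulder dimensions noted above:

```python
# Arithmetic behind the annotated drawing: each 81-foot clear span fits
# five 12-foot travel lanes plus 10-foot inside and outside shoulders.
CLEAR_SPAN = 81  # feet, each of the two clear-span openings
LANE = 12        # feet, standard freeway travel lane
SHOULDER = 10    # feet, inside and outside shoulders

as_striped = 3 * LANE                # ODOT's labeled striping: 36 ft of lanes
leftover = CLEAR_SPAN - as_striped   # 45 ft per span not used for travel lanes
restriped = 5 * LANE + 2 * SHOULDER  # 80 ft: five lanes plus two shoulders

print(f"Unused width per span as labeled: {leftover} ft")
print(f"Five lanes plus shoulders: {restriped} ft (fits in {CLEAR_SPAN} ft)")
```

Since 80 feet fits within each 81-foot span, nothing but paint stands between the as-labeled six-lane configuration and a ten-lane freeway.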

ODOT drawing obtained via public records request (red annotations by City Observatory).

ODOT’s own consultants, the international design firm ARUP, recommended narrower lanes (11 feet) and much narrower shoulders (as little as three feet) to minimize project costs.  Elsewhere in the project, ODOT is using 11-foot through-travel lanes and narrower shoulders.  If it followed standard industry practice here, ODOT could stripe the freeway for twelve travel lanes and urban-standard freeway shoulders.

The freeway will be as much as 250 feet wide

While the mainline of the freeway at Broadway-Weidler will be 160 feet wide, the project is actually even wider at its north end.  As it crosses under Hancock Street, the I-5 freeway will be 250 feet wide—more than three times wider than the current roadway, and more than twice as wide as depicted in the project’s 2019 Environmental Assessment.  ODOT provided an image of the project that provides some key details, including the fact that the two spans of the overpass will be 130 feet and 120 feet wide.  On this section, the freeway consists of two southbound off-ramp lanes, six travel lanes (three in each direction) and two northbound on-ramp lanes, plus considerable additional room for shoulders as well as other space which is not labeled (more on that in a moment).

ODOT drawings: 2019 Environmental Assessment (top), and via public records request (bottom).

 

The Rose Quarter project is so expensive because it’s too wide

The reason the Rose Quarter project’s budget has exploded from $450 million just six years ago to as much as $1.9 billion today is the bloated size of the proposed roadway.  Building a roadway that is two to three times wider than the current I-5 freeway is the primary driver of high costs, according to ODOT’s consultants ARUP.  The overly wide freeway requires enormous beams to support the lengthy overpasses: the girders (BT84) for the overpasses have to be about 7 feet tall.  To accommodate the wider footprint, and provide sufficient vertical clearance over the roadway, ODOT will have to excavate an enormous area, and dig deeper to lower the level of the freeway (notice the brown area to be removed on the Broadway-Weidler diagrams).  And while ODOT wants to blame the covers for the high cost of the project, it’s actually the grossly oversized width of the freeway that drives up the cost of both the roadway and the covers: a narrower freeway would be vastly cheaper to cover.  As part of the Interstate Bridge project, WSDOT is proposing to build an acre-sized cover over I-5 to connect downtown Vancouver to historic Fort Vancouver for a cost of less than $40 million.
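The scale of the budget escalation described above is easy to quantify, using the two budget figures cited in the text:

```python
# Budget escalation of the Rose Quarter project, per the figures above.
budget_2017 = 0.450  # $ billions, 2017 estimate
budget_today = 1.9   # $ billions, current estimate

growth = budget_today / budget_2017
print(f"Budget has grown {growth:.1f}x since 2017")
```

That works out to more than a fourfold increase in roughly six years, well beyond ordinary construction cost inflation over the same period.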

ODOT has actively concealed the width of the Rose Quarter Project

ODOT has undertaken multiple and sustained efforts to hide the actual width of the highway widening.  They’ve falsely repeated the Orwellian claim that they’re adding a single auxiliary lane in each direction.

They’ve published false and misleading “not to scale” illustrations that understate the true width of the project as part of the federally required Environmental Assessment.

Instead of presenting the actual plans, ODOT has repeatedly published misleading, not-to-scale illustrations purporting to show the width of the project.  The original 2019 Environmental Assessment contained this drawing.  It didn’t show the actual overall width of the project, but labeled lane and shoulder widths that together totaled 127 feet.

Misleading ODOT illustrations from 2019 Environmental Assessment

In the Supplemental Environmental Assessment released late in 2022, ODOT updated and repeated this same tactic, producing two new illustrations.  They show the existing alignment and the proposed alignment, as follows.  The fine print at the bottom says “not to scale,” but the diagrams make the proposed project look narrower than the existing roadway.


 

City Observatory and others have long pointed out that ODOT is planning a much wider roadway at the Rose Quarter than it lets on.  In 2020, the Oregon Transportation Commission directed ODOT staff to provide City Observatory with information about the actual width of the freeway. When pressed to provide details, and actual overall width measurements, ODOT provided elliptical and disingenuous responses to specific written questions asking for a statement of the project’s actual width.  Here is an image of ODOT’s 2021 written response to a question asking the width of the project.

There is not a single number mentioned.  Even when explicitly directed by the Oregon Transportation Commission to say how wide the road would be, ODOT officials dissembled and obfuscated, failing to reveal information that was clearly known to them.

In 2021, City Observatory obtained, via Freedom of Information Act requests and other sources, plans produced by ODOT showing that the project was actually designed to be 160 feet wide.  These internal documents date back to 2016, and show a decision on project width was locked in by ODOT staff for years–just not revealed to the public.

Now, the latest plans for the Rose Quarter project show that it will be, at least in places, more than 250 feet wide.  As before, these are documents that have not been released to the public until ODOT was forced to provide them via a public records request.  In July, 2023, ODOT initially insisted on being paid more than $2,000 to release these records, asserting that their release to the group No More Freeways was “not in the public interest.”  After an appeal to the Oregon Attorney General’s office, the records were released without charge.  ODOT continues to treat the actual size of this hugely expensive project as a state secret, something the public is not allowed to know.

ODOT’s persistent efforts to conceal the true width of the proposed I-5 Rose Quarter project are an attempt to cover up the reasons for the extraordinary cost increases and un-disclosed environmental impacts of this project.  A 160- to 250-foot wide roadway will further divide the neighborhood—repeating and aggravating the historic harms ODOT has inflicted on Albina.  The ten through lanes this widened roadway will enable will produce additional traffic and pollution, and it will pour this added traffic onto nearby city streets, creating safety problems and turning nearby areas into hostile, traffic-burdened places, inhospitable to people and new development.  ODOT has failed to look at a right-sized solution—simply capping the existing highway, or only widening it enough to accommodate the single additional auxiliary lane they say they want (something that could be accommodated in a roadway perhaps 24 feet wider than the existing 82-foot wide roadway).

 

Metro’s Climate-Denying Regional Transportation Plan

Portland Metro’s Regional Transportation Plan (RTP) does nothing to prioritize projects and expenditures that reduce greenhouse gases

Metro falsely asserts that because its overall plan will be on a path to reduce GHGs (it won’t), it can simply ignore the greenhouse gas emissions of spending billions to widen freeways

The RTP’s climate policies don’t apply to individual project selection;  projects are prioritized on whether they reduce vehicle delay—a failed metric it uses to rationalize capacity expansions that simply induce additional travel and pollution

The RTP environmental analysis falsely assumes that ODOT will impose aggressive state charges on car travel to reduce VMT, including carbon taxes, a mileage fee and congestion fees that have not been implemented, and may never be

The RTP’s traffic modeling fails to incorporate the effect of expected pricing on the need for additional capacity.  Modeling done by ODOT shows that pricing would eliminate the need for capacity expansion, saving billions, and reducing greenhouse gases.

Transportation is the largest and fastest growing source of greenhouse gases in the Portland Area;  every one of the state, regional and local plans to reduce transportation greenhouse gases is clearly failing.  The proposed 2023 Regional Transportation Plan could be a vital tool for prioritizing actions to reduce transportation GHGs.  It isn’t.  It’s a vehicle for justifying a multi-billion dollar wish list of road projects, and pretending that someone else will solve the climate problem.  The plan does nothing to use climate criteria to prioritize spending decisions, and instead, gives a pass to expensive road expansion projects that will encourage more driving and higher levels of greenhouse gases.

Climate denying transportation plans: Golfing at Armageddon

State and regional transportation plans fail to acknowledge the grim reality of increasing transportation greenhouse gases (GHGs).  As we’ve documented at City Observatory, Metro (and others) have concealed the fact that transportation emissions are increasing by ignoring actual inventory data and instead reporting fictional results obtained from their own models, which disregard actual emissions information and make rosy and unsupportable assumptions about future technology, market trends and policy.  In essence, these plans pretend that transportation GHGs are already decreasing, and will decrease even more dramatically in the future.

RTP Priority:  Billions for highway construction and expansion

The Regional Transportation Plan is an official, federally required planning document that spells out how the region will invest in transportation over the next two decades.  This is exactly the time when scientists tell us we must take decisive action to reduce greenhouse gas emissions.  But the largest projects—and the bulk of the expenditures—in the RTP are highway construction and widening that will facilitate more car travel, and increase greenhouse gas emissions.

The RTP document tries to downplay the emphasis on road building with a misleading graphic that shows dots for each project.  The massive Interstate Bridge Replacement is one tiny dot, the huge Rose Quarter widening one tiny dot, the I-205 Abernethy one tiny dot—even though these represent more than $10 billion in capital spending.

The fine print text acknowledges that this is mostly a few big highway projects, but even then substantially understates their true costs.  The Executive Summary fine print says:

. . . the “big three” projects—the I-5 Interstate Bridge Replacement Program, the I-5 Rose Quarter Project, and the I-205 Widening and Toll Project—each cost more than $1B.

In fact, the estimated price tag for the IBR is as much as $7.5 billion, and the Rose Quarter project has ballooned to $1.9 billion.  The RTP neither reflects the current cost estimates of these projects, nor the likely costs of further cost overruns, which are endemic on major ODOT highway projects.

The RTP spends the bulk of its capital on projects that add capacity to freeways—even though a decade-old Metro climate plan conceded that these have “low” impact on reducing GHGs.  And in fact, all of the available science on induced demand shows that added capacity increases driving and increases emissions.

How can Metro square spending billions on highway widening with the climate crisis?  As we pointed out earlier, Metro has ignored the actual inventory data showing increasing transportation greenhouse gas emissions, and substituted its own demonstrably wrong emission modeling to assert we're on track to reduce greenhouse gas emissions.

Then the policies in the RTP use this umbrella assertion that "this is fine" to simply ignore the greenhouse gas emission effects of individual projects.  The result is a "drive and pollute as usual" approach to the region's transportation spending plans and policies.  The bureaucrats assert that because their models show (based on wildly wrong assumptions) that the overall plan will make progress toward the 2050 state goal, there is essentially no need to rank or prioritize investments based on whether they increase or decrease greenhouse gas emissions.  Meeting the greenhouse gas reduction goal is a criterion applied only (and falsely) to the overall regional plan, and not to any specific projects.

This umbrella claim, that the RTP as a whole meets the state climate goals, is spelled out in policy:

Vehicle miles traveled (VMT)/capita will be a controlling measure in both system planning and plan amendments to ensure that the planned transportation system and changes to the system support reduced VMT/capita by providing travel options that are complete and connected and that changes to land use reduce the overall need to drive from a regional perspective and are supportive of travel options.

• For system planning, the final planned system must support OAR 660 Division 44 (Metropolitan Greenhouse Gas (GHG) Emissions Reduction rule) and OAR 660 Division 12 VMT reduction targets.

• For plan amendments, VMT/capita will be used to determine whether the proposed plan amendment has a significant impact on regional VMT/capita that needs to be mitigated or not.

System completeness and travel speed reliability on throughways are secondary measures that will be used to identify needs and inform the development of the planned system.

“Controlling measure” sounds imposing, but this is deceptive.  In effect,  the VMT reduction goals apply only to the overall plan, and to amendments to the plan.  Projects included in the plan are given a pass on whether they increase or decrease VMT (and greenhouse gas emissions).  While VMT is labeled as “a controlling measure” and travel speed is described as a “secondary measure,” the language of the RTP conceals the fact that the secondary measure really determines the priority for spending.  The RTP prioritizes project spending based on travel speed, not reducing VMT or greenhouse gases.

The RTP doesn't prioritize spending money on projects that reduce VMT.  The RTP contains only a requirement that plan amendments that increase per capita VMT have to be "mitigated."  That's problematic for several reasons.  First:  several huge freeway widening projects are included in the plan itself, and aren't amendments, so they won't be mitigated at all.  Second, Metro claims that its models can't actually detect whether projects—even very large ones, like the IBR or Rose Quarter freeway widening—increase VMT.  Third, ODOT (falsely) claims that highway expansions don't increase VMT.  Metro has not adopted any objective third-party method for assessing the per capita VMT effects of projects—like the induced travel calculator adopted by CalTrans.  (ODOT's own technical manual simply denies the existence of induced travel and bars its inclusion in ODOT modeling.)  Finally, the policy doesn't limit or ban plan amendments that increase per capita GHG emissions—it only requires that increases be mitigated.  (The RTP fails to say where the mitigation will come from, especially if the region is actively implementing other ways to reduce VMT.)

RTP travel speed standards prioritize projects to increase capacity

What the RTP does do, however, is create a rigid standard prioritizing travel speeds on throughways and arterials.  Throughways must provide no less than 35 MPH at least 20 hours per day; other "signaled" arterials must provide at least 20 MPH no fewer than 20 hours per day.  These speed standards do apply to the prioritization of project spending.  While they are labeled as "secondary," these are in fact the "controlling" metrics for project selection and prioritization.

Again, in contrast, the climate standards calling for a reduction in VMT effectively apply only to the overall plan, not segments thereof, and only have to "support" possible VMT reductions, not actually result in them.

In sum, individual investments, even ones as large as the multi-billion dollar widenings of I-5 at the Rose Quarter and the Interstate Bridge are effectively exempt from any climate analysis.  Climate simply doesn’t matter for setting regional spending priorities.  The only thing that matters under the terms of the Regional Transportation Plan (RTP) is whether investments speed traffic.  The RTP sets a goal of making sure that area “throughways” travel at no less than 35 MPH 20 hours per day, and that area arterials travel at no less than 20 miles per hour for 20 hours per day.

Projects that speed traffic on highways have been proven to increase travel—a widely documented scientific finding called "induced travel," meaning that wider roadways generate more vehicle miles of travel and more pollution.

The Metro RTP criteria give no additional weight or priority to projects that reduce transportation greenhouse gas emissions.  Speed, not greenhouse gases or safety, drives the distribution of resources under the plan.

RTP climate compliance depends on imaginary, unadopted policies

A key climate question is whether the region will reduce VMT.  The RTP contains little, if any, information on which of its investments will reduce VMT.  It makes a sweeping and general claim that providing transit (and other alternatives) will "create the conditions" that could reduce VMT; but lower VMT has to come from reflecting back to drivers the true costs associated with their decisions.  When it comes to such actual financial incentives, the bottom line is that Metro assumes that as-yet-unadopted, and highly speculative, state policies—not anything in the RTP—will reduce VMT.

The RTP counts on reduced driving as a result of ODOT and other state policies to make driving more expensive.  There's an old economist joke about how to solve the problem of opening canned food when one has no means to do so; the economist waves the problem away, saying "Assume we have a can opener."  Metro assumes that ODOT will produce a can opener in the form of a plethora of new fees on driving, including an unspecified carbon tax, a per mile fee of 6 to 10 cents on all driving in the state, a 9 to 17 cent per mile congestion fee for using throughways (limited access roads in Portland), plus tolls to finance the Interstate Bridge and I-205 bridges.  The RTP climate analysis assumes that the state will enact all these fees, and that this will reduce driving and carbon emissions.

In effect, the RTP is overwhelmingly dependent on the purely hypothetical actions of others to achieve climate goals:  It depends on state and federal fuel economy, vehicle emissions and fuel policies to reduce emissions per mile driven, and depends on state imposed taxes and fees to reduce vehicle miles traveled.

If the state doesn't take these actions—and while they would be smart policy, there is no guarantee it will do so—then the hoped-for (and modeled) changes in VMT and greenhouse gases simply won't occur.  But there's nothing in the plan to pick up the slack, and meanwhile these dubious assumptions will have rationalized spending billions of dollars of irreplaceable public capital on projects that increase driving (just as the climate crisis grows worse).

Failure to include pricing in transportation demand modeling and project evaluation

There’s a profound contradiction in the RTP’s treatment of road pricing.  When it comes to climate strategy, and funding adequacy, the RTP assumes that pricing is a done deal.  When it comes to modeling traffic demand, and especially the need for added capacity, it simply ignores the effects of pricing.

The work that has been done on pricing shows that if the state implements any of the proposed pricing mechanisms (Regional Mobility Pricing or RMPP; tolling on the I-205 Abernethy Bridge or the Interstate Bridge), the region will not need to build any new capacity.  A particularly stark analysis was prepared by ODOT consultants showing that highway pricing (the RMPP) alone—and leaving the Rose Quarter in its current configuration—would be more effective in reducing traffic delays, congestion, VMT and greenhouse gases than spending $1.9 billion widening this 1.5 mile stretch of roadway.  Yet Metro has refused to examine the greenhouse gas implications of these project alternatives, and won’t even apply such tools to project evaluation.

The strategy assumes that the state and region institute a stringent per mile pricing of freeways and arterials for purposes of estimating climate compliance, but the transportation modeling used to justify new project and capacity assumes that the roads are unpriced.

New revenue mechanisms in the STS include a road user charge that levies per-mile fees on drivers, carbon taxes, and additional road pricing beyond what is currently included in the 2023 RTP. These changes are not reflected in the RTP because they are not yet adopted in state policies or regulations, but the climate analysis for the RTP is allowed to include them because these state-led pricing actions are identified in STS and were assumed when the state set the region’s climate targets.
(Emphasis added).

The net effect of including the effects of as-yet-unadopted pricing for climate analysis, but not including it in travel demand analysis for capacity expansion projects, is to create a falsely optimistic picture of climate progress, and a falsely exaggerated picture of the need for additional capacity.

The Cop-Out:  We’re following state rules

Metro’s RTP asserts that “this is fine” for climate because they are following LCDC rules for their land use plan which are designed to address climate change.  LCDC has adopted a “Climate Friendly and Equitable Communities” (CFEC) rule that requires Metro to plan to reduce VMT.  The key problem is that the CFEC rule is based on the same flawed ODOT analysis as the Metro RTP:  making wildly unsupportable assumptions about the rapid adoption of clean vehicles.

Complying with the LCDC rule doesn’t put the region on track to reduce driving or transportation greenhouse gases, and doesn’t demonstrate how we will comply with the legally adopted state goal to reduce greenhouse gases to 25 percent of 1990 levels by 2050:

468A.205 Policy; greenhouse gas emissions reduction goals. (1) The Legislative Assembly declares that it is the policy of this state to reduce greenhouse gas emissions in Oregon pursuant to the following greenhouse gas emissions reduction goals:

     . . . (c) By 2050, achieve greenhouse gas levels that are at least 75 percent below 1990 levels.

Instead, Metro asserts that its RTP conforms to LCDC regulations governing land use plans.  The RTP makes no mention of ORS 468A.205.

Both the LCDC rules and the Metro RTP are based on badly flawed modeling of greenhouse gas levels.  The modeling makes a series of incorrect and unsupported assumptions about vehicle fuel efficiency and emissions reduction technology.  As a result, the modeling wildly understates the actual level of greenhouse gases produced by transportation, and wildly overstates the current and future reductions in greenhouse gases due to greater efficiency.

The 2022 LCDC "Climate Friendly and Equitable Communities" Rule relies on 2016 modeling prepared by former ODOT employee Brian Gregor.  These figures have not been updated, despite a legal requirement that they be.

Metro claims to have done additional modeling with its "Vision Eval" model.  That modeling assumes that average vehicle ages fall to less than seven years, and that passenger cars make up more than 70 percent of household vehicles.  As we've demonstrated, not only are both these assumptions wrong, market trends are moving in the opposite direction of Metro's forecast:  cars are getting older and larger, not smaller and newer (and cleaner) as assumed.

Metro is counting on improved vehicles and fuels for more than 90 percent of greenhouse gas emission reductions.  Appendix J of the RTP projects that the plan (which relies on pricing which is still speculative) will result in an 88 percent reduction in transportation GHG, with 81 percent reduction from fuels and vehicles, and 7 percent reduction from reduced VMT.  That means that 92 percent (81/88) of the reduction in greenhouse gases comes from policies other than those in Metro’s RTP.
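The 92 percent figure is simple arithmetic on the Appendix J percentages quoted above; a minimal sketch (the figures are those cited in the text, not independently verified):

```python
# Projected transportation GHG reductions per RTP Appendix J, as cited above:
# an 88 percent total reduction, of which 81 points come from cleaner fuels
# and vehicles, and 7 points from reduced VMT.
total_reduction = 88
from_fuels_and_vehicles = 81
from_reduced_vmt = 7

# Share of the projected reduction that depends on policies outside the RTP
# (state and federal fuel and vehicle policy), not on the plan itself.
outside_share = from_fuels_and_vehicles / total_reduction
print(f"{outside_share:.0%} of the reduction comes from outside the RTP")  # → 92%
```

In other words, on Metro's own numbers, the plan's own contents account for less than a tenth of the promised emission reductions.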

These heroic and wildly exaggerated assumptions about improved vehicle fuel efficiency enable Metro to plan for only an extremely modest reduction in VMT.

The RTP is climate denial

Metro leaders talk a good game about climate.  They point to their nearly ten-year old Climate Smart Strategy.  They acknowledge the reality of climate change, and the general need to reduce greenhouse gases.  They’ve listened to national experts who point out the problems with traditional planning approaches.

In spite of all this, the RTP remains what it has always been, a highway-centric spending wish list.  All this version does, is add on an additional layer of rationalization to insist that the region continue building roads on the elaborate and plainly false assumptions that cars will become vastly cleaner, and ODOT will aggressively price roads and carbon.  The plan is still replete with billions of dollars of spending to increase highway capacity, including the $7.5 billion Interstate Bridge Replacement Project and the Rose Quarter.  These highway expansions facilitate continued car dependence and increased greenhouse gas emissions.

Like Metro’s so-called Climate Smart Strategy, the climate provisions in the RTP are a at best an afterthought, and a performative fig-leaf, meant to provide rhetorical cover to a vast investment strategy that is fundamentally at odds with reducing greenhouse gas emissions.

Metro has promised to update its “Climate Smart Strategy” from 2014, but in fact it hasn’t.

Clicking on the "climate smart strategy" link takes you to a nine-year-old document that hasn't been updated.  This is what still appears on the Metro website.

Metro’s real climate strategy is “Don’t look up.”

Metro’s RTP needs to examine the travel impacts of tolling and new capacity expansion

Metro claims that its travel modeling can’t really discern the effects of tolling on regional travel patterns, and instead of specific quantitative outputs it simply offers a series of descriptive, generalized statements—”qualitative findings”— about the impact of tolling.

The large-scale, aggregate nature of Metro’s travel model makes it challenging to detail the regional impacts of any single project, even one as potentially significant as tolling. Instead of attempting to isolate the impacts of tolling, Metro staff identified several qualitative findings about tolling’s impacts based on the modeling results for the constrained RTP scenario and on Metro’s experience supporting tolling analyses in the region

System Analysis Public Review Draft 2023 Regional Transportation Plan | July 10, 2023 (Chapter 7, p. 7-28).

It is, in fact, possible and proven to estimate the effect of new highway capacity on travel patterns and greenhouse gas emissions.  In contrast, California and CalTrans have developed tools specifically to analyze the carbon impacts of individual projects:  the Induced Travel Calculator.  This calculator has been adapted to Oregon by the Rocky Mountain Institute.  Metro could use this calculator to estimate the carbon emissions associated with highway expansion projects.  But in a bit of science denial, the Oregon Department of Transportation has specifically banned the use of induced travel analysis in state highway modeling.

The climate fraud in Metro’s Regional Transportation Plan

Metro’s Regional Transportation Plan rationalizes spending billions on freeway expansion by publishing false estimates and projections of greenhouse gas emissions

Transportation is the number one source of greenhouse gases in Portland.  For nearly a decade, our regional government, Metro, has said it is planning to meet a state law calling for  reducing greenhouse gas emissions 75 percent by 2050.

But the latest Metro Regional Transportation Plan (RTP) has simply stopped counting actual greenhouse gas emissions from transportation.

Inventories compiled by the state, the city of Portland and the federal government all show the region’s transportation emissions are going up, not down as called for in our plan.

In place of actual data, Metro and other agencies are substituting fictitious estimates from models; these estimates incorrectly assume that we are driving smaller cars and fewer trucks and SUVs, and rapidly replacing older cars.  None of those assumptions are true.

As a result, greenhouse gases are going up, our plans are failing, and Metro's Regional Transportation Plan—the blueprint for spending billions over the next several decades—will only make our climate problems worse.

This may be our last, best chance to do something to reduce greenhouse gas emissions from the largest and fastest growing source of such pollution in the state and region. Metro’s federally required Regional Transportation Plan is supposed to reconcile our transportation investments with our social and environmental goals.  Instead the draft RTP simply lies to the public about worsening greenhouse gas emissions, the failure of current efforts, and the inadequate and counterproductive aspects of the proposed RTP.

Portland and Oregon leaders proudly celebrate our acknowledgement of the gravity of the climate crisis and our oft-professed commitment to reduce greenhouse gas emissions.  For the mass and social media, there’s soaring rhetoric.

In the bureaucratic backrooms, though, it's pollution as usual.  Nowhere is this more clear than when it comes to roadbuilding.  Oregon is embarking on the largest and most expensive highway expansion effort in 50 years, proposing to spend more than $10 billion in the Portland area on highways. All of those billion-dollar-plus highway expansion projects are contained in Metro's proposed 2023 Regional Transportation Plan.

This, in spite of the fact that transportation is the largest and fastest growing source of greenhouse gases, that transportation emissions are higher now than they were in 1990, and that every one of the state, regional and local plans to reduce transportation greenhouse gases is clearly failing.

State and regional transportation plans fail to acknowledge the grim reality of increasing transportation greenhouse gases (GHGs).  Instead, they conceal the fact that our transportation emissions are increasing by ignoring actual inventory data and reporting fictional results from their own models, which rest on rosy and unsupportable assumptions about future technology, market trends and policy.  In essence, these plans pretend that transportation GHGs are already decreasing, and will decrease even more dramatically in the future.

By steadfastly ignoring increasing emissions, Metro and the State of Oregon have simply ignored pledges made in their original climate planning to regularly measure progress, not in terms of checklists, but in terms of actual, measured reductions in greenhouse gas emissions.

Transportation and Climate:  Plans ignore reality

It’s been a decade since Metro’s first Climate Smart Plan in 2014, which promised to put the region on track to meet state greenhouse gas reduction goal—reducing emissions 75 percent from 1990 levels by 2050.

Since then, the climate crisis has grown manifestly more urgent, locally epitomized by weeks of suffocating smoke from climate-caused fires, record 116-degree heat that killed dozens (and likely more), and steadily warming oceans and melting glaciers and icecaps.

The clock is ticking; we've used up a quarter of the time we have to achieve our 2050 goal.  Now would be a good time to consider whether what we're doing is working.  This question is especially salient given Metro's consideration of the 2023 Regional Transportation Plan, which will spell out the course of transportation investment for the next five years (and following decades).  Since transportation is the largest source of greenhouse gases in the city, region and state, this plan will be crucial to achieving our goals.

All evidence shows that Metro’s “Climate Smart Plan” has failed completely to reduce greenhouse gases.  Every independent inventory of transportation GHGs shows that emissions have increased since the plan was adopted.  The region already emits more transportation GHGs than it did in 1990; and the authoritative DARTE database found that regional transportation emissions are up 20 percent in the past five years.  And bafflingly, Metro’s RTP climate monitoring doesn’t even bother to report on emission trends.

Instead, the plan relies on its own optimistic modeling of future trends.  The problem is that the plan itself is founded on wildly unrealistic and already disproven assumptions about the rapid adoption of cleaner vehicles.  State and local transportation officials confidently predicted a decade ago that we'd rapidly replace older, larger, dirtier vehicles with cleaner, newer ones.  In fact, the opposite has happened:  the average age of vehicles in Oregon is now up to 14 years, and heavier, dirtier trucks and SUVs make up nearly 80 percent of new vehicles sold.  We're nowhere near on track to achieve our greenhouse gas reduction goals.

But the plan assumes, falsely, that the average age of cars is about six years, and that two-thirds of vehicles are smaller, cleaner passenger cars.  It uses these assumptions to predict that greenhouse gas emissions will fall rapidly.  And even though reality has shown these assumptions to be wrong, modelers have doubled down on them, and now assume, for example, that cars will be replaced even faster than they thought a decade ago, even as the fleet gets older and older.

We’re failing to achieve our goal:  Transportation GHGs are increasing

Transportation emissions are the largest source of greenhouse gas emissions in Portland and in Oregon.  Transportation emissions account for 41 percent of greenhouse gas emissions in Multnomah County, and 32 percent of emissions statewide.

It’s good to have ambitious plans.  But ultimately, those plans have to work in the real world.  Locally, we have three different real world estimates of transportation greenhouse gases:  The federally sponsored DARTE database, a geographically detailed nationwide estimate of greenhouse gases broken down to 1 kilometer squares cover the entire nation, the Department of Environmental Quality’s annual statewide estimates of Oregon greenhouse gas emissions by source (residential, commercial, industrial, electricity generation and transportation), and Multnomah County’s annual accounting of local greenhouse gas emissions.  Every one of these estimates shows we are failing to reduce transportation greenhouse gases.

When it comes to transportation, we're not making any progress in reducing our greenhouse gas emissions; in fact, greenhouse gas emissions are higher than in 1990 in Multnomah County (up 3 percent), the Portland metro area (up 27 percent) and statewide (up 19 percent).  We're going in the wrong direction.

State, regional and local climate plans are failing

And since we adopted city, regional and state plans to reduce transportation emissions (the Portland Climate Action Plan in 2015, the Metro Climate Smart Strategy in 2014, and the State Transportation Strategy in 2013), transportation emissions have increased, not decreased.  From 2013 (the year before these climate plans took effect) through 2019 (the last full year prior to the pandemic), greenhouse gas emissions from transportation have risen.

Oregon transportation GHG emissions are up 2.7 percent per year since 2013, Portland regional emissions are up 4.9 percent per year, and Multnomah County emissions are up 1.4 percent per year.  Transportation emissions are going up when our plans call for them to be going down.  The result is a yawning and unacknowledged gap between our plans and reality.  The DARTE data show the region going rapidly in the wrong direction.
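Compounding makes clear how quickly those annual rates accumulate over the 2013–2019 period; a back-of-the-envelope sketch using the per-year rates reported above:

```python
def cumulative_growth(annual_rate_pct: float, years: int) -> float:
    """Total percent increase from compounding an annual growth rate."""
    return ((1 + annual_rate_pct / 100) ** years - 1) * 100

# Annual transportation GHG growth rates reported above, compounded over
# the six years from 2013 to 2019.
for name, rate in [("Oregon", 2.7), ("Portland region", 4.9), ("Multnomah County", 1.4)]:
    print(f"{name}: {cumulative_growth(rate, 6):.0f}% total increase")
```

Even Multnomah County's comparatively modest 1.4 percent annual rate compounds to roughly a 9 percent total increase over six years; the Portland region's 4.9 percent rate compounds to roughly a third.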

All of the available independent inventory data for the state, city and region make it clear that our transportation emission reduction plans are failing in monumental fashion to achieve their goals.

Climate plans haven’t been adjusted to reflect reality

Increased transportation greenhouse gases should be triggering stronger efforts to fight climate change. Metro committed to monitor the progress and implementation of its Climate Smart Strategy, and to take additional measures as needed.  This commitment appears in the Climate Smart Plan and is reiterated in the latest draft of the 2023 Regional Transportation Plan.  (RTP 2023 Draft, Appendix J, page 21)


Metro’s RTP fails to report increasing transportation greenhouse gas emissions

Despite these commitments, Metro's RTP does not accurately report on regional greenhouse gas emission trends. It does not acknowledge that, contrary to the 2014 CSS and the 2018 RTP, transportation greenhouse gas emissions are increasing, not decreasing. The 2023 RTP contains no graph or time series information on transportation greenhouse gases in Portland; it contains only a single reference to the per capita level of greenhouse gas emissions in 2023 and 2045; both of these figures are obtained from Metro's model, not from actual inventories of greenhouse gas emissions prepared by independent agencies.

We are “deviating significantly” from our earlier projections and plans, but we haven’t acknowledged it, and therefore, aren’t proposing to change our plan.

The RTP substitutes inaccurate models for actual data

ODOT, Metro, and LCDC are substituting flawed and biased models for actual data about carbon emissions.  Transportation greenhouse gas emissions are increasing, yet all these agencies pretend, based on inaccurate models, that they’re making progress toward reducing greenhouse gases.  The actual data show that vehicles on the road today (and tomorrow) are vastly older and dirtier than assumed in the models these agencies use to falsely portray their climate progress.

Both the LCDC rules and the Metro RTP are based on flawed modeling of greenhouse gas levels.  The modeling makes a series of incorrect and unsupported assumptions about vehicle fuel efficiency and emissions reduction technology.  As a result, the modeling significantly understates the actual level of greenhouse gases produced by transportation, and overstates the current and future reductions in greenhouse gases due to greater efficiency.

The 2022 LCDC "Climate Friendly and Equitable Communities" Rule relies on 2016 modeling prepared by former ODOT employee Brian Gregor.  These figures have not been updated, despite a legal requirement that they be.

For the current RTP, Metro claims to have done new modeling with its “Vision Eval” model.  That modeling assumes that average vehicle ages fall to less than seven years, and that passenger cars make up more than 70 percent of household vehicles.

Both Gregor's and Metro's climate modeling assumes we will quickly replace the existing fleet of large, dirty, fossil-fueled vehicles with newer, smaller, more efficient vehicles powered by electricity and/or clean fuels.  The modeling asserted that the amount of carbon pollution generated by each mile of vehicle travel would be 80 percent less than it is today.  Unfortunately, we're nowhere close to being on this trend.

The key assumptions are average vehicle age and the mix of trucks and SUVs, and Metro and LCDC rely on projections of these that have already been proven wrong.  Metro and LCDC assumed, critically and incorrectly, that the vehicle fleet would turn over more rapidly (dirty, older cars would be replaced more frequently by newer, cleaner ones) and that consumer preferences would shift from larger, dirtier trucks and SUVs to smaller and cleaner passenger vehicles.  Not only are both of these assumptions wrong, exactly the opposite has happened over the past decade:  the average age of automobiles has increased significantly, and the share of light trucks and SUVs has grown to almost 80 percent of new car sales.  The following RTP table summarizes Metro's assumptions:

Metro's assumptions are simply wrong:  the average car on the road today is vastly dirtier than assumed in Metro and LCDC modeling.  In essence, the climate modeling assumes that the typical car in today's fleet is a relatively clean six-year-old Honda Civic that emits about 257 grams per mile.  In reality, the typical vehicle in today's fleet is a twelve-year-old quarter-ton pickup truck that emits about twice as much greenhouse gas—555 grams per mile.

2023 Model assumption:  Typical car is a 2017 Honda Civic; 2023 Reality:  Typical vehicle is a 2010 Ford F-150.

These two mistakes in the Metro/LCDC modeling lead them to understate greenhouse gas emissions from the current fleet by 50 percent.
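The roughly 50 percent understatement follows directly from the two per-mile figures above; a quick check (the grams-per-mile values are those cited in the text):

```python
# Per-mile emission rates cited in the text (grams CO2 per mile):
assumed_g_per_mile = 257   # modeled typical vehicle (~2017 Honda Civic)
actual_g_per_mile = 555    # actual typical vehicle (~2010 Ford F-150)

# How far the modeled assumption falls short of reality:
ratio = actual_g_per_mile / assumed_g_per_mile            # ≈ 2.2x
understatement = 1 - assumed_g_per_mile / actual_g_per_mile  # ≈ 54%, i.e. "about 50 percent"
print(f"Real emissions are {ratio:.1f}x the assumption; "
      f"the model understates them by {understatement:.0%}")
```

Put differently, assuming the Civic instead of the pickup cuts the modeled per-mile emission rate roughly in half before any projection of future improvement even begins.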

And these errors also affect future years.  The growing longevity of the vehicle fleet means that the future fleet will be less efficient (and much dirtier) than assumed in Metro’s modeling.  If the average age of vehicles stabilizes at the current 12 years, the median vehicle in 2035 will be a 2023 model year vehicle (eighty percent of which were larger, more polluting SUVs).  Fleet turnover will happen much more slowly, and emission rates will decline more slowly still.

Metro and LCDC projections assume that average emissions of GHGs will fall from about 450 grams per mile to about 100 grams per mile in 2045.  In reality, GHG emissions per mile are falling far more slowly.  In 2021, the average vehicle emitted about 390 grams per mile rather than the roughly 300 grams per mile assumed in Metro and state climate modeling.

The RTP should be based on actual, honest data about greenhouse gas emissions

The first step is to accurately report our progress—actually backsliding—in reducing transportation GHGs.  Instead of reporting claims based on models with false and now-discredited assumptions, the RTP needs to show that actual GHG emissions are rising, and present a clear case for why this has happened:  because we're keeping cars longer, buying bigger, dirtier vehicles, driving more, and not improving fuel efficiency as fast as the excessively optimistic assumptions made a decade ago.  We have to "mark to market" our forecasts:  replace decade-old guesses about what our transportation emissions would be with actual data on what we've really accomplished.

Once we've done that, we'll see that we need to do much more, and do it far more quickly than we thought.  It's been nine years since Metro adopted its Climate Smart Strategy in 2014.  Those nine years represent fully one-fourth of the time available to get the region on track to meet its goal of reducing greenhouse gases by 75 percent by 2050.  During those nine years, regional transportation greenhouse gas emissions have actually risen (by more than 20 percent, according to the DARTE inventory).  That means we have a bigger task, and a shorter period of time to accomplish it.  This simply isn't reflected in the Regional Transportation Plan, in state land use regulations, or in the Oregon Department of Transportation's "State Transportation Strategy" (STS).

Appendix:  Vehicles are older, larger and dirtier than assumed in Metro climate models

The strategy assumes trends in vehicle type, fuel efficiency and fleet replacement that are the opposite of what we’ve experienced.  All of these errors lead to understating GHG emissions.

REALITY:  Average Vehicle Age is Increasing

Slower fleet turnover means that the vehicles on the road are, on average, older and dirtier.  State modeling assumes that older vehicles are being replaced quickly, with the average age of a vehicle being 6 or 7 years.  In reality, the average vehicle is more than 12 years old.  The Oregon Department of Transportation reports that the average age of vehicles in Oregon (14 years) is higher than the national average and is increasing.  The climate modeling is wildly off:  the fleet is getting older, and the models assumed it would be getting younger.

The slow rate of fleet replacement is a particularly large problem for the modeling.  With an average age of 12 years, the median vehicle in 2035 will be a 2023 model.  Those vehicles average about 330 grams per mile.  That’s about 80 percent higher than the 180 grams per mile that state modeling assumes for the fleet in 2035.  The increasingly long life of vehicles locks in a high carbon emission rate.
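The arithmetic behind this lock-in effect is simple enough to check directly.  Here is a minimal sketch, using only the figures cited in the paragraph above (the 12-year average age and the gram-per-mile rates are taken from the text, not computed independently):

```python
# Quick check of the fleet-age arithmetic cited above.
AVG_FLEET_AGE = 12          # years; IHS Automotive estimate cited in the text
TARGET_YEAR = 2035

# If the average vehicle is 12 years old, the median vehicle on the
# road in 2035 will be roughly a 2023 model.
median_model_year = TARGET_YEAR - AVG_FLEET_AGE
print(median_model_year)    # 2023

# Compare the actual emission rate of 2023 models with the rate the
# state modeling assumes for the 2035 fleet (both figures from the text).
actual_g_per_mile = 330     # average for 2023 model-year vehicles
assumed_g_per_mile = 180    # state modeling assumption for 2035

overshoot = (actual_g_per_mile - assumed_g_per_mile) / assumed_g_per_mile
print(f"{overshoot:.0%}")   # 83% -- "about 80 percent higher"
```

The point of the sketch is that no modeling subtlety is involved:  as long as the fleet ages at the observed rate, the 2035 fleet's emission rate is largely determined by vehicles already sold.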

The average age of vehicles on the road has increased to more than 12 years according to IHS Automotive.

REALITY:  Trucks and SUVs make up nearly 80 percent of new car sales. 

Fewer passenger cars, more light trucks and sport utility vehicles.  State modeling assumed that the share of trucks and SUVs would decline steadily, and that 60 percent or more of all private vehicles would be passenger cars, which use less fuel and emit fewer greenhouse gases.  In reality, nearly 80 percent of new vehicles sold today are light trucks and sport utility vehicles.  The climate modeling is off by a factor of three, with passenger cars accounting for 20 percent of new sales, not 60 percent.

Rose Quarter: So expensive because it’s too damn wide

The cost of the $1.9 billion Rose Quarter freeway is driven by its excessive width

ODOT is proposing to more than double the width of the I-5 Rose Quarter Freeway through the Albina neighborhood

ODOT could easily stripe the roadway it is building for ten traffic lanes

The high cost of building freeway covers stems from the project’s excessive width

WSDOT plans to cover I-5 in Vancouver for less than $40 million

The fundamental problem with the Rose Quarter project, and the reason why it has blown through its budget, is that it is really a massive freeway widening project.  The agency claims it’s just adding a couple of “auxiliary” lanes, but in reality, it’s doubling the width of Interstate 5 in a complex urban environment, and its plans for a much wider roadway are the principal reason the project, and its covers, are so expensive.

A too wide freeway.

What no one seems willing to do is ask basic questions about the Rose Quarter.  Is the project worth $1.9 billion?  Does it even need to be that big and expensive?  Isn’t the skyrocketing cost and ODOT’s growing fiscal crisis a signal that we should consider some other options?

The high cost and prodigious cost overruns of the Rose Quarter are directly related to the excessive width of the project, something that ODOT has gone to great lengths to conceal, characterizing the project as merely adding a single auxiliary lane in each direction. In reality, the project would essentially double the width of I-5 through the Rose Quarter, from its current 82-foot width, to 160 feet (and in some places as much as 200 feet).

A brief chronology shows how ODOT staff have repeatedly concealed or obscured the width of the I-5 Rose Quarter project.  Their initial 2019 Environmental Assessment presented a misleading and cartoonish freeway cross-section that appeared to show that the freeway would be widened to about 126 feet.

City Observatory challenged these claims about the width of the freeway to the Oregon Transportation Commission in December 2020, and the commission directed the staff to meet with us to discuss the issue.  The staff refused to answer any questions during this meeting, and instead later issued a written report obfuscating the actual width of the freeway.

In March 2021, No More Freeways obtained three different internal project documents indicating that the actual width of the roadway would be 160 feet.  These included 2015 engineering drawings, as well as architect’s illustrations and computerized CAD files.

As we’ve pointed out at City Observatory, this cross-section could easily accommodate  10 travel lanes, and regardless of ODOT’s labeling, once built, the road could be re-striped in an afternoon.

Even the project’s Supplemental Environmental Assessment, released in November 2022 conceals the actual width of the project.  Here is the project’s own plan showing the freeway cross-section.  The plan omits measurements, so we’ve added scale markings showing 200 foot widths.

ODOT plans for I-5 Rose Quarter Freeway (200′ scale marking added by City Observatory)

ODOT’s own consultants, the internationally recognized engineering firm ARUP, concluded that the Rose Quarter project was vastly wider than it needed to be.  They pointed out that no comparable urban freeway in any city has the over-wide 12 foot shoulders designed into the Rose Quarter project.  ARUP concluded that the extreme width of the ODOT design was the principal reason freeway covers cost so much, and said the freeway could be 40 feet narrower than ODOT’s design.  ODOT’s own “Cost to Complete” report concedes that a key cost driver is the need to lower the surface of the existing roadway in order to provide the necessary vertical clearance over the much thicker overpass beams that will be needed to span the wider roadway.

Covers alone could be vastly cheaper

If this project consisted simply of building a cover over the existing I-5 freeway, it would be vastly cheaper.  Washington’s Department of Transportation is proposing to build a similar cover over a portion of I-5 in Vancouver as part of the Interstate Bridge Replacement Project.  The cover, called the “Community Connector,” is designed to re-connect historic Fort Vancouver with the city’s downtown.  It will be about 300 feet wide and about an acre in size, and is estimated to cost $37 million.

Vancouver’s proposed Community Connector would cover I-5 for just $37 million

ODOT has never explored simply building a lid over the existing freeway to “re-connect” the community.  If this were simply about building a cover to re-connect the community, it could have been done by now for a fraction of the $115 million ODOT has spent so far, just on planning the Rose Quarter.

What to do instead:

ODOT could cap the I-5 freeway at the Rose Quarter without widening it.  And if ODOT is really committed to “restorative justice,” it could reallocate the money available for this project as reparations to the Albina community, and allow them to spend it however they see fit to rectify the damage done by the construction of I-5, Interstate Avenue and the Fremont Bridge ramps.  Oregon routinely spends highway funds mitigating the environmental damage of its freeways, on everything from sound walls to wetlands.  It has also used highway funds to replace displaced structures (the old Rocky Butte Jail), and other states have used federal highway funds to replace housing destroyed by freeway construction.  If we were serious about redressing the harm done to the Albina neighborhood, we’d be looking to reduce the size of I-5, spend more money improving the neighborhood, and build the housing ODOT destroyed.


Rose Quarter: Death throes of a bloated boondoggle

For years, we’ve been following the Oregon Department of Transportation’s tortured plans to widen a 1.5 mile stretch of I-5 near downtown Portland.  The past few months show this project is in serious trouble.  Here’s a summary of our reporting on key issues:

Another exploding whale:  The cost of the Rose Quarter has quadrupled to $1.9 billion.  In 2017, the project was sold to the Oregon Legislature based on an estimated price of $450 million.  Since then, ODOT has diverted nearly all of the money earmarked for this project to other freeway expansions.

ODOT’s Plan:  Extend and Pretend.  Governor Kotek forced ODOT to prepare a financial plan for its massive freeway expansion program.  ODOT now admits the Rose Quarter faces a $1.35 to $1.75 billion financial hole, with no identified solution.

Pens Down:  ODOT staff claim it’s too late to question the design of the bloated $1.9 billion Rose Quarter Freeway widening, even though they also say it’s only 30 percent designed, and they have a new design the public hasn’t seen yet.


The Rose Quarter project is so expensive because it’s too damn wide; Just up the road in Vancouver, the Washington Department of Transportation is planning an acre-sized freeway cover over I-5 to connect downtown Vancouver to historic Fort Vancouver for a mere $40 million.

Who sold out the Historic Albina Advisory Board?  ODOT has advertised its freeway widening project as a way to promote restorative justice for the historically Black Albina neighborhood it destroyed with decades of highway construction.  But now ODOT can’t fund the Rose Quarter project, because it took the money earmarked for the Rose Quarter and used it to widen a suburban freeway bridge.

Lying about freeway width:  For years, ODOT has been concealing the actual width of the Rose Quarter project, and deceiving the public about its plans for a 10-lane highway.

One-tenth of one-percent:  What Black contractors got from ODOT’s biggest construction project.  While ODOT claims to want to help Black contractors, its current largest construction project, the I-205 Abernethy Bridge, has spent just one-tenth of one percent of its budget with Black contractors.


ODOT’s I-205 Bridge: 1/10th of 1 Percent for Black Contractors

The Oregon Department of Transportation (ODOT) is falling short of its own goals of contracting with disadvantaged business enterprises

One-tenth of one percent of I-205 contracts went to Black construction firms

ODOT professed a strong interest in helping Black contractors as a selling point for the I-5 Rose Quarter project, but instead advanced the I-205 Abernethy Bridge project, which has provided very few opportunities for Black-owned firms.

ODOT has been dangling promises of lucrative construction contracts for Black construction firms if its proposed $1.9 billion I-5 Rose Quarter freeway widening project goes forward.  Not surprisingly, as we reported earlier, these firms and many in the local community were angered to hear that the Rose Quarter project was being delayed, probably indefinitely, because ODOT lacks funds—and shifted the funding it did have to a different project.

ODOT has prominently advertised that it intended to hire Black contractors to undertake a significant portion of the I-5 Rose Quarter project.  Even though the project was, according to the agency, only about 15 percent designed in 2020, the agency signed a contract with a joint venture (Sundt/Raimore).  One of the partner firms, Raimore Construction, is Black-owned.  An article published in The Oregonian quoted ODOT officials as saying that Raimore would be expected to bill more than $100 million in project costs (this when the project’s price tag was estimated at a mere $795 million).

Now it appears that the Rose Quarter project is going nowhere fast.  ODOT’s latest financial plan reports that the agency has almost no money to meet the project’s $1.9 billion construction cost.

A lot of this has to do with the project’s exploding cost, but as we noted earlier, ODOT diverted the funding that was originally earmarked for the Rose Quarter (a project in Oregon’s historically Black Albina neighborhood) and instead used it to pay for the I-205 Abernethy Bridge project in the wealthier and much whiter suburb of West Linn.  The members of the Historic Albina Advisory Board, which include Black contractors, have expressed anger that ODOT isn’t moving forward with the Rose Quarter project—which now has a $1.35 to $1.75 billion funding deficit.

A disparity in Black contracting

The $622 million I-205 Abernethy Bridge project is the biggest source of highway construction contracts in Oregon in nearly half a century, according to ODOT.  But so far, African-American construction firms have gotten just one-tenth of one percent of contract payments, according to ODOT reports.  While the I-5 Rose Quarter project was supposed to provide $100 million in contract payments for its lead Black contractor (and likely more for subcontractors), Black contractors have so far gotten just $142,000 of the $126 million disbursed for the Abernethy Bridge.

Apparently, ODOT is only committed to hiring Black contractors when they can provide politically valuable leverage for a project in Northeast Portland.

The one big ODOT highway project that is moving forward has provided only a tiny amount of work for African-American contractors.  ODOT’s I-205 project dashboard shows that the agency is well behind in meeting its diversity goals and that only about one-tenth of one percent of contracted payments have gone to African-American firms.

ODOT set a goal of placing 14 percent of contract revenues with certified disadvantaged business enterprises (which include women-owned and minority-owned businesses).  To date, ODOT has disbursed $126 million to all contractors, and its current dashboard shows $12 million has gone to certified disadvantaged business subcontractors, only about 9 percent of the total, well below its goal.  The bulk of these funds have gone to businesses owned by women, Asian-Americans, and Native-Americans.

Only about $142,000 out of $126 million, a little more than one-tenth of one percent, has gone to African-American businesses.
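The shares are easy to verify from the disbursement figures reported above; a quick check:

```python
# Quick check of the contracting-share figures cited above (all from the text).
total_disbursed = 126_000_000   # total payments to all contractors to date
black_owned = 142_000           # payments to African-American firms
dbe_total = 12_000_000          # payments to all certified DBE subcontractors

black_share = black_owned / total_disbursed
dbe_share = dbe_total / total_disbursed

print(f"{black_share:.3%}")     # 0.113% -- "one-tenth of one percent"
print(f"{dbe_share:.1%}")       # 9.5% -- well below the 14 percent goal
```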

The $622 million that ODOT plans to spend on the I-205 Abernethy Bridge makes it the biggest source of contracting opportunities in recent memory.  On its website, ODOT brags:

“The I-205 project is the largest ODOT highway project in 45 years.”

If ODOT were seriously interested in bolstering opportunities for African-American businesses there’s no reason that they should not be able to qualify for contracts.

Who sold out the HAAB?

The members of ODOT’s “Historic Albina Advisory Board” (HAAB) are hopping mad.  As related by Jonathan Maus at Bike Portland, they feel betrayed by a decision to postpone construction of the $1.6 billion I-5 Rose Quarter freeway widening project.

For years, the staff of the Oregon Department of Transportation have been promising the HAAB a bonanza of community improvements and lucrative construction contracts as part of its I-5 Rose Quarter freeway widening project.  A key part of ODOT’s marketing of the freeway widening is a claim that highway covers (really oversized overpasses) will be instrumental in providing restorative justice to the Albina neighborhood that was ripped apart by three different ODOT highway projects over several decades.

At its June 27, 2023 meeting, ODOT staff dropped the bombshell that after more than five years of planning, ODOT simply doesn’t have the money to pay to actually build the Rose Quarter project.  Members of the HAAB feel they’ve been betrayed.


‘This is not okay’: Black committee members respond to Rose Quarter funding shortfall at emotional meeting


ODOT staff tried to claim that the project’s apparent demise was because of a May  decision to suspend tolling for at least two years, to 2026.  At the HAAB meeting on June 27, Brendan Finn squarely put the blame on Governor Tina Kotek’s decision to postpone tolling in Oregon:

Something’s happening down in Salem that I want to share with all of y’all . . . we have been moving forward on two separate tolling programs.  The Rose Quarter project is intertwined with those tolling programs in that they are supposed to help pay for portions of construction  . .  . we’ve known going through the design process together over the years that this project is under-funded— it was way underfunded.  . . . Governor Kotek came into office   . . . and said to us you got to take a little bit more time with tolling . . . so she delayed the implementation of tolling . . . that has reverberations on all of our projects and the timing of implementation . . . we have put together a a financial plan for for these pieces that takes into account the fact that we are not going to be getting the revenue from tolling.

As a result, Finn conceded, the Rose Quarter project would be put on life support, with barely enough money to keep planning moving forward, and construction delayed for at least two years (and likely much longer).  The members of the HAAB could tell they were in deep trouble, but Finn’s explanation—effectively blaming Governor Kotek’s suspension of tolling—isn’t right.  The actual cause of the project’s demise is much different.  Every step of the way over the past five years, ODOT has taken actions that undercut the progress of the Rose Quarter project (in Portland’s historically Black Albina neighborhood) and instead elevated and accelerated another project, the $622 million rebuilding of the I-205 Abernethy Bridge in the wealthy and predominantly white suburb of West Linn.

Along the way, ODOT:

  • “Found” money to move the I-205 project forward when the Legislature appropriated nothing for its construction
  • Diverted state gas tax funds originally earmarked by the 2017 Oregon Legislature for the Rose Quarter to pay for the I-205 bridge
  • Used Rose Quarter funding to enable the I-205 bridge to circumvent federal environmental review (which has delayed the Rose Quarter project)
  • Accelerated signing construction contracts for the I-205 bridge, putting it ahead of the Rose Quarter in line for state funding
  • Proceeded with the I-205 bridge even though its cost has increased by 150 percent since 2018, from $250 million to $622 million
  • Officially told the federal government that it wasn’t “reasonably foreseeable” that the Rose Quarter project would be financed by tolling revenues

As a result of all these decisions, the I-205 bridge is moving forward, and ODOT, by its own admission, is committed to paying for the bridge even if the state raises no toll revenue.  Meanwhile, the Rose Quarter project is languishing, and is no closer to construction than it was six years ago.

It’s baffling that Finn would blame ODOT’s financial woes on Governor Kotek’s recent actions.  It’s been apparent for years that ODOT has lacked the money to actually build the Rose Quarter project, and Kotek has been Governor for just six months.  In 2021, as House Speaker, Kotek voted against the bill that allowed the diversion of funds from the Rose Quarter (HB 3055) and urged ODOT to “right-size” its mega-highway projects.  And in May, as Governor, Kotek finally insisted on injecting a note of fiscal realism into ODOT’s work by requiring this new financial plan for its megaprojects.  As we’ll see, all of the financial problems plaguing the Rose Quarter project pre-date the Kotek Administration and are the direct product of decisions made by ODOT staff, including Finn.

About the HAAB and the I-5 Rose Quarter Freeway Widening Project

The I-5 Rose Quarter project would widen about 1.5 miles of freeway in North and Northeast Portland.  Part of the project involves constructing a partial cover over a portion of the freeway, ostensibly to make up for the damage the freeway did in dividing the historically Black Albina neighborhood.  (Construction of I-5 in the 1960s was actually one of three ODOT projects that divided and helped trigger the decline of Albina.)  Facing community resistance to the project, in September 2020 ODOT unilaterally disbanded an earlier community advisory group—which was raising uncomfortable questions—and instead created the Historic Albina Advisory Board.  ODOT rebranded the project as contributing to “restorative justice,” in part by building the covers, and in part by implying it would hire Black contractors to do much of the work.  In City Observatory’s view, there are multiple fatal flaws with the Rose Quarter project:  it’s vastly too expensive, doesn’t fix any safety problem, won’t reduce congestion, will actually increase pollution, and doesn’t revitalize the neighborhood.

A 2017 Earmark for the Rose Quarter:  Diverted by ODOT

The Legislature’s landmark 2017 transportation package specifically included $450 million in funds for the Rose Quarter, in the form of a 2 cent per gallon statewide gas tax.  The bill contained no funding for the I-205 project.  Even so, in 2018, ODOT used its discretion to divert more than $50 million from a variety of sources to move the I-205 project forward.  Here’s a list of the funds ODOT scraped together to pay for I-205:

Then, in 2021, ODOT convinced the Legislature to pass HB 3055, opening up the $450 million set aside for the Rose Quarter project for other projects, including the I-205 Abernethy project.  ODOT quickly used that discretion to effectively commit all of that money to paying for I-205, rather than the Rose Quarter.

Evading federal environmental review for I-205 by using Rose Quarter funds

ODOT used the Rose Quarter funding diversion to evade federal environmental review of the I-205 project. ODOT assured the Federal Highway Administration that the Abernethy Bridge could be built without any toll revenues, by diverting the funds originally earmarked for the Rose Quarter.  This enabled ODOT to get an exemption from federal environmental review—a CE or “categorical exclusion.” If ODOT hadn’t offered those assurances, FHWA would have had to perform a lengthy Environmental Assessment on the I-205 bridge project (called “Phase 1a”), something that has slowed the I-5 Rose Quarter project.  Here’s the FHWA’s official finding:

Recently signed into law, Oregon House Bill 3055 provides financing options that allow Phase 1a of the I-205: Stafford Road to OR 213 Improvements Project to be constructed beginning in the spring/summer 2022 without the use of toll revenue. . .

As Phase 1a is now advancing as a separate project with independent funding, the 2018 CE decision is being reduced in scope to include only Phase 1a (the “I-205: Phase 1a Project” or “Phase 1a Project”).

[Emily Kline, “Re-Evaluation of the Categorical Exclusion for the I-205: Stafford Road to OR 213 Improvements Project,” Federal Highway Administration, May 4, 2022, page 3.]

A key reason for the Rose Quarter’s delay, despite its head-start over the I-205 bridge, is ODOT’s flawed project development process and environmental assessment.  The City of Portland pulled out of the project in 2020 citing a lack of community engagement.  And the Federal Highway Administration rescinded the project’s Finding of No Significant Impact (FONSI), in part because of flaws in the ODOT-prepared Environmental Assessment. Only the personal intervention of then-Governor Kate Brown revived the project.

ODOT gave preferential treatment to the I-205 bridge project

ODOT also chose to launch the Abernethy Bridge construction first, expediting a construction contract, even though the bridge repair came in at double ODOT’s cost estimate.  (An unnoticed part of ODOT’s new financial plan is an acknowledgment that the Abernethy Bridge project will now cost $622 million, up from $500 million a year ago.)

Now that the Abernethy project is launched, ODOT is dissembling about the role of tolls.  The agency’s finance director, Travis Brouwer, flatly contradicted the FHWA finding in his testimony to the Oregon Transportation Commission on June 28.  Brouwer said:

. . . we’ve already put the Abernethy Bridge Project out to bid based on the assumption of being able to toll this and it is under contract, under construction so we have now that the situation where if for any reason tolls on I-205 do not move forward whether that’s due to action at the federal, state or regional level it would punch a significant hole in the finance plan.

As a result, the failure to toll I-205 now will likely jeopardize funding for the Rose Quarter, because ODOT is contractually obligated to pay for the Abernethy Bridge even if tolling doesn’t materialize.

Rose Quarter:  Cost Overruns and No Funding Plan

As we’ve documented, the Rose Quarter has chalked up an impressive string of cost overruns, with new and much higher cost figures arriving every 18 to 24 months.  The project was originally budgeted at $450 million when approved in 2017, jumped to $795 million just two years later, then to $1.45 billion in 2021, and now to $1.9 billion.
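The pace of that escalation is worth making explicit.  A minimal sketch, using the estimates and dates cited in the paragraph above (the figures are the text's; the implied growth rate is our arithmetic):

```python
# Sketch of the Rose Quarter cost escalation described above.
# Cost estimates in $ millions, keyed by year, as reported in the text.
estimates = {2017: 450, 2019: 795, 2021: 1450, 2023: 1900}

years = sorted(estimates)
first, last = years[0], years[-1]

# Overall multiple, and the implied annual growth rate of the estimate.
multiple = estimates[last] / estimates[first]
annual_growth = (estimates[last] / estimates[first]) ** (1 / (last - first)) - 1

print(f"{multiple:.1f}x")        # 4.2x -- "more than four times"
print(f"{annual_growth:.0%}")    # 27% per year, compounded
```

At that compounded rate, each 18-to-24-month re-estimate lands roughly 40 to 60 percent above the last one, which matches the pattern the chronology shows.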

In September 2021, the Oregon Transportation Commission, shocked by the new cost figures, directed ODOT staff to come back with a new finance plan by December of 2021. As Willamette Week reported, OTC was hoping somebody else would ride to the rescue:

The OTC told ODOT staff to come back with a funding proposal by Dec. 1 that includes significantly more than the $500 million to $700 million available from the state. The commission directed ODOT to include specific information in the funding plan, including (1) an estimate of the amount of dedicated funding needed to build the project and (2) “a discussion of whether a viable plan to secure that dedicated funding from federal, state and/or the city of Portland, Metro, Multnomah County, TriMet and other organizations in Portland is reasonably likely to be authorized and appropriated by July 1, 2023.”

The department completely missed that deadline.  More than 18 months later, in May 2023, the staff showed up at a Commission meeting and asked for yet another year of delay to prepare a financial plan.  This project’s financial woes are not the product of the recently announced tolling postponement; they’re a long-standing dereliction of financial duty by ODOT.

Rose Quarter is now permanently behind an even more expensive Abernethy Bridge.

Now that the Abernethy Bridge has started construction, that project takes absolute priority over the Rose Quarter project. As ODOT Finance Director Travis Brouwer testified to the Oregon Transportation Commission on June 28, because the agency had started the I-205 Abernethy Bridge, that was “locked in.”

. . . we have started on the I-205 Abernethy bridge and so that is locked in . . .

OTC Vice Chair Lee Beyer confirmed that in his comments in the meeting:

. . . one of the fiscal realities is we have to move forward on Abernathy because we’re in the midst of construction we really don’t have an alternative there . . .

But the loss of funding was only part of the problem:  ODOT has badly botched the design of the Rose Quarter project, leading to an escalating series of cost overruns.  The project, which was estimated to cost $450 million in 2017, jumped to $795 million in 2018, to $1.45 billion in 2021, and now to $1.9 billion.

Rose Quarter’s Fatal Flaw:  A Too Wide Design

All of these cost increases are driven by ODOT’s decision to build a massively wider freeway.  The current roadway is about 82 feet wide; ODOT plans to double it to 160 feet (and in places 200 feet).  ODOT has gone to great lengths to conceal the actual dimensions of the freeway, and claims that it’s adding just one auxiliary lane in each direction.  The reality is that it is intent on building a roadway broad enough for ten travel lanes (up from four today).

ODOT’s own consultants, the internationally recognized engineering firm ARUP, explicitly said that ODOT was designing an excessively wide roadway, with shoulders in the covered section wider than in any city in North America.  It recommended reducing the width of the roadway by more than 40 feet.

The excessive width of the roadway is the biggest cost driver.  It necessitates huge and expensive columns and girders to carry local streets across the widened freeway.  And because the beams supporting the road (and proposed covers) have to be much taller than current beams, ODOT has to depress the roadbed of the freeway below its current level—excavating at great expense to assure adequate vertical clearance for the new road.

ODOT has attempted to package the I-5 Rose Quarter project as “restorative justice” for the damage a series of ODOT highway construction projects did to the Albina neighborhood from the 1950s to the 1970s.  Grafting elaborate (but still very constrained) covers onto its overly wide freeway is plainly uneconomical.


Testimony to the Oregon Transportation Commission

On June 28, 2023, City Observatory’s Joe Cortright testified to the Oregon Transportation Commission about the agency’s dire financial situation.

Background:  The Oregon Department of Transportation is pushing a multi-billion dollar freeway widening program in Portland, dubbed the “Urban Mobility Plan.” The agency has never fully identified how the plan would be paid for, and recent plans to put tolling on hold for two years prompted Governor Tina Kotek to direct a new look at project plans and agency finances.  ODOT has cancelled one project (Phase 2 of I-205), and effectively admitted it has no way to pay for another, the I-5 Rose Quarter project.  The new plan reveals yet more cost overruns on these already bloated projects, and in reality provides no explanation of how they might be funded.

For the record: Joe Cortright. I’m an economist with City Observatory and a member of No More Freeways. I’m commenting on the Urban Mobility Finance Plan developed by your staff. This isn’t really so much a plan as it is a belated and only partial admission of the deep-seated structural financial problems for which your staff has no serious solution. The “plan” that they are offering is a vague hope that more federal and state funds will magically appear for the projects in the Urban Mobility Plan.

The fiscal crisis that ODOT is now in was foreseeable and foreseen by anyone who took a serious look at the agency’s finances. Your revenue model and your expenditure processes are broken: the gas tax is already coming in below projections and is projected to decline further. Vehicle miles traveled, according to your own forecasts, are in permanent decline. State climate goals call for a 50 percent decline in gasoline sales, which will further reduce your revenue.

And we received notice earlier this month that the Highway Cost Allocation Study shows that because of ODOT spending patterns we’ll have to reimburse trucks and heavy over-the-road vehicles about $220 million per year. So your revenue situation is far worse than you’ve acknowledged.

In the face of this, the Urban Mobility Plan is confronting you with huge cost overruns. We’ve seen that the Rose Quarter Project’s price tag has now ballooned to $1.9 billion —more than four times the $450 million that the Legislature was told that this project would cost when it approved it in 2017.

Despite these cost overruns, there is not one word in this plan about right-sizing any of these projects, which are all over-built. I would note that then-Speaker now Governor Tina Kotek called for right-sizing these projects in 2021 when she voted against House Bill 3055 which authorized the commission to do additional borrowing.

Please take a close look at the scale of these projects, because your staff has concealed exactly how large they are. The reason they’re so expensive is that the Rose Quarter project is a 10-lane-wide freeway project and the Interstate Bridge Replacement is a 12-lane-wide freeway project. If these projects were right-sized they would be vastly less expensive.

Finally you’re counting on toll revenues to bail out your financial situation. As an economist I can tell you the effect of tolls will be to reduce traffic, which in many respects is a good thing. But by tolling these roadways to pay for them, you will essentially obviate the need for additional capacity. ODOT’s own studies of the Rose Quarter project show that implementation of Regional Mobility Pricing will be more effective in reducing congestion than the now $1.9 billion cost of widening the roadway through there.

In the past you’ll pursued a piecemeal approach to these projects. ODOT is in the midst of a serious financial crisis: the cost of these projects is exploding. It’s time to take a serious objective look —and I just have to say as somebody who’s been commenting on these projects for more than a couple of decades now— some engagement by your staff in a serious fashion, rather than just two minutes of enduring the comments that we make and then simply ignoring them would be much appreciated. We have technical expertise and would be happy to engage with your staff and assist the commission to deal with the gnarly financial problems that it faces.

Scratch one flat top!

Oregon freeway fighters chalk up a key victory—but the fight continues

On June 26, the Oregon Department of Transportation finally bowed to the reality that it simply lacks the funds to pay for a seven-mile long widening of I-205 just outside of Portland.

Predictably, ODOT conceded defeat in the most oblique possible terms; the I-205 project isn’t dead, it’s officially just “indefinitely postponed.”  This, in exactly the same way that the White Star Line could still describe the arrival of RMS Titanic as “indefinitely postponed.”

Opposition to the project was led by No More Freeways, a grassroots Portland group fighting billions of dollars of freeway widening projects being pushed by ODOT.  No More Freeways filed detailed objections and critiques of the project technical work in comments on its Environmental Assessment. In addition, NMF’s community members submitted over 300 comments in opposition to the I-205 expansion during the public comment period last spring, including technical comments pointing out the explicit violation of federal environmental protection law. 

ODOT’s proposed I-205 expansion was listed as one of the worst transportation projects in the country in USPIRG’s “Highway Boondoggles” report in 2022. 

In a prepared statement, No More Freeways co-founder Chris Smith said:

“No More Freeways is delighted to learn that the Oregon Department of Transportation proposes indefinitely postponing expansion of Interstate 205 even as the agency acknowledges they simply do not have a path forward to fund the now $1.9 billion Rose Quarter Freeway Expansion. 

These are both massive victories for any Oregonian who enjoys clean air, safer streets, a hospitable planet, and fiscal responsibility from their state government. Now more than ever, No More Freeways continues to insist that ODOT conduct a thorough Environmental Impact Statement on the proposed Rose Quarter Freeway Expansion that studies alternatives to expensive freeway expansion that reduce congestion while bringing clean air and justice to the Albina neighborhood.”

This decision saves Oregon taxpayers more than $400 million that would otherwise be spent on this highway widening project.

Scratch one flat top

In May of 1942, in the darkest days of World War II, American naval aviators struck the first blow against the previously unbeaten Japanese Navy at the Battle of the Coral Sea.  American dive-bombers, led by Lieutenant Commander Robert Dixon, attacked and sank the aircraft carrier Shoho; Dixon famously signaled “Scratch one flattop,” which subsequently became a rallying cry for Allied forces.

We can only hope that this first small victory will signal a turning of the tide in the battle against wasteful and counterproductive highway expansion projects.  Oregon DOT continues to maintain the fiction that its now-$1.9 billion Rose Quarter project is still alive, but it too, will have to yield to the fiscal reality that the highway department is essentially broke and doesn’t have the resources to maintain the roads it currently has, much less build enormously expensive new ones.

 

What Cincinnati’s Brent Spence Bridge can tell Portland

There’s plenty of time to fix the Interstate Bridge Project

Contrary to claims made by OregonDOT and WSDOT officials, the federal government allows considerable flexibility in funding and re-designing, especially shrinking costly and damaging highway widening projects

In Cincinnati, the $3.6 billion Brent Spence Bridge Project

  • Was downsized 40 percent without causing delays due to environmental reviews
  • Got $1.6 billion in Federal grants, with only about $250 million in state funding plus vague promises to pay more
  • Is still actively looking to re-design ramps and approaches to free up 30 acres of downtown land

For years, the managers of the Interstate Bridge Project have been telling local officials that if they so much as changed a single bit of the proposed IBR project, it would jeopardize funding and produce impossible delays.  Ask whether it’s possible to change the design, and they frown and gravely intone that “our federal partners” would be displeased, and would not allow even the most minor change.  It’s a calculated conversation stopper—and it’s just not true.

Across the country, in Cincinnati, local leaders have already gotten a commitment of $1.6 billion in federal funds—based on a modest down payment and vague commitments to pay more.  Collectively, Kentucky and Ohio still have to figure out where about $1.5 billion in state funding will come from.

The new bridge over the Ohio River could be one of these two designs: cable-stayed or tied arch. Ohio and Kentucky officials pictured these options in a July 2022 presentation about the Brent Spence Bridge Corridor Project.

Last year, in response to local government concerns, the two state DOTs reduced the size of the Brent Spence Bridge by 40 percent from the version that the Ohio and Kentucky DOTs pushed through the environmental review process.  And that change isn’t expected to affect its environmental approvals or timetable.

Not only that, but local governments–led by Cincinnati—are still actively pushing for a major re-design of the bridge’s on- and off-ramps to free up more than 30 acres of land in downtown Cincinnati for urban redevelopment—something that they believe can be done without jeopardizing the project.

This is an object lesson for Oregon and Washington:  The federal government doesn’t require all local funding to be in place before it makes its commitment; it’s possible to shrink a project even after it’s gotten its environmental approval; and it’s also possible, even after getting the federal funding, to make major changes to the project design to lessen its impact on urban spaces.

As Metro President David Bragdon observed, Oregon and Washington DOT officials routinely lie about federal requirements and deadlines to block local officials from designing better and more affordable highway projects.

Leadership at ODOT frequently told me things that were not true, bluffed about things they did not know, made all sorts of misleading claims, and routinely broke promises. They continually substituted PR and lobbying gambits in place of sound engineering, planning and financial acumen, treating absolutely everything as merely a challenge of spin rather than matters of dollars or physical reality.  . . . ODOT management has revived one of its favorite old falsehoods by claiming they are facing an “imminent federal deadline,” and that if local leaders don’t knuckle under to ODOT’s plan–and soon–the region will lose millions or tens of millions of dollars forever.  Creating fictional “federal deadlines” and other federal processes as an excuse for false urgency is a familiar ODOT tactic.

For too long, highway officials have gotten away with their best Jerry Lundegaard impressions, telling state and local officials that their hands are tied, because their manager (in another room) just won’t approve a better deal or waive the charge for the under-coating.  Cincinnati’s Brent Spence project shows the federal government will allow changes that make highway projects have fewer environmental impacts, become more affordable, and benefit local communities.

Honey:  I shrank the bridge

The original design for the Brent Spence Bridge was approved by the US Department of Transportation in a “Finding of No Significant Impact” (FONSI) in 2012.  As originally proposed, the bridge would have been nearly 150 feet wide.  Not only was this design over-sized (and expensive), but it had significant impacts on the City of Covington, Kentucky (the southern terminus of the new bridge).  The project languished without funding for more than a decade.

In 2022, the Kentucky and Ohio Departments of Transportation agreed to significantly reduce the width of the project.  The width of the double-decker bridge was reduced from 145.5 feet to 84 feet.

In June, 2022, the new, much more compact design was announced:

Based on engagement and technical analysis, the announcement said, the footprint of the new bridge has been significantly reduced from the alternative approved in 2012. . . . The new bridge was planned to cover nearly 25 acres and span nearly 150 feet in width. Revised plans show the new bridge at almost half the size of the 2012 footprint – covering approximately 14 acres and 84 feet in width. (emphasis added)

This was a major concession to local leaders:

Covington Mayor Joe Meyer, who led the negotiation for the City, called the agreement a monumental victory for Covington residents and businesses.  Meyer said it will “reduce the width of the driving companion bridge by over 40%. It’s a 61 and a half foot reduction in the driving width of that bridge. They’ve reduced the additional right of way that was necessary by 10 acres, another 40-plus percent reduction in right of way acquisition.”
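The Mayor’s figures check out against the published widths. Here’s a quick sanity check in plain Python, using only the numbers quoted in this article:

```python
# Check the quoted Brent Spence width reduction against the published
# figures: 145.5 feet (2012 design) vs. 84 feet (revised design).
original_width_ft = 145.5
revised_width_ft = 84.0

reduction_ft = original_width_ft - revised_width_ft
reduction_pct = 100 * reduction_ft / original_width_ft

print(reduction_ft)             # 61.5 -- Meyer's "61 and a half foot" reduction
print(round(reduction_pct, 1))  # 42.3 -- his "over 40%" reduction
```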

The Federal Highway Administration is being asked to “re-evaluate” its NEPA approval.  The Kentucky and Ohio transportation departments are preparing an updated Environmental Assessment, and FHWA is expected to issue a revised “FONSI” this Fall.  The key argument made by the state transportation departments is that the new, smaller design is within the “footprint” of the already approved 2012 design, and therefore can be expected to have fewer environmental impacts.

Here’s the lesson for Oregon and Washington:  Just because the decade-old plans for your bridge called for a massively wider freeway, nothing in the federal environmental review process precludes you from making the project smaller.  That won’t slow down the environmental review process, and it’s no obstacle to getting federal funding.  For the proposed Interstate Bridge Replacement project, this means that a right-sized bridge, coupled with retaining (rather than replacing) existing interchanges, would likely get FHWA approval.

The bridge is approved:  Now let’s re-design it

The current design for the Brent Spence Bridge is now 40 percent smaller than it was a year ago—but the re-design is not over.  Like the proposed Interstate Bridge Replacement Project, the Brent Spence “Corridor” project calls for an expensive set of on- and off-ramps to connect to the new bridge.  On the Cincinnati side of the river, this spaghetti of ramps and intersections would foreclose the urban use of more than 30 acres of prime real estate in the city’s downtown.  Rather than repeat the devastating mistakes of past freeway construction–which obliterated most of Cincinnati’s historically Black neighborhoods–local leaders are calling for a re-design of Brent Spence’s ramps and connections to restore urban use.

 

Keep in mind that President Biden announced the approval of federal funding in January of this year.  Right now, in May, 2023, the Cincinnati City Council is pushing forward with plans to re-design the project as it passes through the city. The current Cincinnati Mayor, Aftab Pureval, and two former Mayors, John Cranley and Mark Mallory, have all spoken out in favor of a fundamental re-design of the Brent Spence Bridge to dramatically shrink its complex of interchanges and off-ramps, and free up more than 30 acres of land that were devastated by freeway construction. They’re calling on the Ohio Department of Transportation, and US DOT Secretary Pete Buttigieg, to give them flexibility to re-design the project—something the city has done successfully with other highways in Cincinnati:

We also applaud Transportation Secretary Pete Buttigieg (whom we have both known for many years) for implementing new rules that reward designs that are urban friendly. The federal government now embraces the kind of progressive vision our city showed in redoing Fort Washington Way and the I-71/MLK interchange.

The progressive design build process that ODOT has rightly put in place requires that local input be an official part of selecting a contractor and finalizing with that contractor a design that meets local goals and ambitions. That process has only just begun and any suggestion that it is “too late” to make design improvements isn’t paying attention to recent changes ushered in by Secretary Buttigieg.

A local group, called Bridge Forward, has come up with a plan to reduce the footprint of the onramps, and trigger urban renovation:

The Bridge Forward Plan

This has a direct implication for the Interstate Bridge Replacement Project in Portland and Vancouver.  In their $7.5 billion project, ODOT and WSDOT are proposing to re-create, and rebuild at great expense, seven closely spaced freeway interchanges, which they—and the independent consultants they hired—have said are a fundamental cause of the highway congestion, and which account for a majority of the cost of the bloated project.

As Cincinnati’s experience shows, even after the bridge has been down-sized, and the federal money committed, there’s still the opportunity to get a more sensible, sensitive design.

The Ohio experience with the Brent Spence Bridge shows that, if local leaders are in agreement, we can shrink the size of the project to reduce its cost, and continue to explore designs that are less disruptive to the urban fabric without slowing down the federal funding, environmental approval, or construction contracting processes.

No money down:  The Feds contributed to the project in return for partial state funding and vague commitments, not hard cash

A key talking point of the Oregon and Washington DOTs is that Oregon has to put $1 billion on the table in order to apply for federal funds.  That’s clearly not the case with the Brent Spence Bridge.  Local television news station WKRC reported that President Biden committed $1.6 billion in federal funds toward the project’s total cost, estimated at $3.5 billion.  So far the only state commitments are a $250 million pledge from the Kentucky legislature and a vague statement from Ohio’s Governor that his state would “pay its share.”  That leaves more than $1.5 billion that the two states have yet to come up with, as WKRC reported:

That leaves another $1.5 billion in costs to be split between Ohio and Kentucky. The Kentucky General Assembly last year pledged $250 million toward the project, with Ohio Gov. Mike DeWine also promising his state would pay its share.

Ohio and Kentucky have gotten the federal government to commit its $1.6 billion from the bipartisan infrastructure law well before nailing down the local revenues for the project.  The lesson for Oregon and Washington is that they should instruct their state transportation departments to proceed to get the federal funding in place, without insisting on a full up-front payment from the states.  Knowing exactly how much the federal government will contribute will tell the states how big a hole they have to fill, rather than signing them up to pay whatever the project ends up costing.

Editor’s Note:  This commentary has been revised to correct errors in the summary (May 10).

 

Bus on shoulder: Stalking horse for freeway widening

ODOT isn’t giving buses the shoulder, it’s giving transit the finger.

IBR is planning a transitway for the new $7.5 billion interstate bridge that can’t be used by buses.

It’s sketching in a “bus on shoulder” option as an excuse to justify building an even wider highway crossing.

Meanwhile, it plans to place light rail tracks on raised concrete blocks, rather than embedding the rails in the road surface, which would allow the transit right of way to be used by both light rail trains and buses.

Direct Fixation:  No buses allowed

Each of the bridge’s carrying Portland’s light rail and street car lines across the Willamette River has flush-mounted, recessed rails, that allow both trains and rubber-tired vehicles to use the same travel lanes.  Even the original Interstate Bridge, built more than a century ago, had recessed rails for interurban trains.  Flush mounted rails allow buses and light trains to use the same lanes.

 

Yet, IBR’s plan is to use embedded LRT track at only at intersections and “direct-fixation track throughout the rest of the program improvements.”  As the Portland Mercury reported:

In their plans for the MAX light rail extension, IBR program leaders have indicated the train will travel across the bridge on direct fixation rails. Direct fixation rails are raised on blocks above the surface of a roadway, making it so non-rail vehicles can’t utilize the same road space. In comparison, the Broadway, Steel, and Tilikum Crossing bridges in Portland all have embedded railways, allowing for increased transit capacity on the same roadway.

Only where the light rail line crosses or intersects with a roadway will they use embedded tracks—so as not to inconvenience cars.

“Direct fixation” is techno-speak for rails raised up on blocks above the surface of the roadway.  In contrast to embedded or flush-mounted rails, the “direct fixation” rails create a roadway that can’t be navigated by non-rail vehicles, in this case, specifically, buses.  It’s marginally cheaper to do direct fixation, but it means that the roadway then can’t be used by buses (or emergency vehicles). A short section of Portland’s streetcar crosses over Interstate 84 on a bridge that has rails mounted in raised concrete blocks:

“Direct fixation”: mounting rails on raised concrete blocks

Every other light rail and streetcar bridge in Portland has surface-mounted rails that allow rail vehicles and rubber tire vehicles to use the same right of way.

The proposed IBR will have about one train every ten or fifteen minutes in each direction.  There’s no reason why buses can’t run on the same exclusive transit right of way as the light rail trains.  The bus-on-shoulder option requires buses to run in mixed traffic, and for all entering traffic to cross the bus-shoulder lane to reach the auto travel lanes.

This is a reprise of a tactic that the “Power Broker” used in the 1930s to block transit service to suburban parks and homes.  Just as Robert Moses intentionally designed the overpasses on Long Island’s parkways to be too low to be cleared by buses, this one design decision is designed specifically to thwart optimal use of this roadway for transit.

Bucolic sure, but built so no buses will ever travel here. Just like the transitway on the Interstate Bridge Replacement.

As Robert Caro wrote in his biography of Moses, a New York planner observed:

In practice, no practical bus operator would run his buses on any road on which the clearance at the curb wasn’t at least fourteen feet . . .  the old son of a gun had made sure that buses would never be able to use his goddamned parkways. (The Power Broker, page 952).

Bus-on-shoulder

IBR advertises “bus-on-shoulder” as a key part of its high capacity transit plans for the $7.5 billion project.

In reality, though, bus-on-shoulder is just an excuse to build an extra-wide bridge.  Once the bridge is built, ODOT and WSDOT can easily re-stripe it to include more travel lanes.  Elsewhere, ODOT and other highway agencies have argued that because they aren’t expanding the footprint of the roadway (or acquiring added right-of-way), adding a lane by painting new lines qualifies for a categorical exclusion from environmental review—so no one ever looks at the added driving and pollution caused by widening the roadway.

Bus-on-shoulder then becomes the justification for widening the roadway deck of the bridge, ostensibly to provide for “bus on shoulder” but in reality to rationalize building a roadway that, with a few hours’ work by a paint truck, can be re-striped to the full twelve-lane freeway ODOT and WSDOT have always wanted to build.

Instead of bus-on-shoulder, IBR could much more readily build “bus-on-shared-transitway” with embedded track, which would give buses an exclusive lane with no conflicts with road traffic.  On a double-decker bridge design, the transitway would be 35 feet lower than the highway deck (and its shoulder), meaning that buses would have less of a grade to climb and would perform better.  And the last new bridge completed in the Portland area, the Tilikum Crossing, has exactly this technology, which allows both buses and light rail vehicles to use the same right of way.

 

 

Why can’t ODOT tell the truth?

The Oregon Department of Transportation (ODOT) can’t tell the truth about the width of the proposed $7.5 billion Interstate Bridge Replacement Project.

ODOT is more than doubling the width of the bridge from its existing 77 feet to 164 feet.

The agency can’t even admit these simple facts, and instead produces intentionally misleading and out of scale drawings to make their proposed bridge look smaller.

If engineers can’t answer a simple question about how wide a structure they’ll build, why should anyone have any confidence in their ability to accurately estimate costs or revenues?  

Why is the width of this bridge, and its actual appearance a state secret?

It’s a simple question, really:  How wide is the $7.5 billion “Interstate Bridge Replacement” that Oregon DOT is trying to sell the Oregon Legislature?  Several members of the Joint Transportation Committee put that very question to ODOT leaders, and simply got a gibberish non-answer.

Oregon DOT’s lobbyist, Lindsay Baker, wrote a rambling response to the question, which alternately offered a long digression on the history of the existing bridges, answered a question the legislators didn’t ask (combining the over-water space covered by the bridges, including the space between them), and offered absurd and meaningless statistics (28 percent of structure area would be “dedicated” to transit). Baker’s response even included a couple of diagrams—which, as we will see, were purposely altered to conceal the actual width of the proposed bridge and make it look smaller. Instead, the chief ODOT talking point is that they are merely adding “one auxiliary lane” in each direction to the existing bridge footprint.

Nothing in Baker’s non-response reveals the actual measurements of either the existing or proposed structures.  Let’s cut to the chase, because these are, ODOT obfuscation notwithstanding, simple facts (the kind real engineers actually excel at).  The existing bridges have a combined roadway about 77 feet wide.  The proposed bridges would have a roadway that is 164 feet wide. ODOT proposes to more than double the width of the roadway across the river.  The existing bridges carry six lanes of traffic (three lanes in each direction).  The proposed structure is easily wide enough to carry twelve lanes—six in each direction.

Old Bridge:  77 Feet Wide; New Bridge 164 Feet Wide

How do we know this?  It takes a combination of a two-second Internet search (for the existing bridge) and a public records request and some algebra (for the proposed bridge). First, for the record, the existing bridges have roadway widths of 38 feet and 39 feet respectively, for a total roadway width of 77 feet.

It’s harder—much harder—to find the width of the structure ODOT is proposing.  In describing the width of the “locally preferred alternative” at the time it was approved by local governments, ODOT declined to say how wide the actual structure was; instead it cryptically reported that the LPA will be 16 feet narrower than the Columbia River Crossing proposed a decade ago.

So, in order to know the width of the IBR, you have to know the width of the CRC.  And the width of the CRC is effectively a state secret.  In its environmental impact statement of 2011, ODOT erased all the actual measurements showing how wide that bridge would be—it took a public records request to get the agency to disclose that it would be 180 feet wide.  Here’s an excerpt of the plans we obtained via public records request, showing the CRC had a minimum roadway width of more than 90 feet on the top decks of each of its two spans (other portions of the bridge are even wider, to accommodate a horizontal curve as the bridge crosses the river).

So the answer to the ODOT bridge width riddle is that the LPA is 164 feet wide:  180 feet (the width of the CRC) minus 16 feet equals 164 feet.  For the record, ODOT is planning two side-by-side, double-decker bridges with 82 foot top decks and 48 foot bottom decks.  That creates 164 feet of roadway on the top-deck of the two bridges.  In addition, there’s even more space on the bottom deck of these double decker bridges; the bottom decks are about 47 and a half feet wide, meaning that there’s a total of 95 feet of additional travel capacity on the two bottom decks.  ODOT’s plan is for the highway to be carried on the top decks of the two bridges, and for light rail to be located on the bottom deck of one bridge, and a bike/pedestrian path on the bottom deck of the second bridge.
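The algebra is simple enough to lay out explicitly; here is a minimal Python sketch using only the figures cited above:

```python
# Back out the width of the IBR "locally preferred alternative" (LPA)
# from the figures cited above.
crc_width_ft = 180     # CRC width, disclosed via public records request
narrower_by_ft = 16    # ODOT: the LPA is 16 feet narrower than the CRC

lpa_width_ft = crc_width_ft - narrower_by_ft
print(lpa_width_ft)    # 164

# Cross-check against the deck dimensions: two 82-foot top decks...
print(2 * 82)          # 164 -- matches

# ...plus two bottom decks of about 47.5 feet each.
print(2 * 47.5)        # 95.0 -- additional capacity on the lower decks
```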

Intentionally Misleading Images

In her letter to the Joint Transportation Committee, ODOT lobbyist Lindsay Baker waxed poetic about the width of the existing bridges, and included a couple of extremely misleading and not-to-scale drawings of the existing bridges and the proposed alternatives.  We’ll focus on the double-decker bridge alternative, which ODOT has characterized as the official “Locally Preferred Alternative” (LPA).  Keep in mind, the Interstate Bridge Program has spent tens of millions of dollars on engineering, and its predecessor spent $200 million on the nearly identical Columbia River Crossing; yet when asked to provide a drawing, ODOT offers up some “not-to-scale” cartoons to answer a simple quantitative question.

Here are the illustrations Baker provided.  Above is the existing bridge; below is the proposed bridge.


We’ve added one small annotation—a red bar showing the width of the wider of the two current bridges (39 feet).  We’ve copied that 39-foot measuring stick onto ODOT’s drawing.  It seems to show that ODOT is squeezing four lanes of traffic into the same space as the current bridge’s three lanes.  But of course that isn’t true:  ODOT has rendered the two bridge images at different scales.  The first clue is that the cars and trucks in the lower (IBR) drawing are much smaller than the cars and trucks in the upper (existing) drawing.  We printed out and measured their diagrams.  The top diagram is drawn at a scale of about 1:250 (about one inch equals 20 feet).  The bottom diagram is drawn at a scale of about 1:375 (one inch equals about 30 feet).  The scales are chosen explicitly to make the new bridge seem smaller than it really is.

We’ve corrected ODOT’s drawing by re-projecting their image at a comparable scale.  (This makes the trucks and cars roughly the same size in both drawings).  With this correction it’s now apparent that the ODOT plan is to more than double the width of the current roadway, from a combined 77 feet between the two existing bridges, to a total of 164 feet between the two proposed bridges.

More than doubling the width of the I-5 highway bridges: enough for 12 full lanes

We know that, at a minimum, ODOT’s plan is to increase the roadway width across the Columbia River from 77 feet to 164 feet–more than doubling the width of the roadway.  The new bridge is 164 feet wide.  How wide is that, exactly?  Well, it’s almost exactly as wide as a football field (160 feet).

A 164-foot-wide roadway can easily accommodate 12 travel lanes.  Standard travel lanes are 12 feet wide.  Twelve twelve-foot travel lanes would occupy 144 feet of the 164 feet of roadway that ODOT proposes for its bridge structure, leaving 20 feet for shoulders.  It is not uncommon on urban roadways, especially on bridges, to accommodate shoulders in this space:  ODOT’s plan would allow for 4-foot inside (left) shoulders on each crossing and 8-foot outside (right) shoulders.  For reference, as part of its $1.45 billion I-5 Rose Quarter project, ODOT is proposing 11-foot travel lanes and shoulders of between 3 and 6.5 feet on a viaduct section of the project near I-84.  There’s nothing illegal, unusual, or substandard about 11-foot lanes and somewhat narrower shoulders on urban roadways:  In fact, the Federal Highway Administration prominently praised ODOT for narrow lanes and narrow shoulders on Portland’s I-84 freeway.  Here is a page of the USDOT report, “Use of Narrow Lanes and Narrow Shoulders on Freeways: A Primer on Experiences, Current Practice, and Implementation Considerations,” FHWA-HOP-16-060.  The narrow shoulders on I-84 are also featured on the cover of the document.
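The lane arithmetic above can be spelled out in a few lines of Python, assuming standard 12-foot freeway lanes:

```python
# How much of a 164-foot roadway do twelve standard lanes consume?
roadway_ft = 164   # total top-deck roadway width across the two bridges
lane_ft = 12       # standard freeway travel lane width

lanes_total_ft = 12 * lane_ft
shoulders_ft = roadway_ft - lanes_total_ft

print(lanes_total_ft)  # 144 -- twelve 12-foot lanes
print(shoulders_ft)    # 20  -- left over for shoulders
```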

IBR:  A pattern of misleading, “not to scale” drawings

Lying with pictures is nothing new for the IBR project.  As we’ve noted before, despite spending tens of millions of dollars on planning, and more than $1.5 million to build an extremely detailed “digital twin” of the proposed bridge, IBR has never released any renderings showing what the bridge and its mile-long approaches will look like to human beings standing on the ground in Vancouver or on Hayden Island.  And the IBR has also released similar misleading and not-to-scale drawings that intentionally made the height and navigation clearance of the proposed bridge look smaller than they actually are.

ODOT’s not-to-scale image to make the IBR look smaller than the existing I-5 bridge

Hiding the actual width of the bridge they intend to build is a scene-for-scene remake of a false claim made for the preceding project—the failed Columbia River Crossing (CRC). In 2010, in response to objections from the City of Portland and Metro, ODOT and WSDOT announced they were reducing the size of the CRC bridge from 12 lanes to 10 lanes. But in reality, all they did was change the references in the project documents to that number of lanes, while literally erasing from the Final Environmental Impact Statement every single reference to the actual widths of the bridges and other structures they intended to build. A public records request showed the actual plans for the bridges — which were not published — were exactly the same size (180 feet in width) as they were for the 12-lane version of the bridge.

ODOT seems to be congenitally incapable of revealing the actual width of any of the major projects it is proposing.  As we’ve pointed out at City Observatory, it has gone to great lengths to conceal the width of the proposed I-5 Rose Quarter project, which as it crosses under the Broadway and Weidler interchanges in Portland will be 160 feet wide.  While the project’s Environmental Assessment pretended the project was 126 feet wide (again, based on cartoon “not to scale” images), secret ODOT documents confirmed that the agency has always been planning a 160-foot-wide roadway.

 

 

Reference

Here’s the full letter from ODOT’s Lindsay Baker to the Oregon Legislature’s Joint Transportation Committee.

https://olis.oregonlegislature.gov/liz/2023R1/Downloads/CommitteeMeetingDocument/271285

 

A blank check for the highway lobby: HB 2098-2

The HB 2098 “-2” amendments are perhaps the most fiscally irresponsible legislation ever to be considered by the Oregon Legislature.  They constitute an open-ended promise by the Oregon Legislature to pay however much money it costs to build the Interstate Bridge Replacement and Rose Quarter freeway widenings—projects that have experienced multi-billion-dollar cost overruns in the past few years, before even a single shovel of dirt has been turned.

HB 2098-2 amendments would:

  • Raid the Oregon General Fund of $1 billion for road projects
  • Give ODOT a blank check for billions of dollars of road spending
  • Allow unfettered ODOT borrowing that would preclude future Legislatures from changing these projects and force them to fund the debt
  • Eliminate protective sideboards enacted by the Legislature a decade ago
  • Enact a meaningless and unenforceable cap on project expenses.

Oregon’s transportation department is going broke:  Its major source of revenue, the gas tax, is in terminal decline, thanks to growing vehicle fuel efficiency and electrification.  The agency doesn’t even have enough money to maintain current roads, and has been cutting back on maintenance, and yet is set to embark on an unprecedented spending spree.

The “-2” Amendments will serve as a pretext for ODOT to borrow money to get each of these projects started, regardless of how much the projects will actually cost, and whether federal grants for these projects or toll revenues will cover even a fraction of their cost.

The bill does this because its authors know that if legislators were asked to come up with the money for these projects today, by raising gas taxes or other road user fees, there’d be no stomach (or votes). So instead, they’d simply let ODOT max out its credit cards, sign construction contracts, and come back to the 2025 Legislature with a giant bill that it will have to pay.

“If wishes were horses, beggars would ride”

The Legislature seems bound and determined to enact into law this old Scottish proverb.  Section 3 of the -2 amendments declares the Legislature’s “intent” to borrow $1 billion in General Obligation Bonds, to be repaid over the next couple of decades or more from the state General Fund.  Section 11 of the -2 amendments further declares the Legislature’s “intent” to appropriate whatever it ends up costing to build the I-5 Rose Quarter project, with no reference to a specific dollar amount or source of funds.

The -2 amendments to HB 2098 don’t contain an explicit appropriation of funds, or a new source of revenue, or even a specific authorization to issue new debt.  Instead, we have just vague indications of intent:

“The Legislative Assembly intends to support the Interstate 5 bridge replacement project through an investment of $1 billion . . ”

“The Legislative Assembly affirms its intent to fully fund the Interstate 5 Rose Quarter Project in the 2024 and 2025 regular sessions of the Legislative Assembly.”

It’s far from clear what legal meaning these statements of “intent” have.  But the authors of the -2 amendments are trying to have it both ways:  they are trying to appropriate money, without actually appropriating money.  They’re not actually taking the step to spend these funds (and say where the money will come from), but are trying to commit future Legislatures to making those difficult decisions.  It might seem that statements of intent (like legislative resolutions and memorials) are merely legislative window-dressing, with no legal weight.  But it’s clear that the Oregon Department of Transportation has other plans.

“Intent” plus debt:  Committing future Legislatures to pay billions

Superficially, HB 2098-2 might seem like an empty letter—the Legislature often makes sweeping, feel good statements of intent—but the danger with this one is that it could serve as the basis for the Oregon Department of Transportation to pull out its credit card and borrow hundreds of millions of dollars, based on the vague promise that some future Legislature will pay these bills.  And this is no idle speculation:  this is exactly what ODOT did with the I-205 Abernethy Bridge Project.

It’s worth spending a minute to review that project.  In 2017, the Oregon Legislature adopted a major transportation package, which provided $450 million for the I-5 Rose Quarter project (paid for with a $30 million per year increase in gasoline and weight mile taxes).  That package conspicuously did not provide funding for the Abernethy Bridge, but instead the Legislature directed ODOT to come up with a plan to use tolling to pay for I-205 improvements, and to report back with a “Cost to Complete” report that would tell how much this project would cost.  In 2018, the Cost to Complete report came in with a $250 million price tag for the Abernethy Bridge.  The I-205 project languished for a couple of years, and in 2021, ODOT persuaded the Legislature to adopt HB 3055, which made two significant changes.  HB 3055 authorized ODOT to dip into the $30 million per year fund designated for the Rose Quarter project to pay for I-205 (as well as the I-5 Boone Bridge), and also gave ODOT the authority to issue short-term bonds (the public sector equivalent of a payday loan).

In 2022, ODOT used the newly granted authority in HB 3055 to move forward with the Abernethy Bridge Project.  First, it told the FHWA that it could build the project entirely without toll financing—thus evading federal environmental review of tolling on the Abernethy Bridge.  Second, it took advantage of its short term borrowing authority and the HB 2017 Rose Quarter funding to start construction on the Abernethy Bridge, even though the price tag of the bridge had doubled to $500 million from the number it quoted the Legislature.  As a result of ODOT’s action, Oregon is now obligated to pay the full price of the Abernethy Bridge project, presumably through the HB 2017 $30 million appropriation and toll revenues.

It’s likely that the Abernethy Bridge project will use up all of the $30 million per year available from HB 2017, leaving little or nothing to pay for the I-5 Rose Quarter project, which meanwhile, has tripled in cost to as much as $1.45 billion—and which still faces major questions over its design.

A Blank Check for the Highway Lobby

Combining Oregon DOT’s short term borrowing authority from HB 3055 (its basically unfettered ability to get a payday loan of hundreds of millions of dollars), with a statement of “intent” that the Legislature will some day deliver whatever money is needed for the I-5 Interstate Bridge Replacement Project and the Rose Quarter freeway widening is likely all ODOT needs to get these projects started.  It will issue perhaps $500 million in such bonds, covering the initial interest and principal repayments from its current revenue and with the assumption that it will ultimately refinance the balance of the costs in balloon-mortgage fashion with the “intended” funding from some future Legislature.
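The balloon-style borrowing pattern described above can be sketched numerically. The principal, interest rate, and time horizon below are illustrative assumptions loosely based on the figures in this post, not ODOT’s actual borrowing terms:

```python
# Hypothetical sketch of the short-term borrowing pattern described above.
# The principal, rate, and horizon are illustrative assumptions only.
def balloon_carrying_cost(principal, rate, years_until_refinance):
    """Interest-only carrying cost on short-term bonds, with the full
    principal left as a balloon for a future Legislature to refinance."""
    annual_interest = principal * rate
    return annual_interest * years_until_refinance, principal

carried, balloon = balloon_carrying_cost(500e6, 0.04, 4)
# At an assumed 4 percent, $500 million in short-term bonds can be serviced
# for roughly $20 million a year -- small enough to cover from current
# revenues -- while the entire $500 million principal lands on a future
# Legislature.
```

The point of the sketch is that the near-term carrying cost is tiny relative to the balloon, which is what makes the “drive stakes and sell bonds” strategy work.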

And when these blank checks are filled in, the numbers will be very large.  The Interstate Bridge Replacement Project’s estimated cost has risen from a supposed maximum of $4.8 billion in 2020, to a new maximum of $7.5 billion today.  Similarly, the cost of the I-5 Rose Quarter project was sold to the 2017 Legislature as $450 million.  The latest estimate now runs to $1.45 billion–and that figure is already out of date.  And these are just preliminary, pre-construction estimates; if past experience is any guide, both of these projects will end up costing significantly more once actual construction begins.

Once started, both the IBR and the Rose Quarter projects are designed in such a way that it may be impossible or prohibitively expensive to reduce their scope.  The IBR is planned as a fixed, high-level crossing that will necessitate lengthy elevated viaducts and the rebuilding of freeway interchanges (which constitute a majority of project costs).  Once the bridge is started to that design, it will be difficult to reduce its cost.  Similarly with the Rose Quarter project, where its 160 foot width dictates excavation costs and drives up the cost of proposed covers.  If ODOT starts these projects, the state will be stuck with bloated, over-sized projects it can’t change.  And that, as we have long said, is the point:  This is the classic Robert Moses strategy of “driving stakes and selling bonds” and putting the Legislature in a position where it has no ability to control what the highway building agency does.  That was tragic and stupid when Moses first did it in New York in the 1930s; it is even more tragic and stupid today, when we know with a certainty that highway widening doesn’t reduce congestion, that it destroys the fabric of urban neighborhoods, and worsens air pollution and climate destruction.

Eliminating the Sideboards

In legislative parlance, “sideboards” are conditions or limits included in legislation to prevent bad things from happening.  In 2013, the Oregon Legislature was considering spending $450 million for the I-5 bridge project, and after lengthy debate, it approved a series of such sideboards:  limiting the cost of the project (more about that in a minute), and prohibiting the state treasurer from issuing any bonds for the project until Washington had contributed its share of the project’s costs, the federal contribution to the project was clearly committed, an independent financial plan for the project had been prepared, and the state had conducted an “investment grade” analysis of possible toll revenues.  All of those provisions are still codified in Oregon law (Chapter 4, Oregon Laws 2013).

And the “-2” amendments eliminate every one of those sideboards, without acknowledgment.  Even the amendments’ “Staff Measure Summary,” which is meant to disclose to legislators the impact of the bill, only cryptically and opaquely says:

“Repeals sections of House Bill 2800 (2013).”

Project Cost “Cap”–a legal limit from “Camelot”

We already know that a project cost cap is meaningless and unenforceable.  We already have such a cap!  It was enacted into law a decade ago and officially limited the total cost of the IBR to not more than $3.4 billion (2013 Oregon Laws, Chapter 4, Enrolled House Bill 2800; the repealed language is reproduced in full below).

Conveniently, the “-2” amendments to HB 2098, without any fanfare, simply repeal this limit.  In its place, is an entirely new limit, which is worded identically–except of course that now the cost is more than twice as much.

As the Oregonian‘s “Politifact” reporters noted when they looked at the original so-called “cost cap” provisions for the Columbia River Crossing adopted by the Oregon and Washington Legislatures a decade ago, the caps are meaningless and unenforceable.

. . . if legislators greenlight the CRC, the state could ultimately owe more than $450 million on its share of the bridge. But setting a cap on the project or limiting Oregon’s share with legislative riders won’t stop that. And thanks to the agreement between Oregon and Washington to pay for the bridge jointly, if Oregon ever needs to pay more, Washington would need to join in.

PolitiFact Oregon doesn’t do prophecy. We can’t say whether the bridge will be over budget — as much as history might tempt us to offer a guess.

What we can say is that the Washington toll rule won’t matter. The Washington Legislature’s cap won’t matter.

The Legislature has no more ability to prescribe the cost of this project by edict than it has to regulate temperature or rainfall.  Yet the authors of the “-2” amendments are simply performing a refrain from Camelot:

It’s true! It’s true! The crown has made it clear.
The climate must be perfect all the year.
A law was made a distant moon ago here:
July and August cannot be too hot.
And there’s a legal limit to the snow here
In Camelot.
The winter is forbidden till December
And exits March the second on the dot.
By order, summer lingers through September
In Camelot.

Crossing the Rubicon:  Raiding the General Fund for Road Projects

For the better part of a century, Oregon has prided itself on its “user-pays” transportation finance system.  Oregon was the first state to adopt a gasoline tax to pay for roads, and has observed a long tradition of having a “State Highway Fund” that is strictly segregated from other tax revenues and dedicated exclusively to paying for roads.  For the first time, the -2 amendments to HB 2098 would raid the General Fund to the tune of $1 billion to pay for a road project–which we’ve pointed out at City Observatory chiefly benefits residents of Washington State, as 80 percent of daily commuters and two-thirds of all bridge users live across the border in Washington.

Repealed Sideboards from HB 2800.

Here’s the language that would be repealed, featuring the provisions that weren’t disclosed in the text of the “-2” amendments or the Staff Measure Summary.

SECTION 3. (1) As used in this section, “Interstate 5 bridge replacement project” means the project described in section 2 of this 2013 Act.

(2) The total cost of the Interstate 5 bridge replacement project may not exceed $3.413 billion after the effective date of this 2013 Act.

(3) For the purpose of financing the Interstate 5 bridge replacement project, the State Treasurer may not have outstanding, at any one time, bonds in an amount exceeding $450 million of net proceeds, plus an amount determined by the State Treasurer to pay estimated bond related costs of issuance, for the purpose of funding Oregon’s share of the aggregated contribution to the project from Oregon and the State of Washington as described in the Final Environmental Impact Statement submitted to the United States Government for the project. It is the intent of the Legislative Assembly that moneys from the United States Government or toll revenues be used to directly fund the project, be used to repay other borrowings for the project or be pledged alone or with other security to lower the costs of other borrowings for the project

(4) The Department of Transportation may not request and the State Treasurer may not issue any bond to finance the Interstate 5 bridge replacement project unless:

(a) No later than September 30, 2013, the State of Washington has appropriated, authorized or committed sufficient funds to:

(A) Satisfy the United States Department of Transportation requirement for a proposed full funding grant agreement application; and

(B) Meet the requirements of the finance section included in the project’s Final Environmental Impact Statement published on September 11, 2011, and endorsed by the Federal Transit Administration and the Federal Highway Administration in the record of decision dated December 7, 2011;

(b) The United States Department of Transportation has submitted a full funding grant agreement application, in an amount of at least $850 million of Federal Transit Administration funds, for congressional review;

(c) The State Treasurer has participated in and approved the findings of an investment grade analysis of toll revenues associated with the project’s application for a loan from the Federal Highway Administration’s Transportation Infrastructure Finance and Innovation Act program, and provided for ongoing financial analysis of the project;

(d) The State Treasurer has reviewed and approved a comprehensive financing plan for the project, after making written findings that there are sources of funds committed by contract or law or otherwise obligated that are reasonably expected to be available and that will provide sufficient cash flows to pay the estimated costs of the initial phase of the project described in the full funding grant agreement without revenues from borrowings in addition to those described in subsection (3) of this section; and

(e) The United States Coast Guard has issued a general bridge permit for the main channel of the Columbia River for the project.

 

Proposed Amendments to HB 2098-2

If the authors of the “-2” amendments were being candid, here’s what their amendments should actually say:

  • This act shall be known as the Blank Check, Pass-the-Buck, Cost-overrun, Send the Bill to our Kids Act of 2023.
  • The Legislature finds and declares that it doesn’t have the guts to pay for any of the billions of freeway widening projects ODOT is pursuing, and that it is unwilling to raise gas taxes to pay for them.
  • The Legislature intends that ODOT borrow billions of dollars based on vague “intentions” that the Legislature will miraculously find the will and the money to pay for these projects two or four or six years from now, and that ODOT should go ahead and borrow the money to get these projects started so that the Legislature will have no choice but to raise money someday in the future.
  • The Legislature intends that it will spend billions of dollars today to widen freeways that will increase car dependence and greenhouse gas emissions, and send the bill to future generations of Oregonians, who will also have to deal with the increasingly devastating effects of climate change.
  • The Legislature finds and declares that it is powerless to limit ODOT cost overruns; that even though it approved the I-5 Rose Quarter project at a cost of $450 million in 2017, the project’s cost has now tripled to as much as $1.45 billion; and that it will simply sign a blank check to ODOT for whatever amount it decides to spend on this project.
  • The Legislature finds and declares that the reasonable and prudent “sideboards” adopted by the Legislature a decade ago, when the state’s expected contribution to the IBR project was only $450 million, should be eliminated.

 

IBR’s plan to sabotage the moveable span option

IBR officials are planning to sabotage the analysis of a moveable span option as part of the Interstate Bridge Project.

The Coast Guard has said a replacement for the existing I-5 bridges would need a 178 foot navigation clearance.  The highway departments want a 116′ clearance fixed span.

The Oregon and Washington DOTs say they are going to study a “moveable span” as a “design option” but are plainly aiming to produce a costly design that just grafts a lift-span on to their current bridge design.

A moveable span would enable a lower crossing, eliminate the need for lengthy viaducts, and reduce construction costs—but ODOT is refusing to design an option that takes advantage of these features.

And the DOTs have completely ignored an immersed tube tunnel option, implying that the Coast Guard directed them to study the moveable span (which it didn’t).

IBR staff have signaled they have no intention of seriously considering the moveable span, and are engaged in malicious compliance.

Our story so far:  Oregon and Washington highway departments have proposed a new, fixed span highway bridge over the Columbia River between Portland and Vancouver as part of their massive $7.5 billion I-5 freeway widening project.  The bridge would have a 116 foot clearance over the river, but that’s not enough to satisfy the Coast Guard, which regulates bridge heights and says a 178 foot navigation clearance is needed.

IBR simply chose to ignore the Coast Guard’s determination, and decided to move ahead with only the 116 foot clearance fixed span design.

The Coast Guard objected, saying this violated the terms of a 2014 memorandum of agreement between USDOT and the USCG.  (Ironically the MOA was created in the wake of the highway agency’s efforts to subvert and undercut Coast Guard review of the Columbia River Crossing, the previous iteration of this project).

Coast Guard officials wrote the FHWA and FTA to insist that they include an alternative in the project’s supplemental environmental impact statement that complies with the 178 foot height requirement.  The Coast Guard warned that the IBR should not proceed with an environmental impact statement that omitted a 178 foot clearance option:  “Including only one alternative in the Supplemental Environmental Impact Statement (SEIS) introduces risk that no permittable alternative will be evaluated in the SEIS.”

Importantly, USCG did not specify whether this should be a moveable span or a tunnel.

In response, IBR said it would look at a moveable span as a “design option” for the IBR.  That may sound like an “alternative,” but in fact when it comes to complying with environmental review requirements, it plainly is not.  A “design option” means that IBR will build exactly the same bridge it would build if it were a 116 foot fixed span, but they’d simply graft a moveable span (either a lift span or a bascule bridge) onto that very tall structure.  The IBR plan will likely look something very much like this:

“High Bascule”. — Bascule bridge grafted on to IBR’s 116 foot clearance fixed span

A camel is a horse designed by a devious highway engineer

Simply adding a moveable span to a high-level fixed span design eliminates the key design and cost advantage of the moveable span.  Because the moveable span allows tall vessels to pass through a very high opening (178′ in the case of a lift span, or unlimited height in the case of a bascule), there’s no reason why the remaining fixed portions of the bridge need to be nearly as high as the IBR’s current 116′ design.  The bridge can be built at a much lower level.  Conceptually, a bascule bridge would allow a much lower and shorter bridge structure, roughly like this:

“Low Bascule”. — Bascule bridge at profile of current I-5 bridges

That’s hugely important because the bridge can be much cheaper:  The current high IBR design requires half-mile long elevated viaducts on both the north and south ends of the bridge in order to get the I-5 roadway from ground level in Vancouver up to the 150-foot height of the bridge roadway (the road level of the bridge is about 35 to 40 feet above the bottom of the double-deck bridge structure).  Lowering the height of the bridge makes it much cheaper to build; it also eliminates the need to rebuild interchanges north and south of the river to reach up to the new higher bridge.  In addition, the lift span will have different, and mostly smaller, environmental impacts.  Because it will be less tall, it will be less steep, meaning trucks can get over it without slowing (slow trucks are a hazard to other traffic), and all vehicles will burn less fuel (and create less pollution) on a shorter, less steep bridge.

It’s clear, however, that IBR officials have no intention of looking at using the lift span to reduce costs or minimize environmental impacts.  Greg Johnson, the IBR administrator, has plainly signaled his intent to sabotage the moveable span design.  It is highly likely that the program will specify a moveable span that is impractical and excessively expensive.  Johnson telegraphed as much in his comments to the Columbian:

The “movable span” option, which came at the request of the Coast Guard and federal government, will be explored in addition to the program’s original plan of a fixed-span bridge with 116 feet of vertical clearance.
The program will study both a lift span like the current Interstate 5 Bridge and a bascule bridge like the Burnside Bridge in Portland.
Program Administrator Greg Johnson said he believes a fixed-span bridge will ultimately end up spanning the Columbia.
He said a movable span would likely cost $500 million more than a fixed-span bridge and noted that the Columbia River Crossing project received a record of decision from the Federal Highway Administration and Federal Transportation Agency for a fixed-span bridge with the lower river clearance.
“I would be totally shocked if we can’t get to a fixed-span,” Johnson said.

(emphasis added)

The missing tunnel option

Press accounts, fueled by IBR statements, create the false impression that it was the Coast Guard that insisted on the inclusion of a moveable span option.  Oregon Public Broadcasting reported:

Planners in charge of the new, multibillion-dollar overhaul have recently been told by federal regulators they must include plans for “moveable span” on the bridge. Greg Johnson, who is leading the team of planners, said federal regulators made the order in late February.

The Vancouver Columbian reported:

The “movable span” option, which came at the request of the Coast Guard and federal government, will be explored in addition to the program’s original plan of a fixed-span bridge with 116 feet of vertical clearance.

In fact, the Coast Guard made no recommendation as to the kind of option that the project should study.  Either a moveable span or a tunnel under the river could satisfy the Coast Guard’s 178 foot height requirement.  Here’s what the Coast Guard letter, from Rear Admiral M. W. Bouboulis (not included in any press accounts) actually says:

I recommend that the Notice to Supplement clearly state the alternatives to be evaluated in the SEIS to include the no build alternative, the locally preferred alternative (116-foot vertical clearance), and an alternative that meets the 178-foot vertical clearance established in the PNCD. This will ensure that an alternative that meets the initially identified needs of navigation is evaluated in the SEIS and could be adopted by the Coast Guard.

(emphasis added)

This wasn’t the Coast Guard asking for something new in February, 2023–it was actually the Coast Guard repeating pretty much exactly what it asked for in its Preliminary Navigation Clearance Decision in June of 2022.  The Coast Guard made it clear that a 116 foot bridge interfered with river navigation:

Our PNCD concluded that the current proposed bridge with 116 feet VNC [vertical navigation clearance], as depicted in the NOPN [Navigation Only Public Notice], would create an unreasonable obstruction to navigation for vessels with a VNC greater than 116 feet and in fact would completely obstruct navigation for such vessels for the service life of the bridge which is approximately 100 years or longer.

B.J. Harris, US Coast Guard, to FHWA, June 17, 2022, emphasis added.

In response to the Bouboulis letter,  IBR (through the FHWA and FTA) replied that it would study a moveable span.  This was IBR’s decision, not Coast Guard’s decision.

What ends up on the cutting room floor, here, is the possibility of an immersed tube tunnel, a technology that is widely used around the world, and which would provide unlimited vertical (and horizontal) navigation clearance.  The immersed tunnel would also remove the visual blight and noise pollution from downtown Vancouver and its rapidly redeveloping waterfront.  To hear the IBR tell it, the reason the immersed tube tunnel isn’t being considered is because the Coast Guard directed them to study a moveable span.  That’s simply untrue.  In its June 2022 preliminary determination of navigation clearance, the Coast Guard specifically identified the tunnel option as one way to comply with its navigation requirements.  It is IBR, not the Coast Guard, that is declining to take a hard look at the immersed tube tunnel.  This seems likely to be a violation of the National Environmental Policy Act, because the immersed tube tunnel would have very different (and much reduced) environmental impacts than the bridge options.

A “Design Option” not an “Alternative”

There’s one other seemingly minor wrinkle in the IBR’s latest gambit.  They’re talking about including the moveable span as a “design option.”  While that might sound like an “alternative” to the layman, it actually has important legal and practical implications.  “Design option” means they’ll look at the moveable span not as a full-fledged separate alternative, but rather as simply one feature grafted on to the existing IBR design.  As noted above, this means we’ll get something that looks almost exactly like the IBR 116′ clearance bridge with a bascule or lift-span “cut and pasted” on it.
The reason for calling it a “design option” rather than an alternative is to escape a requirement that the highway departments fully evaluate the environmental and other impacts of the moveable span design.  A moveable span would be expected to have very different cost, traffic, and environmental impacts than IBR’s proposed high fixed span.  Under the National Environmental Policy Act (NEPA) the two state highway departments should fully flesh out this alternative, and evaluate those differing impacts.  Treating the moveable span as a design option is a transparent ruse to avoid NEPA scrutiny.  This could turn out to be a fatal legal error by the project:  NEPA is clear that sponsoring agencies have to give a “hard look” to reasonable alternatives, something this “design option” approach is designed to avoid.

Coast Guard Letter, February 8, 2023

USCG_IBR_8feb2023

The Color of Money: Bailing out highways with flexible federal funds

ODOT grabs a billion dollars that could be used for bikes, pedestrians and transit, and allocates it to pay highway bills.

Oregon highways are out of compliance with the Americans with Disabilities Act, and the cost of fixing them can–and should–be paid for out of the State Highway Fund. But instead, ODOT plans to take more than a billion dollars in future federal grant money over the next decade or more, and use it to pay off this highway liability.

What this strategy does is to take money that could be used for a wide variety of different transportation needs and use it only to bail out the State Highway Fund.

By taking this liability out of the State Highway Fund, ODOT can then claim it has plenty of funds for highway expansions. This shell game uses the ADA liability as cover to use flexible federal funds, in essence, to build more highway capacity.

Oregon’s constitution contains a retrograde provision that has been interpreted to require that moneys from gas taxes be used only to build roads, based on a fallacious argument that we have a “user pays” transportation system. The state highway department, ODOT, routinely invokes that constitutional argument when asked by the public to spend more on things like transit, pedestrian improvements or cycling. We can’t, they say, because that money can’t be used for anything other than building roads. As a result, the truly innovative and “multi-modal” uses of funds in Oregon have been paid for disproportionately from federal funds, which are much more flexible. Not only does the Oregon constitutional limit not apply to federal funds, but federal law explicitly allows states to transfer funds among a variety of different categories. You’d think that flexible federal funds would be a key way to diversify our transportation portfolio. But ODOT has hit on a new gimmick to grab these federal funds and use them to bail out the struggling State Highway Fund.

It’s a complex story, and it involves changing the “color of money.”

For decades, the Americans with Disabilities Act has required private businesses and public agencies to provide accommodations for persons with disabilities. For nearly all of that time, the Oregon Department of Transportation has largely flouted that requirement, seldom providing sidewalks and ramps on state highways. As a result, disability advocates hauled them into court, and in 2020 reached a billion dollar settlement, in which ODOT agreed to make the necessary investments to bring highways into compliance with this long-established federal law.

Let’s just talk for a moment about what people in transportation finance call “the color of money.” You may think that all money is green, but in the transportation world, there are different kinds of money, with different strings attached. Funds raised by the state, for example, from the gasoline tax, are governed according to the dictates of state law, and importantly, constitutional restrictions.

ODOT loves to tell advocates it would gladly do more to help promote transit, but its hands are tied by the state constitution: It simply has no choice but to spend these dollars in the roadway.

There’s another color of money in the transportation world, though, “federal money.” Federal money is not governed by the state constitutional restrictions on state taxes. Federal money can be used for a wide variety of purposes, and the federal law gives the state wide flexibility to reallocate money among different categories. It doesn’t all have to be spent on highways.

That’s why, when it comes to how we should use federal funds, there’s a lot more debate. In 2022, the Oregon Transportation Commission had a lengthy debate about how to allocate more than $400 million in federal funds coming to the state under the IIJA. Transportation advocates around the state came up with an alternate scenario to allocate about $130 million to local transportation projects. The OTC largely rejected this path.

Transportation Commission makes final decision on $412 million in federal funds

When it comes to getting a different allocation of these highly flexible federal funds, advocates are largely fighting for crumbs—and getting very little.

And ODOT is largely pre-empting any future option to use these funds differently by proposing to use them to repay a huge pile of debt. Once these federal grants are pledged to debt repayment, it will be impossible to use them for other purposes.

So let’s go back to ODOT’s ADA settlement: Under the terms of the deal, ODOT needs to bring its highways into compliance with the Americans with Disabilities Act by spending $1 billion. To be clear, this is a cost of the highway system—these ODOT roads don’t comply with ADA requirements. This spending is plainly a liability and a responsibility of the highway system. There’s no question that it can constitutionally be paid for with state gas tax revenues. But instead of paying for this cost with those monies, ODOT is planning to pay it by diverting flexible federal funding for the next decade, to the tune of a billion dollars. It will issue $600 million in “GARVEE” bonds (grant anticipation revenue bonds), and then use future federal funds to pay the debt service.

In essence, this reduces the amount of money potentially available for alternative transportation investments, unfettered by the state constitutional limits, by a billion dollars. It amounts to tying up a big chunk of potential revenue.

And what’s worse, ODOT is planning to bond against these federal revenues, spending the money now, and paying it back, with interest, over the next decade or more. So that means a substantial portion of those federal revenues are spent on interest payments, rather than on transportation projects.  At 5 percent interest, $600 million in bonds paid back over 15 years would mean that the state would pay about a quarter of a billion dollars in interest.  In the end, the State Highway Fund would be bailed out by more than $1 billion, and there would be that much less flexible federal funding for other projects around the state.
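
The quarter-billion-dollar interest figure is easy to check with a standard level-payment amortization. Here’s a minimal sketch; the 5 percent rate and 15-year term are the article’s illustrative assumptions, not actual bond terms:

```python
# Rough check on the interest cost of ODOT's proposed GARVEE borrowing.
# Assumes level annual debt service on $600M at 5% over 15 years --
# the article's illustrative figures, not actual bond terms.
principal = 600_000_000
rate = 0.05
years = 15

# Standard level-payment annuity formula: payment = P * r / (1 - (1+r)^-n)
payment = principal * rate / (1 - (1 + rate) ** -years)
total_paid = payment * years
interest = total_paid - principal

print(f"Annual debt service: ${payment / 1e6:.1f}M")
print(f"Total interest over {years} years: ${interest / 1e6:.0f}M")
```

Under these assumptions, annual debt service runs about $58 million, and total interest comes to roughly $267 million—consistent with the “about a quarter of a billion dollars” figure above.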

As we’ve said, when it comes to transportation finance, ODOT is the master of three-card monte: Its ability to move dollars among categories–to change the color of money–systematically advantages its policy priorities (chiefly building more and wider highways) and leaves advocates of other policies fighting over crumbs.

By using flexible federal funds to pay the costs of the state highway system—plus a hefty pile of interest—ODOT is foreclosing the possibility that future decision-makers will have any ability to use these funds for alternatives in the future. It’s literally the priorities of the past dictating the choices of the future. If ODOT paid its ADA liability out of the State Highway Fund, as it legally can, and arguably should, it would have even less money to spend on road expansion projects.

Houston’s I-45: Civil rights or repeated wrongs?

Editor’s Note:  For the past two years, the Federal Highway Administration has been investigating a civil rights complaint brought against the proposed I-45 freeway expansion project in Houston.  This week, FHWA and TxDOT signed an agreement to resolve this complaint.

Urban freeways have been engines of segregation and neighborhood destruction for decades, a fact that even highway builders are now acknowledging.  You might think that civil rights laws might provide some protection against a repetition of the devastating consequences of such projects, but this case shows that federal and state highway builders aren’t about to make any serious changes to their plans to either right historical wrongs or avoid making them worse.  Officials will give lip service to the egregiousness of past mistakes, but then blithely repeat them.

Kevin DeGood of the Center for American Progress has taken a close look at the agreement, and published his analysis as a series of tweets.  His analysis deserves a wider audience, and with his permission, we’re repeating it here.  The Twitter original is available here.

1/ FHWA and TxDOT have signed a Voluntary Resolution Agreement (VRA), which allows TxDOT to build the massive $10 BILLION I-45 North Houston Highway Improvement Project (NHHIP). It’s not good. Let’s take a look.

2/ Numerous complaints against the NHHIP project argued the design violates Title VI of the Civil Rights Act of 1964. Why? Because the expansion will cause massive displacements — especially of low-income residents in communities of color:

3/ The disparate impacts from the NHHIP are not limited to affordable housing loss. The Air Alliance complaint states the project would degrade air quality in “predominantly lower-income communities of color…” (Note: MSATs are mobile source air toxics)

4/ I-45 will also supercharge sprawl. The transportation improvement program (TIP) for 2021-2024 shows how the exurban growth machine leverages highway expansions like I-45. This $386M for highway widening in Montgomery County is a small sample of the regional total.

5/ This is FHWA’s quick summary of the Voluntary Resolution Agreement (VRA). Let’s dig in a little, starting with “Highway ‘Footprint’ Reduction.” Wow, that sounds promising. But what has TxDOT actually promised to do…?

6/ Short answer: Nothing. TxDOT has only committed to…”evaluating reasonable opportunities to reduce the project footprint…” but ONLY if they would “not compromise the integrity and functionality of the purpose and need…” Welp.

7/ Ok, what about “Mitigating displacements”? Since TxDOT isn’t reducing the project footprint, the displacement totals will not change. Instead, TxDOT will provide $30M for affordable housing. But TxDOT already agreed to $27M in the ROD [Record of Decision]! The $3M is an…11% increase.

8/ Ok, what about “Air Quality Mitigation“? TxDOT has agreed to install air quality monitors. Be still my heart. Billions of additional VMT will produce more emissions and PM 2.5, and residents will get a few monitors and some data buried on TxDOT’s website.

9/ Ok, what about “Structural Highway Caps“? Again, FHWA is bragging about something that TxDOT already agreed to. TxDOT is set to build 4 caps. But, it’s up to a third party to “fund the design, construction, operations and maintenance of amenities…” Piece of cake!


10/ The NHHIP will result in:
– More VMT, GHG emissions, & auto dependence
– Worse air quality
– Huge residential displacements
– Huge business displacements
– More sprawl
– More wetlands loss
The VRA has not meaningfully changed the project design or its negative impacts.

Why does a $500 million bridge replacement cost $7.5 billion?

The “bridge replacement” part of the Interstate Bridge Replacement only costs $500 million, according to new project documents

So why is the overall project budget $7.5 billion?

Short answer:  This is really a massive freeway-widening project, spanning five miles and seven interchanges, not a “bridge replacement”

Longer (and taller) answer:  The plan to build half-mile long elevated viaducts on both sides of the river, and the need to have interchanges raised high into the air make the project vastly more complex and expensive.

In November of 2022, the Interstate Bridge Replacement team (a collaboration of the Oregon and Washington highway departments), released a document called the “River Crossing Option Comparison” sketching out the advantages and disadvantages of several different alternatives crossing the Columbia River.  The alternatives examined included tunnels under the river, and a series of bridge designs—two different moveable span bridges, and two fixed spans, a high-level and a mid-level (116-foot clearance) crossing.

Here’s the bottom line of the report—buried away on page 50 of a 68-page PDF file—the IBR’s preferred design, a mid-level fixed span, is supposed to cost $500 million.

That’s a fascinating number, because in December, the IBR team released another document, a long-awaited financial plan describing the total cost of the project.  It told a joint committee of legislators from Oregon and Washington that the project’s budget had increased from a maximum of $4.8 billion (estimated in 2020) to a new “maximum” of $7.5 billion (although the two agencies still maintain that they’re trying to bring it in for a mere $6 billion).

All this raises a fascinating question:  Why does this project cost $7.5 billion when the price tag for actually replacing the bridge is only $500 million?

Most of the project cost is highway widening, not the bridge

More recently, the project has offered a few additional details, summarized in the graphic below.  As we’ve noted at City Observatory, the name “bridge replacement project” is clearly misleading.  The IBR is really a five-mile long freeway widening project that requires rebuilding seven closely spaced interchanges.  According to the IBR, the cost of the four major segments of the project is about $1 billion to $1.5 billion each for the Oregon and Washington interchanges and highway widenings (segments A and D), about $1.3 billion to $2 billion for the transit portion of the project, and about $1.6 billion to $2.5 billion for the bridge and approaches (segment C).

At between $2 and $3 billion, it’s clear that the interchange rebuilding and roadway widening is more expensive than the river crossing. And an earlier expert review of the Columbia River Crossing version of this same project, commissioned by the two state highway departments at the behest of the then-governors, recommended strongly that the project eliminate one or more interchanges, to save cost, improve safety and performance, and enable a better bridge design.  By rebuilding these too-closely-spaced interchanges, the panel warned, the DOTs were repeating–at enormous cost–a decades-old design error.

A high bridge requires long, steep approaches

The IBR budget breakdown unhelpfully combines the cost of the “bridge” and its “approaches.”  As this illustration shows, what IBR calls the combined “bridge and approaches”—shown in red—extend for about half a mile on either side of the river:  to Evergreen Boulevard (more than half a mile north of the riverbank) on the Vancouver side of the river, and almost all the way across Hayden Island (a bit less than half a mile) on the Oregon side of the river.

We know from the “River Crossing Options” report that the actual bridge itself—that is, the portion between the north and south river banks—would cost approximately $500 million to build.  What the IBR doesn’t talk about is the “approaches,” which are actually elevated viaducts that have to reach 100 feet or more into the air in order to connect to the high level crossing.  These are vastly higher (and wider) than the existing bridge approaches, which are fully at grade on the Oregon and Washington sides of the river with the current low-level lift bridge.

The mile of elevated freeway that IBR plans to build to connect its high level bridge to the existing freeway at either end of the red-shaded area is what is driving the cost of this segment of the project. If, as IBR says, the bridge structure costs $500 million, this means that most of the cost of this part of the project—as much as $1.5 to $2.0 billion—is the lengthy, elevated approaches.  What IBR has failed to do is consider how much less expensive the approaches could be if it chose one of the alternate designs (either a moveable span or an immersed tube tunnel).  Either of these designs would allow approaches to be built mostly or entirely at grade, eliminating the expense and environmental impact of elevated viaducts.  The lower level would also greatly simplify and reduce the expense of the SR 14 interchange, which currently involves convoluted spiral ramps with grades of 6 or 7 percent.

It’s also worth noting that the IBR project hasn’t itemized the cost of demolishing the existing I-5 bridges.  Because these structures cross over sensitive river habitat, and because the bridges themselves have toxic lead paint and other environmental contaminants, the cost of bridge removal could be enormous.

Engineers gone wild, said then-Congressman DeFazio

Clearly, what’s going on here is that highway engineers at ODOT and WSDOT see this project as their opportunity to build the project of their dreams.  Not just a giant bridge, but massive new interchanges, wider freeway lanes, and if people insist, a short light-rail extension.  The bigger, the better.  The grandiose and costly bias of the state highway departments has been long known to key local leaders.  Former Congressman Peter DeFazio (until last year, Chair of the House Infrastructure Committee), in a characteristically frank admission said:

“I kept on telling the project to keep the costs down, don’t build a gold-plated project,” a clearly frustrated DeFazio said. “How can you have a $4 billion project? They let the engineers loose, told them to solve all the region’s infrastructure problems in one fell swoop… They need to get it all straight and come up with a viable project, a viable financing plan that can withstand a vigorous review.”
(Manning, Jeff. “Columbia River Crossing could be a casualty of the federal budget crunch”, The Oregonian, August 14, 2011).

Later, DeFazio told Oregon Public Broadcasting:

“I said, how can it cost three or four billion bucks to go across the Columbia River?  . . . The Columbia River Crossing problem was thrown out to engineers, it wasn’t overseen: they said solve all the problems in this twelve-mile corridor and they did it in a big engineering way, and not in an appropriate way.”
“Think Out Loud,” Oregon Public Broadcasting, August 18, 2011.

At long last, there are some signs that the problems with their super-sized design are dawning on IBR staff.  Project director Greg Johnson recently let slip that IBR is now looking at a “single-level” design—something they ruled out more than a decade ago.  This may mean the states are actually going to consider a lower level crossing. IBR has also conducted a “Cost Estimate Validation Process” or CEVP—which they’ve declined to reveal to the public.  This engineering review likely highlights the cost and risk of the project’s current bloated design.

There’s no reason a $500 million bridge replacement should cost $7.5 billion.  If this project were right-sized—simply replacing the bridge structure and maintaining a low-level crossing that could connect to existing approaches, eliminating the need to rebuild seven interchanges and widen miles of freeway—the cost could be brought down substantially.


More induced travel denial

Highway advocates deny or minimize the science of induced travel

Induced travel is a well established scientific fact:  any increase in roadway capacity in a metropolitan area is likely to produce a proportional increase in vehicle miles traveled

Highway advocates like to pretend that more capacity improves mobility, but at best this is a short lived illusion.  More mobility generates more travel, sprawl and costs

In theory, highway planners could accurately model induced travel; but the fact is they ignore, deny or systematically under-estimate induced travel effects.  Models are wielded as proprietary and technocratic weapons to sell highway expansions.

Induced travel, or, as it’s otherwise known, the fundamental law of road congestion, is a particularly inconvenient fact for highway boosters.  A growing body of evidence confirms what has been observed for decades:  adding more un-priced roadway capacity in urban settings simply generates more and longer trips, and does nothing to eliminate congestion.  Day by day, the popular media are starting to communicate this seemingly counter-intuitive fact to the public.

Highway boosters simply ignore the entire concept of induced demand, or pretend that it doesn’t exist.  A new chapter in this effort to avoid this inconvenient fact comes from Arizona State University Professor Steven Polzin, writing at Planetizen.

Polzin isn’t a complete induced travel denier; he’s more an induced travel apologist and minimizer.  It may be a real thing—or might have been in the past, he assures us—but it’s not a big deal, is now adequately accounted for by state highway departments, and can safely be ignored.

Induced travel is scientific fact

Polzin derides induced demand as “a popular concept among urbanists” and argues that it’s given too much publicity in the media, by the likes of the New York Times.

But induced travel is not simply a “popular concept,” it’s a well researched scientific fact.  The best available evidence from a series of studies shows that there’s essentially a unit elasticity of travel with respect to the provision of additional highway capacity.  A whole series of studies supports this estimate, some of which are shown here.

Duranton, Gilles, and Matthew A. Turner. 2011. “The Fundamental Law of Road Congestion: Evidence from US Cities.” American Economic Review, 101 (6): 2616-52.

Hymel, Kent, 2019. “If you build it, they will drive: Measuring induced demand for vehicle travel in urban areas,” Transport Policy, Elsevier, vol. 76(C), pages 57-66.

Hsu, Wen-Tai & Zhang, Hongliang, 2014. “The fundamental law of highway congestion revisited: Evidence from national expressways in Japan,” Journal of Urban Economics, Elsevier, vol. 81(C), pages 65-76.

Garcia-López, Miquel-Àngel, Ilias Pasidis, and Elisabet Viladecans-Marsal. 2022. “Congestion in highways when tolls and railroads matter: evidence from European cities,” Journal of Economic Geography, vol. 22(5), pages 931–960.
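
The unit elasticity these studies converge on has a concrete meaning: expand lane-miles by some percentage, and vehicle miles traveled grow by roughly the same percentage. A minimal sketch, using hypothetical numbers:

```python
# Illustration of the "fundamental law": VMT responds to capacity with
# a constant elasticity near 1.0 (per Duranton & Turner and later studies).
# The corridor numbers below are hypothetical, for illustration only.
def induced_vmt(base_vmt: float, capacity_increase_pct: float,
                elasticity: float = 1.0) -> float:
    """Predicted VMT after a capacity expansion, constant-elasticity form."""
    return base_vmt * (1 + capacity_increase_pct / 100) ** elasticity

base = 10_000_000                    # daily VMT in a hypothetical corridor
expanded = induced_vmt(base, 20)     # add 20% more lane-miles
print(f"VMT grows by {expanded / base - 1:.0%}")  # unit elasticity: 20%
```

With an elasticity of 1.0, a 20 percent capacity expansion yields a 20 percent increase in driving—which is exactly why added lanes fill up rather than relieving congestion.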

It’s odd that Polzin, a university professor, provides only a list of popular media articles (which he disbelieves) and not a single footnote or reference to a peer-reviewed academic study to dispute the notion of induced travel.

Purported mobility gains are an illusion

Sure, there may be some induced travel, Polzin argues, but don’t overlook the benefits of greater mobility.  This misses the point that greater mobility (i.e., driving more and farther) is evidence of induced travel, not a refutation.  And any mobility gain tends to be short-lived and costly. Our friend and colleague, Todd Litman of the Victoria Transport Policy Institute, has a compelling rebuttal to Polzin on this point at Planetizen.

Polzin pleads with us to recognize the “mobility” benefits that come from increased highway capacity.  He misses two things here:  first, the key insight from the research on induced travel is that the mobility gains are at best a temporary illusion.  Somewhat faster moving traffic prompts more trip taking and longer trips, which quickly erodes any mobility gains.  And greater mobility simply prompts greater decentralization and sprawl, so even in places where traffic moves faster, everyone has to travel farther—and that comes at a real social, environmental and economic cost.

In effect, Polzin says that traffic growth is just due to population growth, and is inevitable, and good.  But he completely ignores the clear cross-sectional evidence from US metropolitan areas:  Metro areas with fewer lane miles of roads have shorter travel distances.  And far from being economically constrained, metro areas with less roadway capacity sprawl less, reducing public sector infrastructure costs, and creating a “green dividend” for their residents, who don’t have to drive as far.  The average resident of Portland drives about half as far every day as the average resident of Houston.  And, as we’ve documented at City Observatory, people who live in cities where people drive less are happier with their transportation systems.

Predict and provide = Prevaricate and pave

For decades, state highway departments have used their control over opaque and technocratic travel demand models to build a case for ever more highway capacity. Their “predict and provide” approach is the bureaucratic manifestation of induced travel.  Polzin never quite acknowledges this history, but instead suggests that we should simply trust highway planners to build new traffic models that account for induced demand.

Much of the reporting on induced demand gives the impression that the transportation planning community is oblivious to this phenomenon or is comprised of road-building zealots. Newer activity-based transportation models are designed such that activity generation (trip generation) is sensitive to travel times. Consequently, improvements in travel speed will contribute to predictions of increased trip-making and travel distance. Even without the newest models, scenario testing and careful analysis of changes in demographics, mode choices, and flow volumes and patterns can give insight into the nature of demand on new facilities.

In theory, state highway departments could build models that accurately reflect induced travel.  But the simple fact is that they don’t.  To the contrary, a recent published article on the practice of state highway travel forecasting looked at this specific issue, and found just the opposite:  Induced travel effects are routinely ignored by state highway departments, and induced travel is generally introduced into highway environmental assessments only at the behest of public critics.  Those few state highway efforts that do consider induced demand wildly understate likely effects.  Highway departments continue to produce models that exaggerate future travel demand growth even in the face of demonstrable capacity constraints—as Norm Marshall puts it, “forecasting the impossible.”  And some, like Oregon, simply deny that induced travel is real, and prohibit their modelers from using scientifically based tools that estimate induced travel.

In a similar vein, Polzin solemnly intones that future transportation projects ought to be based on sound projections of future demand.

Roadway investments in new capacity should be based on up-to-date and sound demand estimates. They can’t just fulfill out-of-date plans or serve as ill-advised opportunities to create jobs or garner state and federal resources for local use. They should not use twentieth-century per capita travel growth rates or chamber of commerce-inspired population growth assumption

But there’s precious little evidence that state highway departments do anything of the sort.  They routinely plan for highway capacity expansions on roads where traffic is declining.  The Oregon Department of Transportation proposes expanding capacity at the Rose Quarter at a cost of $1.45 billion, even though traffic levels on that particular roadway have been declining for 25 years.  Cincinnati’s Brent Spence Bridge is slated for a massive $3.5 billion expansion, even though its traffic has been flat for more than a decade.  And other state highway departments routinely produce “hockey stick” traffic forecasts that are simply never realized.

The underlying problem that highway advocates fail to acknowledge is that road users will typically use added highway capacity only if they don’t have to pay for it.  In the very limited instances in which drivers are asked to pay for even a fraction of the cost of providing increased road capacity, demand disappears.  The evidence from tolled roadways like Louisville’s I-65 bridge is that most people are unwilling to pay even a small fraction of the cost of freeway widening projects that would save them travel time.  That shows that the only reason people drive on expanded roadways is that someone else pays for them.  That’s pretty much the definition of induced travel.

Polzin’s piece is subtitled: “Induced demand is a popular concept among urbanists, but does its pervasiveness obscure the true costs of mobility?” This is a classic example of Betteridge’s law of headlines, the adage that states: “Any headline that ends in a question mark can be answered by the word no.”  Induced travel is real, and at this point, only highway advocates, and their apologists, like Polzin, are in any doubt about what this means.

The Case Against the Interstate Bridge Replacement

Here are our 16 top reasons Oregon and Washington need to re-think the proposed Interstate Bridge Replacement Project.  The bloated size of the project and its $7.5 billion cost, and the availability of better alternatives, like a bascule bridge, call for rethinking this project, now.

  1. It’s not a bridge, it’s a freeway widening and interchange rebuilding project.  Contrary to the project’s name, it’s not merely a “bridge replacement.”  The bulk of the cost is widening 5 miles of freeway and rebuilding 7 major interchanges.  IBR’s own “River Crossing Options” study says the proposed IBR bridge only costs $500 million.

  2. The budget is out of control: $7.5 billion.  In 2020, the IBR was projected to cost a maximum of $4.8 billion. The price tag for the project jumped 54 percent in December, 2022.  The total cost is now estimated at $7.5 billion, but ODOT has a long history of having its major projects end up costing twice as much as budgeted.  Contrary to claims made by the IBR, recent construction cost inflation accounts for only $300 million of the more than $2.5 billion cost increase since 2020.
  3. A tunnel or bascule bridge would be vastly cheaper, avoiding the need to widen the freeway and rebuild interchanges. IBR’s design will allow only 116 feet of navigation clearance, and IBR has refused to seriously consider either an immersed tube tunnel or lower level bascule bridge, both of which would eliminate most or all bridge lifts, and eliminate the need to rebuild interchanges on I-5. The I-95 Woodrow Wilson Bridge in Washington DC is a recently constructed bascule, and carries twice as much traffic as the I-5 bridges.
  4. It’s really a 12-lane wide freeway.  The IBR likes to describe the project as just adding “auxiliary lanes” to I-5, but a close look at its actual plans shows it will build a 164-foot wide highway bridge–enough for as many as 12 lanes.  Once built, ODOT and WSDOT can easily re-stripe this very wide structure as a 12-lane roadway.
  5. ODOT is ignoring the Coast Guard’s direction.  The Coast Guard, which has authority to regulate bridge height, says that IBR’s bridge needs to have a 178-foot clearance over the Columbia River.  With the CRC, the failure to follow Coast Guard guidance resulted in a costly year-long delay as the project was redesigned.
  6. ODOT’s high, fixed span crossing creates dangerous and expensive elevated roadways and steep on-and-off ramps. The IBR would have a main span with a grade of 4 percent, higher than almost every interstate bridge in the US, and ramps would have 6-7 percent grades.  The steep grades will slow trucks and create dangerous conditions in winter weather.
  7. Planned tolls of up to $5.69 each way will permanently reduce traffic to less than 90,000 vehicles per day (from 135K today).  IBR has refused to release its proposed toll rates.  Documents obtained by public records request show IBR is looking at tolls as high as $5.69 each way at the peak hour.  According to the Investment Grade Analysis performed for the Columbia River Crossing in 2013, even $3 tolls would permanently reduce traffic on I-5 to less than 90,000 vehicles per day–dramatically below its current traffic level of 135,000.
  8. High IBR tolls would produce gridlock on I-205.  The IBR project plans to toll the new I-5 bridge, but not the parallel I-205 Glen Jackson Bridge.  The Investment Grade Analysis prepared for the Columbia River Crossing in 2013 concluded that this would divert tens of thousands of vehicles to I-205, producing gridlock on the I-205 bridge.
  9. ODOT has ignored its own expert panel which recommended breaking the project into three independent phases.  In 2010, Governors Kulongoski and Gregoire appointed a panel of national bridge and highway experts to review the Columbia River Crossing.  They recommended that the project be broken into three separate, independent phases, to minimize financial risk.  They also recommended eliminating one or more interchanges to improve traffic flow, reduce cost and simplify bridge design.
  10. IBR traffic projections have been proven dramatically wrong:  They grossly over-estimate future traffic levels on the existing bridge, which is capacity constrained.  The CRC FEIS predicted I-5 traffic growth of 1.3 percent per year; actual growth was 0.3 percent per year through 2019. They also fail to accurately predict future traffic levels.  The independent Investment Grade Analysis in 2013 showed that the IBR forecasts overstated future I-5 traffic levels by about 80,000 vehicles per day, leading to the design of a grossly over-sized project.
  11. IBR staff altered the output of Metro’s traffic models, and increased predicted peak hour traffic on the existing I-5 bridge above that predicted by the Metro model, and in excess of the actual physical capacity of the bridge.  This so-called “post-processing“–which isn’t documented according to ODOT’s own analysis procedures–inflated no-build traffic volumes, artificially worsened predicted future congestion, and created a false baseline for assessing the need for and impacts of the proposed bridge widening.
  12. The IBR project mostly benefits Washington residents.  According to Census data produced by IBR, approximately 80 percent of daily commuters across the Columbia River are Washington residents.  According to a license plate survey conducted for the two states, twice as many Washington cars use the I-5 bridge as do Oregon cars.  Yet Oregon will have to pay just as much as Washington state, plus pay for the entire cost of the $1.45 billion Rose Quarter project (which is heavily used by Washington commuters).
  13. IBR has falsely portrayed the income, race and ethnicity of typical bridge users.  The median peak hour drive-alone commuter from Clark County Washington to jobs in Oregon has a household income of $106,000.  About 86 percent of these commuters are non-Hispanic whites.  These commuters are whiter and have higher incomes than the rest of the Portland metropolitan area, and are half as likely to be people of color as the region’s population.
  14. IBR has no meaningful cost controls.  ODOT & WSDOT claimed in legislative testimony in December 2022 that future cost escalation would be managed using a “Cost Estimate Validation Process (CEVP)” that they say that had already completed.  A public records request showed that no documentation existed for the CEVP.
  15. IBR has put off doing an “Investment Grade Analysis,” which will be required for federal TIFIA loans and toll bonds.  The investment grade analysis done for the CRC showed that traffic would be dramatically lower, and tolls would have to be dramatically higher, than the figures ODOT and WSDOT used to sell the CRC.
  16. A massive IBR will be a visual blight on Vancouver’s revitalized waterfront, and a massive viaduct across Hayden Island.  The elevated approaches required by IBR’s 116 foot high fixed span are the equivalent of three Marquam Bridges side by side as they cross the waterfront in downtown Vancouver. Seattle just spent several billion dollars to remove a similar waterfront eyesore.

What we should do instead.

  1.  Refocus the project on replacing the bridge, not widening the freeway
  2. Re-appraise low-cost alternatives to a high fixed span (a bascule bridge or immersed tube tunnel) that could use existing approaches and eliminate the expense of rebuilding interchanges and creating massive elevated viaducts.
  3. Right-size the bridge’s capacity to reflect the traffic levels that can be expected with tolling

Note:  This commentary has been updated to include additional images and links.

What new computer renderings really show about the IBR

The Interstate Bridge Project has released—after years of delay—computer graphic renderings showing possible designs for a new I-5 bridge between Vancouver and Portland.  But what they show is a project in real trouble.  And they also conceal significant flaws, including a likely violation of the National Environmental Policy Act.  Here’s what they really show:

  • IBR is on the verge of junking the “double-decker” design it has pursued for years.
  • It is reviving a single-deck design that will be 100 feet wider than the “locally preferred alternative” it got approved a year ago.
  • The single deck design is an admission that critics were right about the IBR design having excessively steep grades.
  • The single deck design has significant environmental impacts that haven’t been addressed in the current review process; The two states ruled out a single deck design 15 years ago because it had greater impacts on the river and adjacent property.
  • IBR’s renderings are carefully edited to conceal the true scale of the bridge, and hide impacts on downtown Vancouver and Hayden Island.
  • IBR has blocked public access to the 3D models used to produce these renderings, and refused to produce the “CEVP” document that addressed the problems with the excessive grades due to the double-deck design.
  • The fact that IBR is totally changing the bridge design shows there’s no obstacle to making major changes to this project at this point.

The actual appearance of the proposed $7.5 billion Interstate Bridge Replacement project has been a carefully guarded secret. IBR has finally produced renderings of what the bridge might look like, and they conceal more than they reveal.  All of the renderings are shown from a distant vantage point—probably a mile or so away from the actual bridge—making it look tiny.  And the renderings don’t show how much larger the proposed new bridges are than the existing bridge.  The renderings are also carefully crafted so you can’t tell how tall the bridge will be in relation to the buildings in downtown Vancouver (it will be taller than most of them), nor does it show a lengthy elevated viaduct that will tower over most of Hayden Island. What the renderings do show is that IBR is now almost fully committed to a single-level bridge design.  Whereas prior renderings never showed a single-level bridge, five of the six designs presented on the IBR website are single-level bridges, and only one is the double-decker design the IBR has been pushing for more than a decade.

 

And none of these renderings show the actual width of either the single- or double-deck versions.  Other ODOT documents—not included with the renderings—show the single-deck designs will be more than 270 feet wide—nearly as wide as a football field is long.  We know that IBR has developed a sophisticated 3D model—a “digital twin” of the project.  IBR consultants bragged about the state-of-the-art detail of the model in a presentation to a professional group in Seattle earlier this year, but said they couldn’t share the illustrations, because:

 “There is a very detailed 3D model. I was going to try and show it . . . It’s very, very, it’s kept under wraps quite a bit, and I think it’s because of their experience with the first round, trying to tread carefully.” 

We filed a public records request with WSDOT and in response, they claimed that the only “model” was a rendering released on January 20, 2022, and that they are ignorant of this work—even though contractor WSP and software provider Bentley prominently tout this “digital twin” work for IBR on their websites.  And obviously, IBR had this 3D model in place to produce the renderings it released on May 25.  It’s plain that ODOT and WSDOT don’t want people to see what they are planning to build.

Junking the double decker design

What these new renderings signify is that the Oregon and Washington DOTs are junking the double-decker design they’ve been pushing for the Interstate Bridge Replacement for more than a decade, and instead are planning a much wider single-level bridge.

Since 2008, ODOT and WSDOT have only been looking at a pair of double-decker bridges to replace the existing I-5 crossing.  Each of these bridges would be about 90 feet wide, with room for six highway lanes on the top deck of each bridge, and provision for light rail, bikes and pedestrians on lower levels.

As part of the project’s draft environmental impact statement, the two highway departments considered, and rejected, a single-level design, because it would have had greater impacts on the river (more piers in the river, more cover over the river, and greater visual impacts).  Only the double-decker design was advanced to the Final Environmental Impact Statement, adopted in 2011.

Now, suddenly, IBR is pushing a slew of single-level designs.  We say “suddenly” because IBR made no mention of a single level option until February of 2023—almost a year after it asked all of its local partners to sign off on a “Modified Locally Preferred Alternative” that consisted solely of the double decker bridge.

As we wrote in February, this sudden change of heart vindicates one of the key criticisms of the IBR design—that its high fixed span necessitates very steep grades, both for the mainline highway section, and especially for the bridge’s off-ramps.  The grade for the mainline would be as much as 3.99 percent—well in excess of the DOTs’ own guidelines for freeway grades, and among the steepest interstate bridges in the nation.  The grades on on- and off-ramps would be even higher, as much as 6-7 percent.  Notably, each of the single-level designs allows the roadway to be set much closer to the river, enabling shorter structures and shallower grades.

The key factor increasing the grades of the highway section of the project is the combination of its high river clearance (the IBR design calls for a 116′ navigation clearance underneath the bridge) and the proposed double-decker design (with the top highway deck elevated about 35 feet above the lower transit/active transportation deck).
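The arithmetic behind this can be sketched in a few lines.  This is a simplified constant-grade model, not an engineering calculation: the 116-foot clearance, the roughly 35-foot deck separation, and the 4 percent grade come from the figures above, while the `approach_length_ft` helper and the variable names are ours, for illustration only.

```python
# Simplified model: horizontal run needed to climb to a given deck height at a
# constant grade.  Real approaches use vertical curves, so treat these numbers
# as rough illustrations of scale, not design figures.

def approach_length_ft(deck_height_ft: float, grade_pct: float) -> float:
    """Horizontal distance required to reach deck_height_ft at grade_pct."""
    return deck_height_ft / (grade_pct / 100.0)

clearance_ft = 116.0        # navigation clearance cited for the IBR design
deck_separation_ft = 35.0   # top highway deck above the lower transit deck

double_deck_roadway = clearance_ft + deck_separation_ft  # roadway at ~151 ft
single_deck_roadway = clearance_ft                       # roadway at ~116 ft

for label, height in [("double-deck", double_deck_roadway),
                      ("single-deck", single_deck_roadway)]:
    print(f"{label}: {approach_length_ft(height, 4.0):,.0f} ft of approach at a 4% grade")
```

On these assumptions, dropping the roadway by the deck separation shortens each 4-percent approach by nearly 900 feet—which is, presumably, the engineering logic behind the sudden interest in a single-level span.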

A Bridge Too Steep and the Secret CEVP Report

What prompted the sudden inclusion of the single deck design?  As we wrote in February, the key intervening event was a project evaluation called the “Cost Estimate Validation Process” or CEVP, which is designed to identify and assess risks to project costs and completion.  It seems highly likely that this review identified the steep grades on the bridge and approaches as a cost, schedule and approval risk.  That’s almost undoubtedly what prompted the sudden interest in the single-deck design, after years of exclusion.  We say “almost undoubtedly” because IBR has refused to release the CEVP analysis.  When we first asked for the CEVP, in December 2022, WSDOT claimed that “no such document exists.”  It has since released only a cursory and uninformative one-page summary, even though it subsequently reported that the CEVP involved creating a “risk register” that identified more than 200 risks.

A QRA [quantitative risk assessment] was performed for the IBR program based on CEVP methodology. The objectives of the QRA were to provide independent review of program cost and schedule estimates and to quantify the uncertainty and risk associated with those estimates. A risk assessment workshop was held October  10 to 14, 2022, and was attended by IBR program team members, partners, and subject matter experts (SMEs) from WSDOT, ODOT, local agency partners, and industry. A risk register was developed for the program; the register identified specific risks (threats and opportunities) to the program cost and schedule. A total of 201 risks were identified, of which 121 were determined to be significant. Risks were characterized and quantified by consensus (i.e., collective professional judgment) of the SMEs assembled for the workshop.
Financial Plan, March 2023, pages 4-2 to 4-3.

It’s not unusual for agencies to make some tweaks to a project once it has gone through the environmental review process, but the usual claim that the DOTs make is that these tweaks are okay as long as they don’t increase the project’s “footprint.”  That’s a legally dubious assertion, but, in this case, shifting to a single level bridge actually increases the project’s literal footprint by over 50 percent:  According to ODOT’s own estimates, the double-decker bridge design would be about 173 feet wide, while the single-level bridge would be about 272 feet wide.
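The footprint claim is easy to check from the widths ODOT itself cites; the comparison below is our own back-of-the-envelope check, with illustrative variable names.

```python
# Widths from ODOT's own estimates, as cited in the text.
double_decker_width_ft = 173.0
single_level_width_ft = 272.0

# Percentage increase in width moving from double-decker to single-level.
increase_pct = (single_level_width_ft / double_decker_width_ft - 1.0) * 100.0
print(f"Footprint increase: {increase_pct:.0f}%")
```

That works out to roughly a 57 percent increase—consistent with the “over 50 percent” figure above.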

For four years, the Oregon and Washington highway departments have been pushing a revival of the failed multi-billion dollar I-5 Columbia River Crossing.  Their key sales pitch is that the size and design of the project can’t vary in any meaningful way from the project’s decade-old record of decision, for fear of delaying construction or losing federal funding.

Far from being a minor change, this represents the revival of an alternative design that was ruled out more than a decade ago.  It also shows that the IBR project is effectively conceding that its critics were right: they have alleged that its double-decker “modified locally preferred alternative” has serious safety and cost problems due to its excessive grade and elevated off-ramps.  Finally, and perhaps most importantly, it shows that warnings that major changes couldn’t be made to the project out of a fear of delays were simply baseless manipulation—a familiar highway department tactic.

Resurrecting a discarded 15-year old alternative

When he first revealed that IBR was considering a single level design in February of 2023, IBR administrator Greg Johnson made a point of claiming that the single-level design isn’t “new.”  It isn’t; it’s quite old.  To hear the Oregon and Washington transportation departments tell it, it’s so old that it’s been dead and buried for almost 15 years.

The last official ODOT and WSDOT document featuring a single level crossing design was nearly 15 years ago:  the 2008 Draft Environmental Impact Statement.  It proposed two possible designs for replacement bridges for the current I-5, a pair of side-by-side double-decker bridges (which were chosen as the preferred design), and a trio of single-level bridges, as shown here.

The project’s Final Environmental Impact Statement, issued in 2011, abandoned the single-level option, and chose to proceed only with a pair of double-decker bridges (with transit and bike-ped access placed on the lower level of each structure).  Also:  Notice that the Final Environmental Impact Statement omitted the notations showing the actual width of the proposed structures—part of an effort to conceal the fact that the bridges would be built wide enough to accommodate 12 full lanes of traffic.

 

The Final Environmental Impact Statement made a strong series of findings rejecting the single-level three-bridge design, because it would have more in-river impacts, a larger surface area with more runoff, and a larger visual impact.  [CRC FEIS, Page 2-83]

The single-level design is considerably wider than the two-bridge double-decker design, as shown in this 2007 rendering prepared by IBR.

It’s back: an even wider bridge across the Columbia.

 

It’s not too late to make fundamental changes to the plan

Greg Johnson has cried “wolf” about making serious changes to the IBR project, even as its budget has ballooned by 54 percent in a little over two years, to a total price tag of as much as $7.5 billion.  But this latest—and very late—change to the project design is an indication that it’s not too late to fix the fatal flaws in this project.  Right now the fatal flaws revolve around its bloated design and price.  The reason the project is so expensive has little to do with the bridge structure itself, but rather the extravagant plans of ODOT and WSDOT to widen I-5 for miles on either side of the Columbia River, and rebuild, at much greater expense than the bridge itself, seven different freeway interchanges.  If this were simply a bridge replacement—as its name claims—the project would be vastly simpler, less expensive, and likely not controversial.

For the past four years, IBR has maintained it’s far too late to make any design changes to the IBR project.  Ever since he took the job of IBR administrator, Greg Johnson has been warning elected officials not to make any significant changes to the project design included in the 2011 FEIS for fear of delaying it further.  An immersed tunnel?  More consideration for climate?  A lift-span?  A narrower freeway?  None of these can even be studied, or advanced into the environmental review process, for fear that it will cause some additional delay.

But now, what about that inviolable “Modified Locally Preferred Alternative” that you couldn’t touch in any way without endangering the project’s schedule and jeopardizing federal funding?  Well, IBR staff have unilaterally decided it won’t actually work, and they’re pushing ahead with an entirely new and much wider design, and trying to shoehorn it into the federal environmental review process without honestly disclosing the major changes they’ve made.

More than six months after theoretically getting buy-off from all of the project’s eight partners for this untouchable design, and spending tens of millions of dollars defining the “modified locally preferred alternative,” Johnson has suddenly decided that he can unilaterally inject back into the discussion an alternative that the project ruled out more than a decade ago. And make no mistake, changing from double-decker bridges to a single level crossing has significant impacts.  It almost certainly means more piers in the Columbia River, and more real estate disruption, particularly on the steadily redeveloping Vancouver waterfront.

For the record this isn’t the first, or even the second, time the engineers at ODOT and WSDOT have screwed up the design of the proposed river crossing.  In 2010, an Independent Review Panel appointed by Oregon Governor Ted Kulongoski and Washington Governor Chris Gregoire found that the “open web” design the agencies proposed was “unbuildable.”  It was replaced by the double-decker truss.  And then, in 2011, the bridge had to be re-designed again to achieve a river clearance of 116 feet, because the two highway departments couldn’t bludgeon the Coast Guard into accepting their preferred 95 foot clearance.  Both these engineering errors delayed the project and raised its cost—something you’ll never hear ODOT and WSDOT admit.

Why now? 

The problems with the bridge grade were first identified more than a decade ago, when the Coast Guard’s objections led ODOT and WSDOT to hastily redesign the Columbia River Crossing to provide a 116-foot navigation clearance (21 feet higher than what the two highway agencies were then planning).  ODOT and WSDOT never resolved the questions that were raised about the project’s excessive grade, particularly concerns that steep bridge grades would cause large trucks to slow and impede traffic flow.  Following Johnson’s insistent demand that no changes be made to the project defined in the Columbia River Crossing FEIS, IBR has stuck to the steep, double-deck design, never questioning the grade.

But late last year, IBR had to produce a new cost estimate.  Embarrassingly, the cost of the IBR project has ballooned by 54 percent to nearly $7.5 billion.  To deflect criticism about higher costs, IBR officials testified in December that the project was also subjected to a “Cost Estimate Validation Process,” or CEVP, which the state DOTs advertised as a sure-fire cure for future cost escalation.  As we pointed out at City Observatory, no documentation exists for that claimed CEVP.  The Washington Department of Transportation responded to a public records request for copies of the CEVP by saying “no documents exist.”  Because the agencies have shrouded this process in secrecy we can’t say for sure, but it seems likely that the CEVP process identified the bridge grade, and the expense of elevated interchanges, as major cost, schedule and design risks to the project.  That would explain why, more than six months after locking down a double-decker “modified locally preferred alternative,” Johnson and the IBR team are suddenly reviving the discarded single-level bridge plan.

IBR’s Stacked Highway Bridge Alternative (2021)

For reference, we’re providing details of the alternative designs that have been considered by the IBR in the past decade.  As noted above, the last time any of the project’s documents mentioned a single-level crossing was in the 2008 Draft Environmental Impact Statement.  Most recently, in October 2021, when it last listed the alternative bridge designs it was studying, IBR made absolutely no mention of a “single-level bridge”.  In fact, the only alternative design they showed was pretty much the opposite:  a single, larger stacked bridge, with highway lanes on the upper and lower levels of the double-decker bridge, and with transit and bike-pedestrian routes cantilevered on the sides of the lower level of the double decker.  And now, when it comes time to produce actual renderings, the stacked single-bridge alignment has simply disappeared without a trace.

 

IBR floats new bridge design, proving critics right

For four years, the Oregon and Washington highway departments have been pushing a revival of the failed multi-billion dollar I-5 Columbia River Crossing.  Their key sales pitch is that the size and design of the project can’t vary in any meaningful way from the project’s decade-old record of decision, for fear of delaying construction or losing federal funding.

Months after choosing a “locally preferred alternative” and after years of warning people that moving away from the 2011 design of the CRC would cause enormous delays, IBR is moving to resurrect a bridge design it ruled out more than a decade ago.

A single level crossing would be significantly wider than the current proposal for a pair of double-decker bridges.  Instead, the project would consist of two or three side-by-side, single level bridges, carrying multiple lanes of traffic, light rail trains, bikes and pedestrians all on one level.

The single level crossing would dramatically increase the I-5 footprint, particularly where it crosses the shoreline into downtown Vancouver.

The sudden decision to revive this long-discarded alternative clearly vindicates criticisms raised by independent engineers that the proposed double-decker bridge is too steep; the single level design enables a lower bridge grade.  It also shows that the highway department’s claims that the project’s design can’t be changed are simply false.

IBR Suddenly Announces a New Bridge Design

On February 9, 2023, IBR Administrator Greg Johnson off-handedly slipped this little gem into a routine briefing for the project’s community advisory group.

He told them:  “We’re looking at a bridge configuration of a single level.”

And Johnson immediately interjected, “that is something that is not new.”

He went on to explain that this gives them added choices for “bridge types and bridge aesthetics.”

Here’s the full quote, and following it a link to the meeting video:

Right now we are on target, we’re on task. And the team is driving forward with technical reports that will go out to the cooperating agencies and partners. We’re also working on within the supplemental we’re working on different technical aspects to make sure that we are covering potential design elements. We are looking at a bridge configuration of a single level. So that is something that is not new, but it is something that we wanted to make sure within the draft Supplemental Environmental Impact Statement so folks can see the potential impacts of, of what having all of the modes on one level rather than having transit underneath the lane and having the Bike Ped underneath the lane, we have an option that shows them all at one level. So once again, it’s something that we’re studying the impacts of and we will have those two bridge configurations going forward. We know that one level gives us some some some interesting options as far as bridge types and bridge aesthetics that we don’t get with having transit underneath and having Bike-Ped underneath. So we will be looking at that and you will be seeing at an upcoming meeting some renderings that display these potential configurations.

 

Far from being a minor change, this represents the revival of an alternative design that was ruled out more than a decade ago.  It also shows that the IBR project is effectively conceding that its critics were right: they have alleged that its double-decker “modified locally preferred alternative” has serious safety and cost problems due to its excessive grade and elevated off-ramps.  Finally, and perhaps most importantly, it shows that warnings that major changes couldn’t be made to the project out of a fear of delays were simply baseless manipulation.

Resurrecting a discarded 15-year old alternative

As we mentioned, IBR administrator Greg Johnson made a point of claiming that the single level design isn’t “new.”  It isn’t; it’s quite old.  To hear the Oregon and Washington transportation departments tell it, it’s so old that it’s been dead and buried for almost 15 years.

The last official ODOT and WSDOT document featuring a single level crossing design was the 2008 Draft Environmental Impact Statement.  It proposed two possible designs for replacement bridges for the current I-5, a pair of side-by-side double-decker bridges (which were chosen as the preferred design), and a trio of single level bridges, as shown here.

 

The project’s Final Environmental Impact Statement, issued in 2011, abandoned the single level option, and chose to proceed only with a pair of double-decker bridges (with transit and bike-ped access placed on the lower level of each structure).

 

The Final Environmental Impact Statement made a strong series of findings rejecting the single level three-bridge design, because it would have more in-river impacts, a larger surface area with more runoff, and a larger visual impact.  [CRC FEIS, Page 2-83]

The single-level design is considerably wider than the two-bridge double-decker design, as shown in this 2007 rendering prepared by IBR.

It’s back: an even wider bridge across the Columbia.

Apparently we can reconsider the design of the crossing, even at this late date.  Ever since he took the job of IBR administrator more than three years ago, Greg Johnson has been warning elected officials not to make any significant changes to the project design included in the 2011 FEIS for fear of delaying it further.  An immersed tunnel?  More consideration for climate?  A lift-span?  A narrower freeway?  None of these can even be studied, or advanced into the environmental review process, for fear that it will cause some additional delay.

But now, more than six months after theoretically getting buy-off from all of the project’s eight partners for this untouchable design, and spending tens of millions of dollars defining the “modified locally preferred alternative,” Johnson has suddenly decided that he can unilaterally inject back into the discussion an alternative that the project ruled out more than a decade ago.

And make no mistake, changing from double-decker bridges to a single level crossing has significant impacts.  It almost certainly means more piers in the Columbia River, and more real estate disruption, particularly on the steadily redeveloping Vancouver waterfront.

A bridge too steep

While Johnson claims that the single level design allows some more aesthetic options, that’s simply misdirection.  The real reason that IBR is changing its design at this extremely late date is that it has suddenly realized that one of its most persistent critics was right, all along.  For years, engineer Bob Ortblad—who advocates for an immersed tube tunnel crossing—has been pointing out that the proposed IBR bridge design has a dangerously steep grade (nearly 4 percent).  This would make it one of the steepest interstate highway bridges in the country.  Just to hammer the point home:  the Biden Administration just approved a grant of $150 million toward the reconstruction of the I-10 bridge in Louisiana, currently the steepest interstate, to reduce the grade of the bridge to improve safety.  It’s also worth noting that the current IBR bridge design violates ODOT’s own standards for interstate highway bridge grades, and would require a design exception.  In addition to the safety hazard caused by the bridge grade, the extreme elevation of the roadway requires very steep on- and off-ramps, especially those connecting the bridge with Washington State Route 14, which runs very near the riverbank.  Those ramps would have even steeper and more dangerous grades than the bridge itself, a point Ortblad has made graphically:

Proposed IBR would have 4% mainline grades and 6-7% ramp grades (B. Ortblad)

What Johnson didn’t say—and what’s plainly the real reason for a single level crossing—is that it enables the engineers to lower the roadway by as much as 30 to 35 feet, consequently reducing the overall grade, and importantly, lowering the height of on- and off-ramps at either end of the bridge crossing.  The current LPA design calls for a minimum river clearance of 116 feet for the bottom level of each double-decker bridge.  The roadway would be on top of the double-decker, about 30-35 feet higher.  A single level design could lower the maximum height of the bridge by about 30-35 feet, enabling a lower grade.

Of course, the last thing IBR officials want to do is concede that Ortblad was right—that would damage their disinformation campaign about the merits of the immersed tube tunnel.  Instead, they’re suddenly concerned about bridge type and aesthetics.

Why now? 

The problems with the bridge grade were first identified more than a decade ago, when the Coast Guard’s objections led ODOT and WSDOT to hastily redesign the Columbia River Crossing to provide a 116-foot navigation clearance (21 feet higher than what the two highway agencies were planning).  ODOT and WSDOT never resolved the questions that were raised about the project’s excessive grade, particularly concerns that steep bridge grades would cause large trucks to slow and impede traffic flow.  Following Johnson’s insistent demand that no changes be made to the project defined in the Columbia River Crossing FEIS, IBR has stuck to the steep, double-deck design, never questioning the grade.

But in the past two months, IBR had to produce a new cost estimate.  Embarrassingly, the cost of the IBR project has ballooned by 54 percent to nearly $7.5 billion.  To deflect criticism about higher costs, IBR officials testified in December that the project was also subjected to a “Cost Estimate Validation Process,” or CEVP, which the state DOTs advertised as a sure-fire cure for future cost escalation.  As we pointed out at City Observatory, no documentation exists for that claimed CEVP.  The Washington Department of Transportation responded to a public records request for copies of the CEVP by saying “no documents exist.”  Because the agencies have shrouded this process in secrecy we can’t say for sure, but it seems likely that the CEVP process identified the bridge grade, and the expense of elevated interchanges, as major cost, schedule and design risks to the project.  That would explain why, more than six months after locking down a double-decker “modified locally preferred alternative,” Johnson and the IBR team are suddenly reviving the discarded single level bridge plan.

It’s not too late to make fundamental changes to the plan

Greg Johnson has cried “wolf” about making serious changes to the IBR project, even as its budget has ballooned by 54 percent in a little over two years, to a total price tag of as much as $7.5 billion.  But this latest—and very late—change to the project design is an indication that it’s not too late to fix the fatal flaws in this project.  Right now the fatal flaws revolve around its bloated design and price.  The reason the project is so expensive has little to do with the bridge structure itself, but rather the extravagant plans of ODOT and WSDOT to widen I-5 for miles on either side of the Columbia River, and rebuild, at much greater expense than the bridge itself, seven different freeway interchanges.  If this were simply a bridge replacement—as its name claims—the project would be vastly simpler, less expensive, and likely not controversial.

IBR’s Stacked Highway Bridge Alternative (2021)

For reference, we’re providing details of the alternative designs that have been considered by the IBR in the past decade.  As noted above, the last time any of the project’s documents mentioned a single level crossing was in the 2008 Draft Environmental Impact Statement.  Most recently, in October 2021, when it last listed the alternative bridge designs it was studying, IBR made absolutely no mention of a “single-level bridge”.  In fact, the only alternative design they showed was pretty much the opposite:  a larger stacked highway bridge, with highway lanes on the upper and lower levels of the double-decker bridge, and with transit and bike-pedestrian routes cantilevered on the sides of the lower level of the double decker.

Nothing but double-deckers: the Bridge Review Panel Report of 2011

In 2010, an expert review panel appointed by Governors Kulongoski and Gregoire found that the proposed “open-web” design being pushed by ODOT and WSDOT was “unbuildable.”  That led to the appointment of a “Bridge Review Panel” to quickly come up with a new alternative.  They recommended three possible alternatives in their 2011 report:  the composite truss design (which became the locally preferred alternative), and two other designs:  a cable stayed bridge and a tied arch bridge.  All three designs shared a common feature:  they were double-deckers with the transit component on a lower level of the bridge.  The cable stayed and tied arch designs had elevated bike-pedestrian paths in the center of the bridge, between the north and south bound highway lanes.

Here’s the Bridge Review Panel’s illustration of the cable stayed bridge.  The two dotted outlines in the center of the bridge structure on the cross-section illustration are the profile for the light rail transit.

Here’s the Bridge Review Panel’s illustration of the tied arch bridge.  Again, the two dotted outlines in the center of the bridge structure on the cross-section illustration are the profile for the light rail transit.


Why should Oregonians subsidize suburban commuters from another state?

Oregon is being asked to pay for half of the cost of widening the I-5 Interstate Bridge.  Eighty percent of daily commuters, and two-thirds of all traffic on the bridge are Washington residents.  On average, these commuters earn more than Portland residents.

The 80/20 rule:  When it comes to the I-5 bridge replacement, users will pay for only 20 percent of the cost of the project through tolls.  Meanwhile, for the I-205 project in Clackamas County, users—overwhelmingly Oregonians—will pay 80 percent or more of the cost in tolls.

Meanwhile, state legislators are looking—for the first time—to raid the state’s General Fund (which is used to pay for schools, health care, and housing) to pay for roads by subsidizing the Interstate Bridge Replacement Project to the tune of $1 billion.

The proposal for Oregon to fund half of the cost of the Interstate Bridge Replacement is a huge subsidy to Washington State commuters and suburban sprawl.

A draft proposal currently circulating in the Oregon Legislature—the so-called “-2” amendments to HB 2098—would have Oregon General Fund taxpayers contribute $1 billion to the cost of the proposed Interstate Bridge Replacement Project.  That’s a huge break from established tradition.  For the better part of a century, Oregon has theoretically had a “user pays” transportation system, which pays for roads out of the State Highway Fund.  The state’s constitution supposedly draws a hard line around the state highway fund (which is filled from gas taxes, weight mile fees and vehicle registration charges) to pay for the cost of building and maintaining roads.

But the HB 2098 “-2” amendments would, for the first time, use General Fund money to subsidize road construction.

The Oregon Constitution contains provisions that have been interpreted to limit State Highway Fund revenues to road expenditures only, a key part of the “user pays” system the state has ostensibly had for nearly a century.  Tapping the General Fund would be a massive break from that philosophy, taking money that is used to pay for schools, for health care for the poor, and for social services for the homeless.

Twice as many Washington cars on the bridge as Oregon cars.

On any given day, twice as many Washington residents cross the Columbia River as Oregon residents.  These data are from a license plate survey conducted in 2012 for the Oregon and Washington Departments of Transportation.

Four-fifths of all commuters on the I-5 and I-205 bridges are from Washington State.

The Census Bureau regularly surveys Americans about their commuting patterns.  We have very detailed data on who commutes within the Portland metropolitan area, and these data confirm what everyone already knows:  vastly more Washington residents commute to jobs in Oregon than vice versa.  These data show that 80 percent of all commute trips across the Columbia River are made by Washington residents; only 20 percent are Oregonians commuting to jobs in Washington.

The real reason for expanding the I-5 bridge is to deal with traffic congestion, especially peak-hour traffic generated by Washington residents commuting to and from their jobs in Oregon.  The I-5 bridges are typically not congested in the off-peak direction, because there are far fewer Oregonians driving to jobs in Washington than vice versa.  The highest levels of traffic congestion are southbound in the morning peak hour (Washington residents commuting to jobs in Oregon), and northbound in the afternoon peak hour (Washington residents returning home from their Oregon jobs).  In a very real sense, the purpose of the I-5 bridge expansion is to serve these commuters.  There is no need to expand capacity on the I-5 bridges for Oregon workers, because their commutes are not congested.

Washington Commuters have higher incomes than Oregonians

Peak-hour, drive alone commuters from Washington state to jobs in Oregon have average household incomes of $106,000 according to Census data—about 25 percent higher than for residents of the Oregon side of the Portland metropolitan area.  Clark County’s median household income of $80,500 is higher than for the region ($78,400) and for the City of Portland ($76,200).

Much of the traffic across the river is Washington residents driving to Oregon to avoid Washington State sales tax.  Estimates are that the average Clark County household avoids more than $1,000 in state sales taxes each year by shopping in Oregon.  Collectively, Clark County households avoid $120 million in state sales taxes per year, and this cross-border shopping accounts for 10 to 20 percent of traffic across the I-5 and I-205 Columbia River bridges.
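
The per-household and countywide figures above imply a particular number of avoiding households; a quick consistency check, using only the numbers cited in this post:

```python
# Consistency check on the sales-tax avoidance figures cited above.
# Both inputs come from the text; the household count is what they imply.
avoided_per_household = 1_000        # dollars per year, per the estimate above
countywide_total = 120_000_000       # dollars per year, collectively

implied_households = countywide_total / avoided_per_household
print(f"~{implied_households:,.0f} households")  # ~120,000
```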

A tale of two counties, and two toll bridges

Why do Washington residents get a big taxpayer subsidy from Oregon, and Clackamas County residents get a high toll bill?

There are two toll bridge projects before the Oregon Legislature right now.  One is the I-5 bridge, which as noted above, largely serves Washington residents, and the other is the I-205 Abernethy Bridge and I-205 freeway widening project in Clackamas County.  The I-205 project serves mostly Oregon residents, and most of them live in Clackamas County.  There’s a world of difference between how these two projects are going to be financed.

Oregon is being asked to pay for half the cost of the I-5 bridge, even though 80% of commuters and two-thirds of users are from Washington.  Oregon, of course, will pay for all of the cost of the I-205 projects.  And both projects will be paid for in part with tolls, but the tolling policy of the two projects couldn’t be more different.  The IBR project will ask users to pay only about 20 percent of the total cost of the project (about $1.5 billion of a total $7.5 billion price tag).  Meanwhile, users of the I-205 project will be asked to pay 80 percent or more of the cost through tolls.  The Oregon Department of Transportation estimates that tolling will cost the typical Clackamas County family in the project area about $600 per year in toll payments.

ODOT currently says that the I-5 bridge tolls will be as high as $3.55, while the tolls for using I-205 will be $4.40.

A key part of the reason that the tolls will be lower on I-5 is that Oregon is being asked to chip in $1 billion for the Interstate Bridge Replacement, with the HB 2098 “-2” amendments specifying that the money will come from the General Fund.  So while Clark County commuters are getting a $1 billion subsidy from Oregon for their new bridge—and enjoying lower tolls that cover only 20 percent of the cost of the project—Clackamas County drivers on I-205 will get little or no subsidy from the State, and will bear 80 percent or more of the cost of their new project.

If we’re going to ask Oregon residents, especially those from Clackamas County, to pay tolls covering nearly all of the cost of the new Tualatin River and Willamette River freeway bridges, why are we requiring Oregon taxpayers to pay half the cost of the I-5 bridges?  Put simply:

  • Clackamas County residents will be asked to pay a $4.40 toll to cover the cost of a $1 billion project.
  • Clark County WA residents will be asked to pay a toll of $5.60 (or as IBR claims, $2-3) to cover the cost of a $7.5 billion project.
  • ODOT’s plan will charge relatively much higher tolls to Clackamas County residents for I-205 than it proposes to charge Clark County residents for the I-5 bridges. (I-205 works out to about $4 of toll per billion dollars of project cost; the I-5 IBR, about $1 of toll per billion.)
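
The toll-per-billion comparison in the last bullet is simple division; a minimal sketch, using the toll and cost figures cited in this post (these are the figures as quoted here, not official ODOT rate-setting numbers):

```python
# Rough toll-burden comparison using figures cited above:
# I-205 toll of $4.40 on a ~$1B project, I-5 IBR toll of $5.60 on a ~$7.5B project.
i205_toll, i205_cost_billions = 4.40, 1.0
i5_toll, i5_cost_billions = 5.60, 7.5

i205_ratio = i205_toll / i205_cost_billions   # dollars of toll per $1B of cost
i5_ratio = i5_toll / i5_cost_billions

print(f"I-205: ${i205_ratio:.2f} of toll per $1B of project cost")  # $4.40
print(f"I-5:   ${i5_ratio:.2f} of toll per $1B of project cost")    # $0.75
```

On this measure, Clackamas County drivers pay several times more toll per dollar of project than Clark County drivers would.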

It’s hard to understand why the Oregon Legislature would treat Oregon voters and constituents in Clackamas County so much less generously than it proposes to treat the people of Clark County, Washington.

Subsidizing Sprawl

The effect of building more road capacity to Clark County is essentially to encourage more people to live in Clark County.  And Washington’s land use laws are far less strict than Oregon’s, meaning that much of that growth is car-dependent sprawl.  When we look at the pattern of urban growth over the last couple of decades, it’s apparent that Clark County, Washington has grown substantially through exurban sprawl.  While most new growth on the Oregon side of the Columbia occurred within the urban growth boundary, Clark County’s growth sprawled widely.

Why should Oregon taxpayers subsidize yet another round of exurban housing development in Washington?

Fairness and the User Pays Principle

For nearly a century, Oregon has relied on the “user pays” principle to guide road finance.  In theory, gasoline consumption is roughly proportional to miles driven, so gas taxes apportion to users the costs of the system in direct relation to how much they drive.  Raiding the Oregon General Fund is a dramatic break with that principle, and deserves to be questioned in any event.  But it’s really hard to understand why Oregon taxpayers should take money that could be used to educate children, care for the sick, or address homelessness, and use it to subsidize commuters (and shoppers) from another state.  And it’s doubly hard to understand why we’d do that while asking another group of Oregonians, those living in Clackamas County, to pay for almost the entire cost of fixing another bridge.

As Governor Tina Kotek said, the financing plan for the I-5 bridge shouldn’t unfairly burden low income Oregonians.

. . When the bridge that we have now across the Columbia, the I-5 bridge, that was tolled at some time to create that bridge. I’m going to always be honest with Oregonians. We have to figure out how to pay to maintain and modernize our system of bridges and roads. And the plan right now to pay for the the improvements on the Abernathy bridge on I 205 and to pay for new I-5 bridge is planned on tolling. Now I’m open to other ideas but I think we should be honest, if we need those types of infrastructure. We’re gonna get as much money as we can from the federal government, and we have to have a conversation about how to pay for it locally. My goal is to make sure whatever we do, it does not unfairly burden our lowest income Oregonians who need to be on those roads. We have to figure out how to modernize and maintain our infrastructure.

Governor Kotek interviewed on KOIN-TV February 24, 2023 (Emphasis added)

But that’s exactly what this proposal does.  It keeps tolls low for Washington residents (they cover only 20 percent of the cost of the bridge they use) while it charges high tolls to Oregon residents (who pay 80 percent or more of the cost of their bridge).  And the HB 2098 “-2” amendments propose to take money that is key to helping low income Oregonians (the State General Fund) and use it to subsidize out of state travelers.

It’s worth keeping in mind that the original bridge (built in 1912) and the parallel second span built in 1958 were both paid for entirely with toll revenues. In theory, we have a “user pays” transportation system—although that’s increasingly become a myth, as nationally we’ve bailed out the federal highway trust fund with general revenues to the tune of more than $200 billion, and we grossly subsidize heavy, over-the-road freight trucks that cause vastly more damage to roads, the environment and people.  Tolling is a “user pays” system:  If 80 percent of the peak hour users of the I-5 bridge are Clark County commuters, and we’re expanding the capacity of the bridge to meet their peak hour travel choices, it’s eminently fair and reasonable to ask them to pay for most of the cost of the project.  Washington taxpayers are getting a great deal:  even though they account for roughly twice as much bridge traffic as Oregonians, Oregon is going to pay just as much as they are toward the bridge.


CEVP: Non-existent cost controls for the $7.5 billion IBR project

Oregon DOT has a history of enormous cost overruns, and just told the Oregon and Washington Legislatures that the cost of the I-5 Bridge Replacement Program (IBR) had ballooned 54 percent, to as much as $7.5 billion.

To allay fears of poor management and further cost overruns, IBR officials testified they had completed a “Cost Estimate Validation Process” (CEVP).  They assured legislators they had consulted independent subject matter experts and assessed more than 100 risks.

But when asked for copies of the CEVP under the public records law, agency officials reported that “no records exist” of the CEVP.

And the supposedly “nationally recognized” CEVP process has been around for more than a decade, was judged inadequate and error-filled for the Columbia River Crossing, and failed to detect key cost and schedule risks.

ODOT and WSDOT are more interested in deflecting criticism than in being accountable for—and correcting—runaway project costs.


IBR, December 2022 Legislative Testimony:  “A CEVP was recently completed.”

IBR, January 2023 response to public records request for the CEVP:  “No records exist.”


The Oregon and Washington highway departments are pushing forward with something they call the “Interstate Bridge Replacement Project.”  As we’ve pointed out at City Observatory, this project, which is actually a clone of the Columbia River Crossing that died a decade ago, is really a five-mile-long freeway widening project.  And it’s one whose cost has ballooned to as much as $7.5 billion, according to estimates revealed in December 2022. This is part of a consistent pattern:  the Oregon Department of Transportation has a long string of 100 percent cost overruns on its major projects.  Almost every large project the agency has undertaken in the past 20 years has ended up costing at least double—and sometimes triple—its original cost estimate.

While the agency wants to blame recent construction cost inflation for the increase, that’s simply wrong.  The transportation agencies’ official projections of future construction price inflation show a negligible change from 2020 levels. Higher construction cost inflation accounts for only $300 million of the $2.7 billion cost increase over their 2020 estimate.

Don’t Worry About Cost Overruns, We did a CEVP™!

At the December 12, 2022 meeting of the joint Oregon-Washington I-5 bridge legislative oversight committee, IBR administrators tried to allay concerns about rising project costs by invoking a “Cost Estimate Validation Process,” or CEVP, as a way to diagnose and prevent further cost escalation.

IBR administrator Frank Green testified:

It’s a process that enables us to identify costs . . . we also go through a process where we bring subject matter experts to identify, on a program like this, what are some of the potential risks that we may encounter as we’re moving through development of the program.
. . . as we produce our CEVP report and publish it, it will show the list of risks, well over a hundred, that our team and our partners and our subject matter experts identified. It’s important to understand that we also identified strategies, that we as a team and our partners can take to minimize the potential impact of these risks.

Joint I-5 Committee Meeting, December 12, 2022 

This explanation of the Cost Estimate Validation Process was also posted to the IBR project website (emphasis added):

A Cost Estimate Validation Process (CEVP) was recently completed to provide independent review and validation of project cost and schedule estimates.

A CEVP is an estimation process that analyzes risks specific to the project to quantify the impacts and possible mitigation strategies in seeking to limit the impacts of costs and/or delays. Cost risks identified for the IBR program are primarily tied to possible schedule delays, although market uncertainties, changes during construction, and design modifications can all pose a risk to cost escalation. Some specific risks identified in the CEVP include:

▶ Possible legal challenges of program environmental process

▶ In-water work complexities during bridge construction

▶ Delay in state matching funds

“No Records Exist” of a current CEVP

Intrigued to learn more, City Observatory filed a public records request with WSDOT (one of IBR’s two parent state agencies) asking for copies of the CEVP.  We were told that there were no written or electronic records pertaining to the CEVP, and that none would be available before March of 2023—more than ninety days after the IBR testified to the Legislature that the CEVP was “completed.”  Their official response to our request—“No Records Exist”—is shown here:

At this point, there’s simply no evidence that WSDOT undertook any kind of analysis.  They just gravely intoned the words “CEVP” and assured legislators that this would insulate the project from future costs and risks.  If there’s no documentation and no electronic files, there’s simply nothing to substantiate that any kind of analysis was actually performed.  It’s hard to see how such an insubstantial or poorly documented process will do anything to prevent or manage future cost overruns.

One has to believe that IBR, according to its own testimony, generated (and analyzed) a list of more than 100 risks, and reviewed them with subject matter experts, without creating a single document, electronic file or other public record.

Apparently, just as former President Donald Trump can declassify a document just by thinking about it, WSDOT and ODOT can perform a CEVP without creating a single document or electronic file.  This strongly suggests that the real purpose of a CEVP is to distract legislators, not identify or prevent budget or schedule risks.

Deja Vu All Over Again:  The CEVP has proven a failure at predicting or preventing cost-overruns for this very project

Whether a CEVP actually exists as a tangible object or not is an open question. An equally important question is whether a CEVP, if one existed, would do anything to accurately predict, or prevent further cost escalation and schedule delays.  Unfortunately, the history of CEVP with exactly this project shows it did nothing to forestall mistakes, delays and cost increases.

It’s too bad that none of today’s Oregon legislators were on hand the last time they were discussing a huge and risky bridge over the Columbia River, because “CEVP!” is exactly what ODOT officials claimed would avoid cost overruns, when they were asking for funding for the then $3 billion failed Columbia River Crossing (CRC) project (which has been revived as the IBR).  Twelve years ago, in 2011, ODOT consultant and gubernatorial advisor Patricia McCaig confidently told Oregon legislators that they had a handle on project costs, because of Washington’s CEVP process.

“There is a cost estimating validation process called CEVP from Washington, that is a nationally known model that is applied to the Columbia River Crossing and we will spend as much time as you like to go through that with you.”

Hearing on HJM 22, House Transportation and Economic Development Committee, March 30, 2011

Despite these assurances, the CEVP didn’t head off either delays or cost overruns on the CRC.  An Independent Review Panel for the CRC, appointed by Oregon Governor Ted Kulongoski and Washington Governor Christine Gregoire, found that there was a “significant risk” that the CEVP “was not accurate enough” for financial purposes, and that “the reliability of the final outputs for cost and schedule are seriously suspect.”

And the panel’s warnings proved correct.  Critically, the CEVP prepared for the Columbia River Crossing completely failed to predict the schedule and cost risk from the project’s intentional—and ill-advised—decision to ignore the Coast Guard’s direction about the appropriate height for the bridge.  In 2012, the Coast Guard blocked the project’s record of decision, forcing the project to be re-designed to provide a higher navigation clearance, a change that delayed the project a year and added tens of millions of dollars to its cost.  The CEVP also failed to predict that the original design for the project, the so-called “open-web” design, was unbuildable and had to be scrapped, causing another year-long delay.

Then, as now, the vaunted “CEVP” exists primarily as a fig-leaf and a talking point to insulate the two DOTs from criticism, and deflect attention from their consistent record of enormous cost-overruns.

In addition, an honest “cost estimate validation process” would reveal that the project is taking huge financial risks by failing to advance either a moveable span or an immersed tube tunnel as full options in the environmental review process.  By ignoring the National Environmental Policy Act’s requirements to fully and fairly appraise such alternatives, it is IBR that is adding considerable cost and schedule risk to the project—in a transparent attempt to force adoption of its preferred massive mega-project.


Another flawed Inrix Congestion Cost report

Sigh. Here we are again, another year, and yet another uninformative, and actively misleading congestion cost report from Inrix.

More myth and misdirection from highly numerate charlatans.

Burying the lede:  Traffic congestion is now lower than it was in 2019, and congestion declined twice as much as the decline in vehicle travel.

Today, Inrix released its latest “Global Traffic Scorecard,” which purports to rank US and global cities based on traffic congestion levels.  Over the years, we’ve reviewed Inrix’s annual traffic scorecard reports.  They’re monotonous in their sameness.  Congestion, we’re told, is very bad and very costly.  But little of this is true or, more importantly, actionable.  The estimates of supposed congestion “costs” simply aren’t true because neither Inrix (nor anyone else) has specified how they’d eliminate congestion at a cost less than the supposed dollar value of time lost.  Without a clear idea of how one could go about eliminating these costs, the information simply isn’t actionable.  As we’ve explained in our “Reporter’s Guide to Congestion Cost Studies,” these reports are rife with conceptual and methodological errors.  Today’s Inrix report is still marred by these same problems.

There are a couple of improvements in this report over the rest of the literature. Inrix spends some time on traffic crashes and deaths, and notes the troubling increase in crashes despite the decline in vehicle miles traveled. To their credit, Inrix this year has carefully avoided claiming or implying that expanding highway capacity would somehow reduce congestion.  That claim has been definitively and scientifically debunked:  we know, thanks to the fundamental law of road congestion, that more road capacity will simply induce more car travel, fully offsetting any supposed congestion-busting benefits.  But that won’t stop many Inrix clients, notably state highway departments, from pointing to Inrix data as the reason they should be given tens of billions of dollars to widen roads.  And that’s apparently the real purpose of the Inrix report:  to curry favor with potential highway department clients.

Most of what we’ve said about previous Inrix congestion reports applies with equal force to this one.  We’ll highlight a few points.

First, if you read closely, you’ll learn that time lost to congestion in the US is still lower than it was three years ago, prior to the pandemic.  Inrix reports that congestion time losses were 20 percent lower in 2022 than in 2019:  4.8 billion hours, down from 6 billion hours.  This is good news.

Second, that reduction in congestion should be celebrated, and should also be a teachable moment. If we’re so concerned about congestion, then the experience of the past few years ought to be studied to see if we can learn something.  Right off the top, there’s a really important fact that’s buried in the Inrix report: While congestion declined by 20 percent from 2019, traffic (vehicle miles traveled or VMT) went down by just 9 percent.

The fact that congestion declined more than twice as much as VMT is a critical observation:  It means that demand management can reduce congestion, and that modest changes in travel volumes produce disproportionately large improvements in transportation system function.  If instead of managing demand with a pandemic and lockdowns, we did something a little more nuanced, like road pricing, we could achieve real and lasting congestion reductions.  That’s exactly the sort of actionable information that ought to be in this report, but which is missing.
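
The “disproportionately large” point above can be expressed as a crude elasticity; a minimal sketch, using only the Inrix figures quoted here:

```python
# Back-of-envelope elasticity of congestion delay with respect to traffic,
# using the 2019-to-2022 changes reported by Inrix (cited above).
congestion_decline = 0.20   # delay fell ~20% (6.0B hours -> 4.8B hours)
vmt_decline = 0.09          # vehicle miles traveled fell ~9%

elasticity = congestion_decline / vmt_decline
print(f"Each 1% drop in VMT cut delay by roughly {elasticity:.1f}%")  # ~2.2%
```

The disproportion reflects the familiar nonlinearity of congestion: once a road nears capacity, delay rises (and falls) much faster than traffic volumes do, which is why modest demand management can pay outsized dividends.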

Third, there are a whole bunch of other important things that are missing as well.  If you search through the latest Inrix report, here are some words you simply won’t find:  “sprawl,” “pollution,” “emissions,” “carbon,” “climate,” “induced demand,” “pricing,” and “tolling.” Trying to talk about urban transportation systems without considering their effects on these other pressing problems is a measure of how detached the report is from the reality of the 21st century.  Transportation is the leading source of greenhouse gas emissions in the US, and these emissions are increasing. The Inrix report exists solely to feed an overriding obsession with speed and congestion as the criterion for setting transportation policy.

Fourth, in reality the city rankings are meaningless.  The measure Inrix uses totally ignores the differences in travel distances among metro areas.  The fact that you have to drive twice as far, on average, in Houston or Atlanta as you do in Chicago or Boston doesn’t figure into the “cost” of commuting.  As we’ve shown, this particular measure inaccurately penalizes compact cities where people make shorter trips, because it looks only at the difference between peak and non-peak travel times.  Cities with shorter travel distances generate less car travel (vehicle miles traveled), emit much less greenhouse gas, and save their residents billions of dollars in avoided travel costs compared to sprawling, car-centric metro areas.  The best way to reduce the cost of transportation, and time lost, is to have more compact development, something we’ve demonstrated in our previous analysis.  And while the Inrix report spends a lot of time talking about the added burden of high gas costs, it completely leaves out the fact that higher gas prices are much more burdensome in cities and neighborhoods where people have to drive long distances.

Fifth, the Inrix rankings are a profoundly car-centric view of the world. Inrix likes to tout its “big data,” noting that its estimates are drawn from billions of data points.  But those data points are almost entirely cars and trucks.  There’s an old saying:  “if you don’t count it, it doesn’t count.” Inrix leavens its reporting with a handful of statistics on bikes and pedestrians, but these are drawn from the rare reports compiled by cities, not from Inrix data. The bike and pedestrian data, and the actual variation in commuting distances, simply don’t figure into the Inrix rankings.  In short, if you don’t travel by car, you really don’t count in the Inrix rankings.

Sixth, there’s no evidence that driving faster makes us happier.  Inrix and other congestion reports prey on our sense of annoyance and victimization about traffic congestion.  It’s all these other people who are slowing us down, and we’d be better off if they were gone and we could drive faster.  But cities that are optimized for speed simply sprawl further and require more driving, making us more car dependent and costing us more money.

Finally, it’s truly disappointing that such a rich and detailed source of information should be used largely for car-based propaganda.  Reports like these aren’t really designed to help diagnose or solve problems, but simply to generate heat.  They’ll be used in predictably misleading ways by road-widening advocates.  More or bigger data doesn’t help us solve our problems when it’s filtered through this incomplete and biased framework.

Our reviews of previous Inrix Scorecards

In 2018, we lampooned the predictable alarmist tone of the congestion report:

Cue the extreme telephoto shots of freeways!

Wallow in the pity of commuters stuck in traffic because of all those other people!

Wail that congestion is getting worse and worse!

We noted that the 2017 Inrix report adopted a new and more expansive definition of congestion costs which further inflated its estimates.

Older studies, like TTI’s, estimated dollar costs based on the additional time spent on a trip due to congestion:  so if a trip that took ten minutes in un-congested traffic took a total of 15 minutes in a congested period, they would monetize the value of the five minutes of additional time spent. The Inrix report appears to monetize the total value of time spent in congested conditions, i.e., anytime travel speeds fell below 65 percent of free-flow speeds.

In 2016, we gave the Inrix report card a “D” 

In 2015, we pointed out that the Inrix study had a number of contradictory conclusions, and that Inrix had “disappeared” much of its earlier data showing that high gas prices had demonstrably reduced traffic congestion in US cities.

For more information and analysis about the conceptual and methodological problems in these “congestion cost studies,” see our Reporter’s Guide.


It looks like the Interstate Bridge Replacement could cost $9 billion

Just 13 months after raising the price of the Interstate Bridge Replacement (IBR) project by more than 50 percent, the state DOTs say it will cost even more.

We estimate project costs are likely to increase 20 percent or more, which would drive the price tag to as much as $9 billion, almost double the 2020 estimate.

While the DOTs blame “inflation,” their own estimates show construction cost disinflation, with expected increases of no more than 3.5 percent per year for the rest of the decade.

The likely increase in costs will more than wipe out the $600 million in federal funds awarded to the project in December.  The cost of the IBR is increasing faster than the DOTs can find money to pay for it.

The “Cost Estimate Validation Process” (CEVP) that the state DOTs implied would forestall future cost increases utterly failed.

The use of lowballed construction cost estimates to sell highway megaprojects is part of a consistent pattern of “strategic misrepresentation.”  It’s the old bait-and-switch:  get the customer to commit to buying something with a falsely low price, and then raise the price later, when it’s too late to do anything about it.

There will likely be future cost increases:  there are huge and unresolved risks to the project’s actual cost, and the DOTs haven’t even turned a shovel of dirt yet.  The real price increases will likely come after construction starts.

ODOT has a consistent track record of lowballing pre-construction cost estimates, and recording huge cost overruns, with the average price of a major project doubling between pre-construction estimates and final costs.

Just 13 months ago, with great fanfare, the Interstate Bridge Replacement Project released a definitive new cost estimate for replacing the I-5 bridges.  Costs jumped by 54 percent from earlier estimates, from $4.8 billion to as much as $7.5 billion.

Now IBR leaders are signaling the project will be even more expensive.  Oregon Public Broadcasting reports:

Planners for the effort to replace the aging span revealed Wednesday that it is going to be more expensive than previously thought. Program leader Greg Johnson didn’t put a number on the growing price tag, but he said the replacement project is falling victim to a “continuing creep of costs.”

How big a cost increase?  Likely a $9 billion project

IBR officials are being purposely vague about the cost increase, but given the wide range of their previous cost estimate—anything from $5 billion to $7.5 billion—the increase would have to be significant to lie outside this window.  At a minimum, we should probably expect an increase of 20 percent, with the costs increasing from a minimum of $6 billion to a maximum of $9 billion (or more).  Any smaller increase in costs would not significantly move the project out of the current range.  It seems entirely possible that the increase could be more than 20 percent.
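
The $9 billion figure is just the December 2022 range scaled up by the assumed 20 percent increase; a minimal sketch of that arithmetic (our extrapolation, not an official IBR estimate):

```python
# Scaling the December 2022 cost range ($5B to $7.5B) by an assumed
# 20 percent increase -- an extrapolation, not an official IBR figure.
low_2022, high_2022 = 5.0, 7.5   # $ billions, December 2022 range
assumed_increase = 0.20          # assumed minimum increase (see text above)

low_new = low_2022 * (1 + assumed_increase)
high_new = high_2022 * (1 + assumed_increase)
print(f"Implied range: ${low_new:.1f}B to ${high_new:.1f}B")  # $6.0B to $9.0B
```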

While we’re generally reluctant to speculate on such matters, our earlier prediction of the increase in IBR costs was almost exactly correct.  As Willamette Week reported, our City Observatory prediction—made in May 2022, seven months before the IBR estimates were released—that the cost of the IBR would balloon to between $5 and $7 billion was spot on, and slightly conservative.

Falsely Blaming Inflation

As they did a year ago, the DOTs are painting themselves as victims of inflation.

“One of the things all mega projects are experiencing is this inflation we’ve seen in the construction industry,” Johnson said. “We are going to be reissuing an overall program estimate probably later this summer.”

The trouble is, all of their earlier estimates—including those in 2020 and 2022—already allowed for inflation.  And more to the point, highway construction cost inflation, which did spike briefly during the pandemic, has subsided to historically typical levels, according to the official revenue forecast of the Oregon Department of Transportation. Here’s ODOT’s prediction of future capital cost inflation from its October 2023 forecast:

Construction Cost Inflation is back to historic trend

From 2023 through 2031, ODOT expects that construction cost inflation will be about 3 percent per year—no higher than its long run historic average.

That represents almost no increase over the inflation that IBR officials said they had used in constructing their earlier forecasts of the IBR cost.  Keep in mind that cost estimates are made in “year-of-expenditure” dollars and, according to their testimony to the Oregon Legislature, their model assumed the same construction time frame as the earlier estimates.  In January of 2021, the IBR team described the methodology they used to construct their estimates and predicted construction cost inflation of 2.2 percent to 2.3 percent per year after 2020:

As with the construction cost inflation factor, the program team used WSDOT’s Capital Development and Management (CPDM) historical and forecast cost indices for Preliminary Engineering (PE), Right-of-Way (RW) acquisition, and Construction activities (CN), using third-party data sources and statewide experience. The values used to escalate fiscal year (FY) 2012 dollars to FY 2020 are based on these indices by the three expenditure types, which include historical data through FY 2019. The overall effect of the three historical cost indices that were used to inflate from FY 2012 to FY 2020 equates to an average annual inflation rate from 2.0% to 2.2%, depending on which capital cost option is selected. Projected inflation rates by year beyond FY 2020 vary, averaging between 2.2% and 2.3% when applied to the expenditure schedules for the capital cost options.

By not showing their work, and not describing exactly how their inflation estimates changed between their 2020 project cost estimate and their December 2022 cost estimate, the IBR is exaggerating the importance of inflation, and downplaying its inability to accurately calculate future costs.  It’s easy to blame inflation, but if a changed inflation outlook is really the cause of the cost increase, they should use their own agencies’ official estimates to show exactly how much the change in inflation affects the project’s cost: they haven’t.

The failure of “CEVP” to prevent further cost increases

At the time it presented its last set of cost estimates, IBR officials responded to legislative concern about cost increases by claiming that they had a sophisticated risk analysis tool to accurately predict future costs.  That tool, called the “Cost Estimate Validation Process” was presented as a kind of “magic wand” to avoid future increases. 

IBR administrator Frank Green assured the Oregon and Washington Legislators that the CEVP would help them manage costs:

It’s a process that enables us to identify costs . . . we also go through a process where we bring subject matter experts to identify, on a program like this, what are some of the potential risks that we may encounter as we’re moving through development of the program.
. . . as we produce our CEVP report and publish it, it will show the list of risks, well over a hundred, that our team and our partners and our subject matter experts identified. It’s important to understand that we also identified strategies, that we as a team and our partners can take to minimize the potential impact of these risks.

Joint I-5 Committee Meeting, December 12, 2022 

As we pointed out a year ago, the IBR actually did not present the results of the “CEVP” when it released its new cost estimates, and claimed, in response to a public records request, that it had “no records” of having conducted a CEVP.

In reality, the CEVP doesn’t so much prevent cost increases as simply document, after the fact, why they occurred.  The next iteration of the CEVP will show how IBR officials made bad assumptions about design, schedule, environmental factors (like the in-water-work-window) that drove up costs or blew up the schedule.  In theory, the CEVP should anticipate these “risks”—in reality, it does nothing to prevent systematically bad or mistaken assumptions about project cost drivers.

ODOT’s Reign of Error:  Consistent Cost Overruns

For anyone who has followed ODOT cost estimates, this latest round of further cost increases comes as no surprise.  ODOT has consistently and badly under-estimated the ultimate cost of virtually every single one of its major highway construction projects.  As we’ve reported at City Observatory, ODOT’s cost estimates are a series of “exploding whales“—the Oregon Department of Transportation has a long string of 100 percent cost-overruns on its major projects.  Almost every large project the agency has undertaken in the past 20 years has ended up costing at least double–and sometimes triple–its original cost estimate, with the average large ODOT highway project seeing a 100 percent cost escalation between the time it is approved and its ultimate completion.  The likely $9 billion maximum cost of the IBR project, up from the $4.8 billion “maximum” estimated by IBR in 2020, would put the IBR right in the middle of this cost-doubling pattern.

As public finance scholar Bent Flyvbjerg has documented, these consistent errors are no accident:  they are a conscious, institutionalized practice of using low-balled initial cost estimates to secure support for a project, coupled with a strategy of revealing true costs only once the project is committed or under construction.

Cost Overruns Matter

The ever-increasing cost of the Interstate Bridge Project is problematic for many reasons.  First, the agency hasn’t fully identified (much less obtained) the funding needed for the current $7.5 billion cost estimate.  Oregon and Washington taxpayers will be on the hook for these amounts, and every cost increase raises the amount they have to contribute.  In effect, even a 10 percent increase in costs (and it’s likely to be double that, or more) would more than wipe out the value of the much-ballyhooed $600 million grant awarded to the project in December, 2023.  In a real sense, project costs are escalating faster than ODOT and WSDOT can find new revenue.
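A back-of-the-envelope check makes the point concrete (the $7.5 billion estimate and $600 million grant are figures from the text; the rest is arithmetic):

```python
# A 10% increase on the current estimate vs. the December 2023 grant.
current_estimate = 7.5e9   # current maximum cost estimate, dollars
grant = 600e6              # federal grant awarded December 2023

increase_10pct = 0.10 * current_estimate
shortfall = increase_10pct - grant
print(f"10% cost increase: ${increase_10pct/1e6:.0f} million")
print(f"Exceeds the grant by: ${shortfall/1e6:.0f} million")
```

A 10 percent increase comes to $750 million, $150 million more than the entire grant.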

There’s a second problem:  Rising costs could also invalidate the existing (and future) awards of federal funds.  As we’ve noted, federal law requires that highway projects be “cost-effective” in order to qualify for federal funds.  Cost-effectiveness is judged by a benefit-cost analysis.  In simple terms, if benefits don’t exceed costs, a project isn’t eligible for federal highway funds.  The current benefit cost analysis is already full of errors and suspect assumptions that inflate benefits, and was prepared by an IBR contractor with a clear (but undisclosed) conflict-of-interest.  If the new, higher level of costs were factored into the benefit-cost analysis, the project would be even shakier–and likely ineligible for federal funds because it isn’t cost effective.

Third, rising costs may force the project to try to extract more money from tolls.  If the project raises tolls, it will likely increase traffic diversion to I-205, with adverse effects on congestion and pollution.  The financial need for higher toll revenues may also undercut the viability of proposed toll-discounts for low income commuters.

Blame inflation now: Lying about the latest IBR Cost Overrun

The price of the I-5 “bridge replacement” project just increased by more than 50 percent, from $4.8 billion to $7.5 billion

ODOT and WSDOT are blaming “higher inflation” for IBR cost overruns

As we’ve noted, the Oregon Department of Transportation has a long string of 100 percent cost-overruns on its major projects.  Almost every large project the agency has undertaken in the past 20 years has ended up costing at least double–and sometimes triple–its original cost estimate.

The data don’t support their claim–their own agencies’ official projections of future construction price inflation show a negligible change from 2020 levels.

Higher construction cost inflation accounts for only $300 million of a $2.7 billion cost increase.

The cost estimate for the I-5 bridges just jumped by 54%, from $4.8 billion to as much as $7.5 billion.  The principal culprit according to the Oregon and Washington highway departments is “higher inflation.”

Project director Greg Johnson lamented to the Portland Tribune:

“Nothing gets cheaper as time goes on. Construction projects across the country are experiencing unprecedented cost increases due to supply chain issues and increasing material and labor costs as well as other factors, and our program is no exception,” Johnson said.

But the project’s earlier projections fully anticipated that there would be inflation—it was no surprise.  The only question is whether the recent spate of construction cost increases somehow account for a greater than 50 percent increase in the total cost of the project in just the three years since its latest “inflation-adjusted” estimate.

The claim that the increase is due to inflation is not borne out by either WSDOT or ODOT’s current official forecasts of future construction cost inflation.  Both Oregon and Washington prepare such forecasts.  The Oregon forecast recognizes a short-term spike in construction costs, but expects construction inflation to settle down to historic levels.  This from their October 2022 forecast:


From 2023 through 2031, ODOT expects that construction cost inflation will be about 3 percent per year.

Similarly, Washington’s latest highway construction cost index calls for construction costs to increase in the 2-4 percent range from now through 2030.  WSDOT data show the same spike in 2021, but expect prices to actually decline in 2023, and then stabilize at a little more than two percent per year through the remainder of the decade.

From 2020 through 2030, WSDOT forecasts construction cost inflation of 2.4 percent per year (including the 10 percent increase in 2022).

That represents almost no increase over the inflation that IBR officials said they had used in constructing their earlier forecasts of the IBR cost.  Keep in mind that cost estimates are made in “year-of-expenditure” dollars and, according to their testimony to the Oregon Legislature, their model assumed the same construction time frame as the earlier estimates.  In January of 2021, the IBR team described the methodology they used to construct their estimates and predicted construction cost inflation of 2.2 percent to 2.3 percent per year after 2020:

As with the construction cost inflation factor, the program team used WSDOT’s Capital Development and Management (CPDM) historical and forecast cost indices for Preliminary Engineering (PE), Right-of-Way (RW) acquisition, and Construction activities (CN), using third-party data sources and statewide experience. The values used to escalate fiscal year (FY) 2012 dollars to FY 2020 are based on these indices by the three expenditure types, which include historical data through FY 2019. The overall effect of the three historical cost indices that were used to inflate from FY 2012 to FY 2020 equates to an average annual inflation rate from 2.0% to 2.2%, depending on which capital cost option is selected. Projected inflation rates by year beyond FY 2020 vary, averaging between 2.2% and 2.3% when applied to the expenditure schedules for the capital cost options.

The critical factor here is the increase in expected inflation over the next decade or so between the project’s 2020 estimate and its new estimate.  In 2020, they said the price estimate was based on an expected inflation rate of 2.2 to 2.3 percent.  According to Washington’s official forecast the rate is now expected to be 2.4 percent per year through 2030; and for Oregon, the rate is predicted to be about 3 percent per year through 2031.  This relatively low rate of inflation would do little to raise project costs. Over the next 10 years, 3 percent inflation per year rather than 2.2 percent inflation per year, would be expected to increase a $4.8 billion construction budget by about $300 million.  This hardly accounts for the increase in maximum construction cost to $7.5 billion.
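The order of magnitude of that $300 million figure can be checked with a simplified model. Assume, purely for illustration, that the $4.8 billion budget is spent in equal annual increments over ten years, with each year's spending escalated at the assumed inflation rate (this even-spending profile is our assumption, not the agencies' actual expenditure schedule):

```python
# Simplified model: extra cost from assuming 3.0% rather than 2.2%
# annual construction inflation on a $4.8B budget spent evenly over
# 10 years.  The even-spending profile is an illustrative assumption;
# the agencies' actual expenditure schedules differ.
BUDGET = 4.8e9
YEARS = 10

def escalated_cost(rate: float) -> float:
    """Year-of-expenditure cost if BUDGET/YEARS is spent each year,
    escalated at `rate` from the base year to the year it is spent."""
    annual = BUDGET / YEARS
    return sum(annual * (1 + rate) ** year for year in range(1, YEARS + 1))

extra = escalated_cost(0.030) - escalated_cost(0.022)
print(f"Added cost from higher inflation: ${extra/1e6:.0f} million")
```

Under these assumptions the difference comes to a couple hundred million dollars, the same order of magnitude as the roughly $300 million cited above, and nowhere close to the $2.7 billion increase.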

By not showing their work, and not describing exactly how their inflation estimates changed between their 2020 project cost estimate and their current 2022 cost estimate, the IBR is exaggerating the importance of inflation, and downplaying its inability to accurately calculate future costs.  It’s easy to blame inflation, but if a changed inflation outlook is really the cause of the cost increase, they should use their own agencies’ official estimates to show exactly how much the change in inflation affects the project’s cost: they haven’t.

IBR officials presented a scary looking, but largely irrelevant chart showing the fluctuation of prices of a number of building materials.  Never mind that at least three of these categories–gypsum, lumber and aluminum–have almost no relevance for bridge construction projects.

Misleading and irrelevant cost indices presented by ODOT.

Why won’t ODOT tell us how wide their freeway is?

After more than three years of public debate, ODOT still won’t tell anyone how wide a freeway they’re planning to build at the Rose Quarter

ODOT’s plans appear to provide for a 160-foot wide roadway, wide enough to accommodate a ten lane freeway, not just  two additional “auxiliary” lanes

ODOT is trying to avoid NEPA, by building a wide roadway now, and then re-striping it for more lanes after it is built

The agency has utterly failed to examine the traffic, pollution and safety effects of the ten-lane roadway they’ll actually build.

The proposed $1.45 billion I-5 Rose Quarter Freeway Project is all about building a wider freeway.  But there’s one question that’s left unanswered in  all of the project’s hundreds of pages of p.r. materials and reports:  How wide a roadway are they actually going to build?

As we’ve repeatedly pointed out, OregonDOT has gone to great lengths to say that they are merely adding “two ‘auxiliary’ lanes” to the existing I-5 freeway.  But they’ve never released clearly labeled, accurately scaled plans that show the actual width of the roadway they’re proposing.  The current roadway has two “through” lanes in each direction as it crosses under NE Weidler Street.  ODOT claims that they’re just adding two more “auxiliary lanes,” but in reality, they’re building a roadway that could accommodate 10 travel lanes (in addition to lengthy on- and off-ramps for freeway traffic).

That matters, because it’s a few hours’ work with a highway paint machine to re-stripe a roadway to get an added lane or two.  And because ODOT’s traffic modeling and environmental analyses are based on the assumption that there will only be two additional lanes, the Supplemental Environmental Assessment doesn’t reveal the true traffic, livability or environmental effects of a likely ten lane roadway.  (ODOT is looking to exploit a loophole in FHWA environmental regulations—which themselves likely violate NEPA—that allows a road to be re-striped without triggering a further environmental assessment).

At City Observatory, we’ve been following plans by the Oregon Department of Transportation to spend upwards of $1.45 billion widening this mile and a half long stretch of Interstate 5 opposite downtown Portland in the city’s Rose Quarter.  As we’ve noted, the agency has gone to great pains to deny that it’s actually widening the freeway at all, engaging in a tortured, misleading and at times absurdist effort.

For more than three years, we’ve challenged ODOT to reveal the actual width of the project they were proposing to build.  The agency’s 2019 Environmental Assessment (which, by law, is supposed to be a full disclosure of the project’s impacts on the surrounding area) contained just a single crude illustration of a cross-section of the project’s right-of-way.  Using that diagram, we deduced that the freeway was planned to be at least 126 feet wide–enough, not just for adding a mere two lanes to I-5’s existing four, but actually wide enough for eight full travel lanes plus standard urban shoulders.

But that actually understates the true size of the project.  City Observatory later obtained unreleased documents prepared by ODOT and its contractors showing that the agency planned to build a 160 foot wide roadway through the Rose Quarter–easily enough for ten highway lanes.  (We’ve provided a blow-by-blow description of our efforts to pry these secrets from recalcitrant ODOT staff, and copies of the documents we obtained, below).

Still Hiding Freeway Width

In late November, ODOT released its Supplemental Environmental Analysis (SEA) for the Rose Quarter.  It continues ODOT’s strategy of deception and obfuscation about the width of the roadway they are planning to build.  Just as in the 2019 Environmental Assessment, they’ve published a “not-to-scale” drawing of a cross section of the freeway that entirely omits key measurements (while selectively labeling just a few features).

This illustration is plainly deceptive.  The drawing is not to scale, by its own admission.  It appears that there are only 3 northbound and 3 southbound travel lanes (the two central parts of the covered section).  But the actual width of these portions of the project is never disclosed.  By the project’s own admission, each of these spans may be 80 feet (or more), which is easily enough room for five traffic lanes in each direction, with ample provision for shoulders (five travel lanes would occupy only 60 feet of an 80 foot wide covered area).  According to the Supplemental Environmental Assessment, the northernmost third of the freeway cover has spans in excess of 80 feet in length (Figure 2.7, page 19).

Massively wide: 160 to 200 feet of roadway

So how wide is the freeway, really?  ODOT isn’t saying directly, but we can get a good idea by looking at another poorly labeled (but scaled) drawing included in the project’s right of way report.  The diagram (Figure 4 on page 12) shows the existing streets (the grid running North-South and East-West) and the proposed widened I-5 freeway, running diagonally through the Rose Quarter from Northwest to Southeast.  The individual lanes of the freeway are indicated.  At this size the diagram is hard to read or measure, so we’ve zoomed in and added a scale (from the original drawing).

This section shows the portion of the freeway as it crosses under the NE Weidler Street overpass.  Here the freeway is divided into three parts: from west to east, a two-lane southbound off-ramp from I-5, an eight-lane main-line section of freeway, and a two-lane northbound off-ramp.  Including all the lengthy ramps, the footprint of this freeway is 12 lanes wide.

Again, these lane markings aren’t definitive.  Let’s look at the actual width of the roadway.  We’ve added a 200 foot scale at three points along the freeway.  It’s evident that the freeway is more than 200 feet wide near North Hancock Street (the northernmost scale).  It is nearly 200 feet wide at NE Broadway (the middle scale), and slightly less than 200 feet wide just south of NE Weidler (the southernmost scale).  This width is more than enough to accommodate ten travel lanes, as well as the freeway’s proposed on- and off-ramps.

Violating the National Environmental Policy Act

The purpose of an environmental assessment is to disclose the likely effects of a proposed action, in this case, how a wider freeway will affect the community and the environment.  By concealing the actual physical width of the structure they intend to build, the Oregon Department of Transportation is making it impossible for the public to accurately understand the effects of the project, or gauge the truthfulness of claims made by ODOT that it will only add two “auxiliary” lanes of traffic.  ODOT is in violation of NEPA.  It needs to produce a fully detailed, accurately scaled set of plans showing the actual width of the roadway and the location of all structures.  With that in hand, the public can then gauge the actual size of this proposed freeway widening, and know whether it can trust ODOT’s claims about its impacts.

A short history of ODOT’s Deceptions

We raised this issue at City Observatory, and it was also included in official comments in response to the EIS (March 2019), and in formal testimony to the Oregon Transportation Commission (April 2019).  In response, ODOT said nothing.

In November 2020, the Oregon Department of Transportation and the Federal Highway Administration published a “Finding of No Significant Environmental Impact” or, as it’s known in the trade, a FONSI, essentially denying that the project had any environmental effects worth worrying about.  That document, and related supporting materials, still failed to answer the basic question about the width of the freeway.

So, on December 1, 2020, I appeared (virtually) before the Oregon Transportation Commission, and again asked them to answer this very basic question (as well as several others).  Members of the Commission directed their staff to meet with me, which we did, again virtually, on December 16, 2020.

The December 2020 “meeting” was an extremely stilted, and one-sided conversation because the ODOT staff in attendance (nine in total), declined to answer any questions during the meeting. Instead, they simply took notes, and said they would respond, later, in writing.

On January 14, ODOT sent their response.  Here is their answer to the question about the width of the freeway.

As you can see, there’s not a single number present.  This, for the record, is an agency that has spent several years, and tens of millions of dollars, planning and designing this project, and yet wouldn’t answer this basic question.  And just for clarity about the level of detail of those planning efforts, the agency said with some certainty that it would need to take a couple hundred square feet of one hotel parking lot (the area of one good sized bedroom or one smallish living room) as part of the freeway right of way.

So, how wide is it?

In a separate e-mail to me, ODOT’s Brendan Finn, head of the Office of Urban Mobility that supervises the project, said:

“Regarding the “width of the built right-of-way of the Rose Quarter project, . . . I believe you received a response to the width of the Rose Quarter Project, it being within the EA document.”
(Finn to Cortright, February 12, 2021)

In an email to Willamette Week reporter Rachel Monahan, on January 22, one of ODOT’s public affairs persons said:

“Yes, the right of way as stated in the Environmental Assessment is 126 feet.

For your reference, Figure 2-4, located on page 10 within the Project Description of the February 2019 Environmental Assessment, available at https://www.i5rosequarter.org/library/, illustrates the proposed lane configuration which includes an inside and outside shoulder, two through lanes, and one auxiliary lane for the highway in each direction. All shoulders and lanes are 12 feet wide. The anticipated right of way would also provide the opportunity for bus on shoulder use and the space needed for fire, life, and safety requirements and provisions under the highway covers.”

None of this, of course, was actually true.  City Observatory obtained three different sets of documents prepared by ODOT contractors showing the actual width of the roadway to be approximately 150 to 160 feet.  As early as 2016, the project’s contractors drew up plans for a 160 foot roadway–something that was never disclosed publicly by ODOT, but which we obtained via a Federal Freedom of Information Act request.  One of the project’s consultants drew up a landscape plan for the freeway covers, clearly showing a 150-plus foot roadway (the contractor deleted this image from her website after we published this at City Observatory).  Finally, CAD drawings prepared by the project, obtained by public records request, show a 160 foot wide roadway.

What this really means is that the I-5 Rose Quarter project is easily large enough to include a ten-lane freeway.  Here, we’ve adjusted the diagram contained in the original ODOT Environmental Assessment to accurately reflect the number of travel lanes that could be accommodated in a 160 foot roadway.  This illustration contains generous inside and outside shoulders, as well as full 12-foot travel lanes.  (Ironically, ODOT’s own design for the southern portion of the Rose Quarter project calls for 11-foot travel lanes on the viaduct section of I-5 near the Burnside Bridge).
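The lane arithmetic behind the ten-lane claim is simple; here's a sketch (the 160-foot width comes from the documents described above; the 40-foot shoulder allowance is our assumption for a generous design):

```python
# How many 12-foot travel lanes fit in a 160-foot roadway,
# after setting aside generous shoulders?
ROADWAY_WIDTH = 160       # feet, per ODOT contractor documents
LANE_WIDTH = 12           # standard travel lane, feet
SHOULDER_ALLOWANCE = 40   # assumed: inside + outside shoulders, both directions

lanes = (ROADWAY_WIDTH - SHOULDER_ALLOWANCE) // LANE_WIDTH
print(f"Travel lanes that fit: {lanes}")
```

Ten 12-foot lanes plus 40 feet of shoulders exactly fills the 160-foot cross-section shown in the contractor documents.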

 

ODOT doesn’t care about covers, again

ODOT’s Supplemental Environmental Analysis shows it has no plans for doing anything on its vaunted freeway covers

It left the description of the covers’ post-construction use as “XXX facilities” in the final, official Supplemental Environmental Assessment

The report makes it clear that “restorative justice” is still just a vapid slogan at the Oregon Department of Transportation.

In theory, the Oregon Department of Transportation is proposing to spend $1.45 billion on freeway covers to somehow repair the damage it did when highways it built largely destroyed the Albina neighborhood in the 1950s, 1960s and 1970s.

ODOT has invested considerable resources in creating the fiction that highway covers will be the ideal environment for new development.  Never mind that the agency isn’t planning to contribute a dime toward building anything on said covers, even though its highways directly destroyed hundreds of neighborhood homes, which it never replaced.

It should be clear to anyone watching that talk of developing the covers is purely a woke-washing ploy:  The agency’s real agenda is a wider highway.  Last year, it sent a typo-ridden mailer to thousands of North and Northeast Portland households featuring a purely fictional “Workforce Development Center” built by African-American Artisans–which doesn’t exist and isn’t a part of the project at all.  Other planning documents have illustrated imaginary housing that might be built (if somebody other than ODOT pays for it). There’s abundant evidence that, beyond fictional illustrations, OregonDOT doesn’t really care about the covers or what happens on them.  It’s designed a roadway so wide that on most of the covers, it will be impossible to build anything other than a “lightweight” building, no more than three stories tall.  And, as noted, somebody else will have to pay for those buildings.

Mythical, multi-story buildings to be built by someone, not us (ODOT, 2019 Rose Quarter EA).

The latest bit of evidence of ODOT’s profound indifference is in its recently published “Supplemental Environmental Assessment.”  Turn to the “Right of Way” report that is one of the project’s attachments. This is an extremely detailed document which lists every square foot of property that will be acquired for the project (or which will have even a temporary easement associated with construction).  At the very end of the document (page 26 of its 28 pages), ODOT speaks to what will happen on those very expensive covers it develops.

This public review document contains a highlighted section, which somebody forgot to finish editing, explaining what ODOT would do “as an interim measure” when the project is completed.  Whatever these “xxx facilities” are, we can only guess, but it’s apparent that even after years of touting the covers, ODOT has no idea, and certainly no plans to do anything meaningful on the highway covers.  Keep in mind:  This is the official Supplemental Environmental Assessment, not some working draft.

Image of I-5 Rose Quarter SEA Right of Way Report: Yellow-highlighted “xxx facilities” in original.

 

The preceding paragraph of the section quoted above makes it clear that ODOT has no intention to develop this property, and it is not going to be a picnic for anyone else, either.  ODOT would continue to own the cover, and would insist on some vaguely described air rights and lease agreements.  It also makes it clear that some additional regulatory processes, including further review under the National Environmental Policy Act would likely apply as well.  Developing this property will be vastly more expensive and complex than developing property elsewhere in the neighborhood.

In short, ODOT has no plans to construct covers that will support significant buildings, no plans for any meaningful use of the covers after the highway is complete, and no funding for it (or anyone else) to develop anything on the highway covers.  And if somebody else does have an idea, they’ll have to pursue it with their own money, and they’d better bring lots of lawyers, because it’s not going to be easy.  In the meantime, Albina, enjoy your “XXX facilities”—we’re sure they’ll be special.

ODOT: Our I-5 Rose Quarter safety project will increase crashes

A newly revealed ODOT report shows the redesign of the I-5 Rose Quarter project will:

  • create a dangerous hairpin turn on the I-5 southbound off-ramp
  • increase crashes 13 percent
  • violate the agency’s own highway design standards
  • result in trucks turning into adjacent lanes and forcing cars onto highway shoulders
  • necessitate a 1,000 foot long “storage area” to handle cars exiting the freeway
  • require even wider, more expensive freeway covers that will be less buildable

A project that ODOT has falsely billed as a “safety” project—based on a high number of fender benders—actually stands to create a truly dangerous new freeway off-ramp, and at the same time vastly increase the cost of the project, while making it harder to build on the project’s much ballyhooed freeway covers.

Earlier, we revealed that the redesign of Oregon DOT’s proposed $1.45 billion Rose Quarter freeway widening project will create a hazardous new hairpin off-ramp from Interstate 5, endangering cyclists.

The safety analysis for the project’s Supplemental Environmental Impact Statement confirms our concerns that ODOT is building a “Deadman’s Curve” off-ramp:  The agency estimates the new ramp will increase crashes 13 percent compared to the No-build, and that the design of the off-ramp violates ODOT’s own Highway Design Manual.

As part of its redesign of the I-5 Rose Quarter Freeway project, ODOT has moved the southbound off-ramp from I-5, which is now located just north of NE Broadway, to an area just next to the Moda Center, and immediately north of the existing I-5 south on-ramp.  The new ramp fits awkwardly into the existing street grid, and the most troublesome feature is a hairpin turn for traffic exiting the freeway:  I-5 traffic traveling southbound and leaving the freeway has to make a tight 210-degree turn onto northbound Williams Avenue.  The proposed off-ramp would have two lanes of freeway traffic negotiating the hairpin turn onto N. Williams Avenue (shown as green arrows in this diagram).

Just a week ago we wrote a scathing critique of the Oregon Department of Transportation’s proposed redesign of the I-5 Rose Quarter project.  The agency is building a new and dangerous off-ramp that creates a hairpin turn on a freeway exit, funnels traffic across a major bike route, and causes longer travel on local streets.  That’s pretty bad.

But the reality is much worse.  Don’t take our word for it.  Take ODOT’s.  Though it’s shrouded in intentionally opaque bureaucratic language, it’s clear that the engineers at OregonDOT know this is a very unsafe project.  And not just unsafe for bikes and pedestrians on local streets:  the new ramp configuration creates a dangerous, higher-crash-rate facility for cars and trucks.

The agency’s safety analysis is contained in a technical safety report, dated August 15, 2022, but publicly released just last week.  It is worth quoting at length:

Under the HSM method, the number of crashes which may occur on a ramp is sensitive to geometric conditions, traffic volume, and length of the ramp. There are no major changes in geometry in the I-5 southbound exit ramp between the No-Build and Build conditions, hence they have similar forecast crash rates. However, as proposed in the Revised Build Alternative, relocating the I-5 southbound exit-ramp connection to the local system from N Broadway to NE Wheeler Avenue would increase the ramp length from approximately 1,000 feet in the No-Build conditions to approximately 2,000 feet in the Revised Build conditions, which would provide 1,000 feet of additional traffic queue storage. The new ramp design also includes wider shoulders than existing conditions. Based on the HSM, the forecast crash rate at this location would be approximately 13 % higher than the No-Build and Build condition. In the HSM, the number of crashes on a facility is highly sensitive to volume and length. As the length of this ramp increases, the forecast number of crashes increases and therefore so too does the crash rate. However, from a traffic operation perspective, the additional storage on the I-5 southbound exit-ramp would reduce the potential for queue spill-back onto the freeway. Under the No-Build Alternative, queue on the exit ramp is expected to propagate upstream onto the freeway mainline, creating a safety concern. The additional storage provided in the Revised Build Alternative would be able to accommodate the queue on the ramp without encroaching onto the freeway. This is particularly beneficial during peak hours and event conditions. In addition, the lengthening of the ramp will allow motorist to decelerate to a safer speed allowing them to safely navigate through the horizontal curve.

The final 250 feet of this ramp includes a horizontal curve prior to the ramp terminal intersection. The proposed curve would not meet ODOT’s HDM minimum radius for exit ramp curves and could also result in truck off tracking that extends outside of a standard travel lane. Therefore, to mitigate these considerations, the design detail of this curve would include wider shoulders and lanes than other sections of the ramp. Adequate delineation, signing, markings and lighting to inform drivers of the sharp curve as they approach the ramp terminal intersection would also be considered. These design treatments would be refined in the design process as the project proceeds. Figure 11 shows the existing N Williams Avenue/ NE Wheeler Avenue/ N Ramsay Way intersection and the lane configuration for the proposed I-5 southbound terminal.

There’s a lot to unpack here, and it’s written in a way as to be opaque and misleading.  Let us translate it into English:

  • We’re building a freeway off-ramp with an extreme (210-degree) hairpin turn (“the final 250 feet . . . includes a horizontal curve”).
  • That’s going to increase the number of crashes by 13 percent above both doing nothing and our previous design.
  • The hairpin turn and crashes will cause traffic to back up on the freeway off-ramp and could jam the freeway, but don’t worry, because we’ve doubled the length of the off-ramp (from 1,000 feet to 2,000 feet) so that it will be long enough to serve as a parking lot for those exiting the freeway (“queue on the exit ramp . . . additional storage”).
  • The turn is so tight that trucks can’t negotiate it without crossing out of their lane, but don’t worry, because the shoulders will be wide, giving cars plenty of room to dodge wide-turning trucks (“truck off tracking . . . outside a standard travel lane”).
  • The hairpin turn is so severe that it violates our agency’s own standards for road design, the same standards we use to refuse to build bike lanes and provide pedestrian access (“does not meet ODOT’s HDM minimum radius for exit ramp curves”).
  • We know the hairpin turn is dangerous, so we’ll think about putting in big warning signs and flashing lights (“Adequate delineation . . . to inform drivers . . . would be considered”).
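The arithmetic behind that 13 percent figure is worth spelling out.  HSM-style methods predict crash frequency as an increasing function of traffic volume and segment length, so doubling a ramp’s length raises the predicted number of crashes even if nothing else changes.  A deliberately simplified sketch (the proportional model and the base rate here are illustrative assumptions, not ODOT’s actual HSM calculation):

```python
# Illustrative only: HSM-style safety performance functions predict crash
# frequency as an increasing function of exposure (traffic volume x length).
# The base_rate coefficient below is made up for illustration.

def predicted_crashes(aadt, length_ft, base_rate=1e-6):
    """Toy linear SPF: predicted crashes/year grow with volume x length."""
    return base_rate * aadt * length_ft

aadt = 12_500  # vehicles/day on the southbound off-ramp (figure from the EA)
no_build = predicted_crashes(aadt, 1_000)  # ~1,000-foot existing ramp
revised  = predicted_crashes(aadt, 2_000)  # ~2,000-foot relocated ramp

increase = (revised - no_build) / no_build
print(f"Predicted crash increase: {increase:.0%}")  # doubling length doubles crashes
```

The real HSM safety performance functions are nonlinear in volume and length, which is presumably why ODOT forecasts a 13 percent increase rather than the doubling this toy linear model produces; the direction of the effect is the same either way: a longer ramp means more predicted crashes.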

More Dangerous, More Expensive, and Less Buildable

And there’s one more kicker that isn’t really mentioned here.  Because the I-5 southbound ramp is now north of Broadway and Weidler, moving the ramp south requires that the freeway be widened even further to provide two ramp lanes that reach all the way to NE Wheeler and the Moda Center.  Those lanes now have to go underneath Broadway and Weidler.  That means that the additional one thousand feet of off-ramp length would mostly be underneath one of ODOT’s much ballyhooed highway caps.  In the diagram below, the two extended southbound off-ramp lanes are shown on the far left (with turquoise cars).

The proposed cost of the Rose Quarter project has tripled to nearly $1.45 billion, chiefly because of ODOT’s additional widening and the concomitant escalation in the cost of freeway caps.  The caps are extraordinarily expensive, and their expense rises steeply with added width.  Routing two thousand-foot-long ramp lanes under the structure increases the needed width of the structure by at least 30 feet, and likely more.  That not only increases its cost, but the added width makes it more difficult to build a structure that could accommodate buildings.  (As we noted earlier, ODOT says this portion of the freeway caps could handle buildings no higher than three stories, and such buildings would have to be “lightweight.”)

We have rules against such things: but they don’t apply to us.

The safety report makes a cryptic reference to something called the “HDM,” saying the dangerous hairpin turn “does not meet ODOT’s HDM minimum radius for exit ramp curves.”  The HDM is Oregon’s Highway Design Manual, which specifies the standards that govern the construction of major roadways, including the minimum radius of curves on roadways and ramps.  For obvious reasons, tight corners and blind turns create serious safety hazards.  Freeway design standards are supposed to create roadways where crashes are less likely.  ODOT is proposing to simply ignore its own rules and build this dangerous off-ramp.

ODOT can’t even apply its standards consistently.  It asserts, for example, that it must build “full 12-foot” shoulders on much of the Rose Quarter project, ostensibly to improve safety.  But its design manual doesn’t require such wide shoulders; in fact, the agency has gotten recognition from the Federal Highway Administration for its policies allowing narrower shoulders on Portland-area freeways.  In the same breath that it touts widening shoulders (not required by its rules) as a safety measure, it gives itself an exemption from its own rules that explicitly prohibit dangerous hairpin turns on freeway off-ramps.

Transportation agencies routinely use their design manuals and similar rules to prohibit others from doing things.  We can’t build a crosswalk or a bike lane in that location, because it would violate our design manual.  That’s the end of a lot of safety improvements.  Just last week in Seattle, the city transportation department, which had dawdled for years over an application to paint a crosswalk at a dangerous local intersection, acted overnight to erase one painted by fed-up local neighbors, citing non-compliance with similar rules.

 

The Rose Quarter’s Big U-Turn: Deadman’s Curve?

The redesign of the I-5 Rose Quarter project creates a hazardous new hairpin off-ramp from Interstate 5.

Is ODOT’s supposed “safety” project really creating a new “Deadman’s Curve” at the Moda Center?

Bike riders on Portland’s busy North Williams bikeway will have to negotiate two back-to-back freeway ramps that carry more than 20,000 cars per day.

The Oregon Department of Transportation (ODOT) is moving forward with plans to issue a Revised Environmental Assessment (EA) for the I-5 Rose Quarter freeway widening, a $1.45 billion project pitched as a “safety” project and “restorative justice” for the Albina neighborhood.

The revised assessment was required in part because community opponents, led by No More Freeways, prevailed in a lawsuit challenging the project’s original environmental assessment; the project’s earlier Finding of No Significant Impact (FONSI) was withdrawn by the Federal Highway Administration.

We’ve obtained an advance copy of the Revised EA, and while it shows expanded freeway covers, it’s also clear that ODOT is backing away from doing anything to assure development.  And in expanding the covers, the project has created an entirely new and hazardous freeway off-ramp.

To expand the covers, ODOT has moved the southbound off-ramp from I-5, which is now located just north of NE Broadway, to an area just next to the Moda Center, immediately north of the existing I-5 south on-ramp.  The new ramp fits awkwardly into the existing street grid, and its most troublesome feature is a hairpin turn for traffic exiting the freeway:  I-5 traffic traveling southbound and leaving the freeway has to make a tight 210-degree turn onto northbound Williams Avenue.  The proposed off-ramp would have two lanes of freeway traffic negotiating the hairpin turn onto N. Williams Avenue (shown as green arrows in this diagram).

The I-5 Rose Quarter redesign adds a double-lane hairpin curve to the I-5 south off-ramp. Deadman’s Curve?

Could this become “Deadman’s Curve”?  The mainline stem of I-5 has a design speed of 70 miles per hour, and the off-ramp would force traffic to slow to 25 miles per hour (or less) to make the U-turn onto Williams.  Traffic exiting the freeway crosses a bike lane running along the west side of Williams Avenue (illustrated with red outlines on the diagram).

A similar low-speed, hairpin exit ramp from the I-5 freeway in downtown Seattle has been the scene of repeated and spectacular crashes, as documented on YouTube.

I-5 Southbound Off-Ramp in Seattle (YouTube video)

A hazard for people walking and biking.

The new southbound off-ramp abuts an existing southbound on-ramp at the intersection of Williams Avenue and Wheeler Avenue.  Williams Avenue is a major bike route from downtown to North Portland; a bike lane runs along Williams and would cross both of these ramps.  The new configuration creates a traffic maelstrom at the intersection of Wheeler, Williams and the I-5 southbound on- and off-ramps.

At one point cyclists and pedestrians will have “refuge” on a tiny triangular island wedged between a double-lane I-5 southbound off-ramp (12,500 vehicles per day) and a double-lane I-5 southbound on-ramp (9,000 vehicles per day).  On one side, they’ll have cars crossing Williams Avenue and accelerating onto the freeway; on the other, they’ll have cars coming off the freeway, negotiating the hairpin turn through the intersection onto Williams Avenue.  Green arrows show lanes of traffic entering and leaving the I-5 freeway.  White dots show the path of the bike route.  The red triangle at the center is the cyclists’ tenuous traffic refuge.

 

Bike route (white dots) crosses multiple freeway on- and off-ramps.  There is a small “refuge” (red triangle) in the middle of these multi-lane freeway ramps.

The Oregon DOT’s Revised EA claims that the project will make conditions better for bikes and pedestrians “on the covers”—but not necessarily elsewhere.  The Rose Quarter project website claims:

Relocating the I-5 southbound off-ramp will reduce interactions between vehicles exiting I-5 and people walking, rolling and biking along local streets on the highway cover.

Notice the qualifier here: “on the highway cover.”  What this statement leaves out is that relocating the off-ramp will dramatically increase interactions between vehicles and people on streets away from the cover, particularly at Williams and Wheeler.  The new combination of on- and off-ramps here will create many more dangerous interactions, especially for cyclists on Williams Avenue, something the ODOT Environmental Assessment fails to acknowledge.

The I-5 Rose Quarter project is advertised by ODOT as a “safety” project.  People cycling through this maelstrom of freeway-bound traffic may not agree.

 

Thanks to Bike Portland for its extensive coverage of the bike and pedestrian problems associated with the Rose Quarter re-design.

 

Flat Earth Sophistry

The science of induced travel is well proven, but state DOTs are in utter denial

Widening freeways not only fails to reduce congestion, it inevitably results in more vehicle travel and more pollution

The Oregon Department of Transportation has published a technical manual banning the consideration of induced travel in Oregon highway projects.

The Oregon Department of Transportation wants to pretend that induced travel doesn’t exist.  Using federal funds, it has written a new handbook on highway planning that makes preposterous and undocumented claims about induced travel.  It explicitly prohibits planners and consultants from using peer-reviewed, scientifically based tools, like the Induced Travel Calculator developed by the University of California Sustainable Transportation Center and mandated by the California Department of Transportation for the analysis of the environmental effects of freeways.

ODOT’s tortured denial engages in blatant sophistry, attempting to create a false distinction between “latent” demand and “induced” demand.  If we just call it “latent demand,” then somehow it doesn’t count.

Turn to page 6-79 of ODOT’s newly published “Analysis Procedures Manual.”  The APM is a technical guide to using traffic data to plan future roadways.  There you’ll find a red-bordered text box with a bold graphic STOP sign, explicitly banning planners and analysts from using the induced travel calculator:  “The use of these calculator types shall not be used to estimate induced and latent demand effects on ODOT-funded projects . . . ”

This kind of foot-stomping, hand-waving denial is reminiscent of the Catholic Church’s harrumphing denials of Copernicus and Galileo’s observations of the universe.  But induced travel is extremely well-established science, and Oregon DOT shows itself to be a modern-day flat-earth science denier.

What the Scientific Literature Shows

The economic and scientific literature on induced travel is unambiguous:  increasing road capacity, by whatever means, lowers the perceived cost of driving and results in more travel.  The phenomenon is now so well established that it’s called the “Fundamental Law of Road Congestion.”

The economics are straightforward: expanding the supply of highways lowers the cost of driving, and faced with a lower cost of driving, people drive more.  In this classic diagram, the supply curve shifts outward (to the right) lowering the cost of driving and increasing the number of miles driven.

The best available science shows that this generated travel follows a unit elasticity:  a one percent increase in roadway capacity creates a one percent increase in vehicle miles traveled.  To claim otherwise is to simply be in denial about the fundamental economics of the price elasticity of demand:  lowering the price of something (in this case the time cost of using a particular roadway) tends to increase the volume consumed.

There have been numerous studies, all reaching similar conclusions about the empirical nature of this relationship.  Two of the leading scholars on the subject, the University of California’s Susan Handy and Jamey Volker, present a meta-analysis of studies of induced travel.  Their results are summarized in the following table.  In studies in the US and in other developed countries, there’s a strong and consistent relationship between expanded roadways and additional travel.  In the long run, estimates of the elasticity of induced travel are around 1.0, meaning that a one percent increase in road capacity tends to lead to a one percent increase in vehicle miles traveled.
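These elasticity estimates translate directly into the kind of induced-VMT calculation the Induced Travel Calculator performs.  A minimal sketch using the long-run elasticity of 1.0 cited above (the lane-mile and VMT inputs are hypothetical illustrations, not drawn from any actual project or from the calculator itself):

```python
# Sketch of an elasticity-based induced-VMT estimate, in the style of the
# Induced Travel Calculator. All numeric inputs below are hypothetical.

def induced_vmt(base_vmt, base_lane_miles, added_lane_miles, elasticity=1.0):
    """Estimate induced annual VMT from a capacity expansion.

    A long-run elasticity of 1.0 means a 1% increase in lane-miles
    yields a 1% increase in vehicle miles traveled.
    """
    pct_capacity_change = added_lane_miles / base_lane_miles
    return elasticity * pct_capacity_change * base_vmt

# Hypothetical region: 10 billion annual VMT on 5,000 lane-miles,
# with a project adding 50 lane-miles (a 1% capacity increase).
extra = induced_vmt(base_vmt=10e9, base_lane_miles=5_000, added_lane_miles=50)
print(f"Induced VMT: {extra:,.0f} miles/year")
```

With a unit elasticity, a project that adds one percent to a region’s lane-miles induces one percent more driving; a lower elasticity (say 0.5, the bottom of the range Kockelman reports) would simply halve the estimate.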

The authoritative Transportation Engineering Handbook summarizes the literature on induced demand as follows:

. . . the long-run elasticities of VMT with respect to road space is generally 0.5 to 1.0 after controlling for population growth and income, with values of almost 1.0, suggesting that new road space is totally filled by generated traffic where congestion is relatively severe.

Kara Kockelman (2011), “Traffic Congestion,” Chapter 22, Transportation Engineering Handbook, McGraw Hill .

ODOT asserts that it can ignore all this literature.  It argues, in essence, that even though the consensus is a unit elasticity, here in Oregon, contra all this published literature, the real coefficient is zero:  a one percent increase in roadway capacity would lead to no increase whatsoever in travel demand.  In essence, the ODOT Analysis Procedures Manual tells planners to ignore induced demand entirely.

Latent demand is induced demand.

The apparent justification for this conclusion is that there’s something called “latent” demand that’s different from “induced” demand.

Oregon DOT falsely claims that there is a difference between “latent” demand and “induced” demand.  Here’s what they are saying…

Latent Demand – this is demand for transportation that consumers do not utilize because they cannot afford the cost or it is not currently available. Latent demand responses are typically associated with network limitations, such as capacity constraints . . . Latent demand does not include induced demand.

Induced demand – new demand for travel that did not exist prior to the build scenario. This is above and beyond forecasted and latent demand associated with planned land use, it is demand that is the result of changes in land use (zone changes) or economic conditions that create new trips.

(ODOT Analysis Procedures Manual, June 2022, emphasis added).

The claim that “latent” demand is not induced demand finds no support in the literature.  No other study uses these terms in this fashion, or makes this distinction between “induced” and “latent” demand.  This is ODOT’s Through the Looking Glass moment:

“When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean- neither more nor less.”

Ben & Jerry’s observes the latent demand for ice cream every year when it drops the price of a cone to zero, and people line up around the block.  These are all people who would love to have ice cream, if only it were free.  The lines around the block are “induced ice cream eating,” as the zero price converts “latent demand” into actual demand.

But we know empirically that travel changes rapidly in response to available highway capacity.  That’s true for both expansions and contractions in capacity.  People rapidly and radically change their travel distances and trip-making in response to changes in capacity.  Predicted “carmageddons” in the face of capacity reductions (bridge closures, highway collapses, construction projects, highway demolitions, and similar events) routinely fail to materialize, as traffic simply disappears.

Ultimately, this is pure sophistry:  Whether you call it “latent” demand or “induced” demand, the effects are exactly the same:  Adding more capacity to existing roadways increases the volume of vehicle travel.

Oregon’s Analysis Procedures Manual vs. California’s Transportation Analysis Framework

While OregonDOT has just published its “Analysis Procedures Manual” banning the use of induced travel calculators, its California counterpart, Caltrans, has published guidelines that require the use of such a calculator for highway projects in the Golden State.  What leads one state DOT to require the calculator while the other bans it?  Who is right?

Let’s consider the processes and documentation that went into the Caltrans and ODOT publications.  Caltrans adopted its Framework after a years-long study and review effort.  It brought in outside experts, it conducted and published a thorough literature review, and the Framework itself was the subject of public meetings.  As the Framework document explains:

Caltrans convened an expert panel of academics and practitioners through UC Berkeley Tech Transfer. The panel chair presented the group’s conclusions to stakeholders at a virtual Technical Roundtable prior to finalizing the group’s recommendations. Caltrans and State partners have accepted the panel’s recommendations, which are reflected in the guidance documents.

In contrast, the Oregon manual has no identified author, cites no academic literature, and has not been subject to review by anyone independent of the Oregon Department of Transportation.  It is an unsubstantiated, unscientific polemic.

It’s also possible (and indeed likely) that even without changes in land use, households and businesses will sort themselves differently among the existing stock of land and buildings.  If travel is fast and free, people may choose to live in housing a great distance from their jobs (or, conversely, commute to jobs a great distance from their homes).  If travel is slower or more expensive, they may seek housing nearer their jobs, or look for jobs closer to home, in order to minimize the time and money costs of travel.  The redistribution of population and employment among existing buildings in response to changes in travel costs is something that ODOT denies is even possible.

What’s deeply ironic about the denial of induced demand is that highway departments have been counting on it to create an unending demand for their services for decades.  Building more and wider roads has led to more driving and more car ownership, which has jammed existing roads to capacity, and led to calls for further widening.  It’s a Sisyphean cycle that leads to ever more traffic and ever more spending on roads, which is just what highway departments and their vendors want.

Induced Demand and Land Use Changes

As Litman points out, there are first-, second-, third- and fourth-order effects from highway capacity increases.  Initially, travel times get faster (first order).  That prompts people to change whether, when, where and by what means they travel (second order).  The shift in travel patterns and accessibility may then prompt changes in land use (third order).  Finally, the cumulative effect of a shift to sprawl and greater auto dependence may further amplify trip-taking (fourth order).

Roadway expansion impacts tend to include:

First order. Reduced congestion delay, increased traffic speeds. 

Second order. Changes in time, route, destination and mode.

Third order. Land use changes. More dispersed, automobile-oriented development. 

Fourth order. Overall increase in automobile dependency. Degraded walking and cycling conditions (due to wider roads and increased traffic volumes), reduced public transit service (due to reduced demand and associated scale economies, sometimes called the Downs-Thomson paradox), and social stigma associated with alternative modes.

The ODOT view is that the “second order” effects—changing times, routes, additional trip-taking, and more miles traveled—somehow don’t count as “induced travel” if no changes in land use happen.  Or, alternatively, if that travel is accurately predicted by a traffic model or anticipated in a plan (i.e., “above and beyond forecasted”), it also doesn’t count.

The Land Use Red Herring

But let’s have a look at the second part of the argument:  that the transportation agency can ignore the part of induced demand that results from land use changes in response to roadway expansion, because Oregon’s system of land use planning supposedly keeps those effects from occurring here.  ODOT’s rhetorical position is that “induced demand” can only occur in response to land use changes, and that land use changes are impossible under Oregon’s land use system.

The Oregon Department of Transportation likes to pretend that the only form of induced travel that is real is that which accompanies changes in land use.  And it argues that because Oregon has strict land use laws, investments in travel infrastructure can’t produce changes in land use.

In general, Oregon faces low risk related to induced demand because of the state’s strong land use laws, which exist to prevent sprawl. Changes to land use must be approved by local jurisdictions, so a facility project cannot induce demand just by itself.

ODOT’s reasoning is this:  induced demand only occurs when there is a land use change that necessitates a change in a land use plan; because Oregon has land use plans, transportation projects somehow can’t create induced demand.  This reasoning is wrong for two reasons.  First, as we’ve already explained, “latent” demand–changes in transportation behavior in response to a capacity increase–can happen even without any change in land use, and this “latent” demand is, according to all the scientific literature, induced demand.  Second, Oregon’s land use law doesn’t prevent or preclude changes in land use in response to changes in transportation infrastructure.

What this misses is that the land use system is a permissive framework; within that legal framework, many patterns of population and employment are possible.  For example, new housing can be built in infill locations (near transit, and proximate to more jobs) or at the urban periphery.  Both outcomes are possible under the Oregon land use system.  The key point about induced demand is that more investment in transportation infrastructure will make lower-density, more far-flung development even more attractive.  And, importantly, a significant part of the demand for Oregon roadways comes from places not subject to the Oregon land use system (i.e., suburban Clark County, Washington).  Investing in more transportation capacity across the Columbia River will facilitate more low-density sprawl in Washington, and add automobile trips on the I-5 and I-205 bridges, as large fractions of these suburban and exurban households work and shop in Oregon.

A lobbying campaign to deny induced demand

There’s little question that ODOT officials are uncomfortable with the science of induced travel.  And they’re eager to do anything they can to minimize, misrepresent, or discredit the application of this scientific fact to transportation planning.  For example, in 2021, ODOT sought funding through AASHTO (the lobbying organization of state highway agencies) for a project to dispute induced demand.  Bike Portland reported that the proposal made it clear the agency was primarily interested in generating talking points to push back against the application of induced demand to metro-area freeway expansion projects.

“While the road building era of the 1950s freeway networks is essentially complete, even minor strategies and investment intended to optimize existing roadway system assets are increasingly facing opposition in the name of “induced demand”…”

Even as it is busily ignoring or denying the science of induced travel, the Oregon Department of Transportation regularly repeats the discredited myth that idling in traffic is a significant source of greenhouse gas emissions that can be reduced by widening roadways.

Traffic Projections that Deny Induced Travel Lack Scientific Integrity

To the extent that ODOT’s guidance limits what is included in a federally required environmental impact statement, its steadfast refusal to cite any sources for its claims and its consistent disregard of the published scientific literature on induced travel constitute a violation of the scientific integrity requirements of NEPA.

§ 1502.23 Methodology and scientific accuracy.

Agencies shall ensure the professional integrity, including scientific integrity, of the discussions and analyses in environmental documents. Agencies shall make use of reliable existing data and resources. Agencies may make use of any reliable data sources, such as remotely gathered information or statistical models. They shall identify any methodologies used and shall make explicit reference to the scientific and other sources relied upon for conclusions in the statement. Agencies may place discussion of methodology in an appendix. Agencies are not required to undertake new scientific and technical research to inform their analyses. Nothing in this section is intended to prohibit agencies from compliance with the requirements of other statutes pertaining to scientific and technical research.

Chuck Marohn, writing at Strong Towns, explains that traffic engineers treat travel demand as a fixed and immutable quantity–they’ve built models and a worldview that pretend people will travel just as much whether they build a project or not.  This view helps justify building ever more roads, but it doesn’t reflect reality and ought to be treated as professional malpractice:

The concept of “travel demand” is where traffic engineers have stunted their own intellectual development more than perhaps anywhere else. And they’ve done so for two reasons. First, it makes their models easier to run. It’s really difficult (impossible, really) to create models that factor in the behavioral responses of humans. Better to just assume a static level of demand, even though that assumption is a farce (remember, traffic models are all about justifying projects, not actually modeling what is going on in the world).

Second, it allows traffic planners and engineers to position themselves and their craft as responding to demand, not creating it. That’s an important distinction because it allows them to be confident in what they do without having to struggle with the underlying reasons that things aren’t working.  . . .

Engineering in the auto age is about building—build, build, build—and not about optimizing or managing systems. When your ethos is merely to build more stuff, you develop myths and models that support that ethos. That’s what you’re seeing in the patently absurd assertion that additional capacity does not generate more trips. . . .

In 2022, denying how highway expansions induce people to drive more should be considered professional malpractice.

US Secretary of Transportation Pete Buttigieg clearly endorses the science of induced demand.  In a recent television interview, Buttigieg told Chris Wallace:

. . . here’s an entire science to this. And we have a lot of research partners. We have our own research institution called the Volpe Institute, which is in Cambridge, Massachusetts. . . . one of the challenges we have right now is you got more and more people in the country, more and more people on the road. Just how to be smart about that. For example, it turns out that sometimes when you just want to get a lot of traffic on the roadway, and you just added lanes to it, all you get is more traffic, because it actually makes more people want to drive on that road and then you’re right back where you were.

The IBR project: Too much money for too many interchanges

The real expense of the $5 billion I-5 bridge replacement project isn’t actually building a new bridge over the Columbia River:  it’s widening miles of freeway and rebuilding every interchange north and south of the river.  A decade ago, an independent panel of experts convened by the Oregon and Washington governors strongly recommended to ODOT and WSDOT that they eliminate one or more interchanges.

The panel concluded that more than 70 percent of the cost of the project was for rebuilding seven interchanges in less than five miles.

The experts told ODOT and WSDOT that the project’s interchange spacing violates both federal and state design standards.

The expert panel concluded that eliminating interchanges would reduce project cost, improve safety, and improve traffic flow.

Failing to look at removing or simplifying interchanges after getting this expert advice is arbitrary and capricious; ODOT and WSDOT are violating the National Environmental Policy Act’s requirement that they take a hard look at reasonable alternatives.

Bridge Review Panel:  A totally new bridge design; eliminate interchanges

Today’s “Interstate Bridge Replacement” project is a warmed-over version of the failed Columbia River Crossing of a decade ago.  Like the current effort, the CRC was controversial and highly criticized.  The governors of Oregon and Washington intervened and appointed special, independent review panels of national experts, both of which spotted errors in the project.  The first, a 2010 Independent Review Panel chaired by Tom Warne, determined that ODOT and WSDOT’s proposed “open-web” design for the river crossing was “unbuildable.”  That led the two governors to appoint another panel, the Bridge Review Panel, to come up with an alternative design.  That panel, also chaired by Tom Warne, issued its 146-page report in 2011.

In addition to coming up with a buildable bridge design, the Bridge Review Panel recommended reducing and simplifying the number of interchanges in the project area, rather than repeating and expanding each of the existing interchanges, to reduce costs and make the project function better.  Its comments are worth quoting at length:

The panel concluded that improvements to the functionality of the overall roadway network in the project limits should address urban design issues. The use of a collector/distributor system was found to be unworkable, but reducing and simplifying the number of interchanges would significantly improve both functionality and cost.

Substandard Interchange Spacing and Project Impacts
In the project corridor, seven interchanges in less than five miles results in interchange spacing that does not meet state or federal minimum requirements of one mile for interstates in urban areas. In some circumstances, interchange spacing is half the minimum required. It is not unusual in urban areas to have substandard interchange spacing. However, it is unprecedented that all seven interchanges on a project corridor have less than minimum spacing. Not only are safety and operations an issue, more than 70 percent of the project budget is associated with these interchanges. Minimum interchange spacing is necessary for operational efficiency and user safety. Substandard interchange spacing in the project corridor can be expected to negatively impact both. Interchanges adjacent to the Columbia River and North Portland Harbor also increase environmental impacts and detract from the visual quality of the shoreline and the character of a signature bridge.
It is the view of the panel that some consolidation of the interchanges on the project corridor is warranted. This consolidation would have the following direct benefits to the project:
  • Improved safety and operations.
  • Significant reduction in capital costs.
  • Reduced environmental impacts.
  • Enhanced viewsheds along the Columbia River.
  • Improved opportunities for a signature span, from budgetary, logistical, and performance perspectives.
With respect to interchange spacing, the panel offers the following secondary recommendation:

Review all interchanges, ramps and other geometric features to simplify the overall corridor design for substantial cost savings and to improve safety and corridor operations.

Bridge Review Panel Report, 2011, Page 96 (emphasis added)
The panel reiterated this point in its conclusion, indicating that it felt strongly that much more work needed to be done, and that, contrary to what most states are doing (removing closely spaced interchanges), Oregon and Washington are simply perpetuating a bad design at huge cost.
. . . the panel does feel strongly that much work remains to be done to improve the ramps and interchanges throughout the project and that simplification of these elements will bring about a better and more functional solution. In fact, the panel is struck by the fact that most states are working to remove congested interchanges and ramps rather than building their way towards such a condition: as is occurring here. In addition, the volume of interchange access is not in harmony with state or Federal guidelines. The BRP recommends further study to address interchange geometrics and operations. In addition, the whole corridor would benefit from a more comprehensive urban design review
Bridge Review Panel Report, 2011, (emphasis added)
In spite of this clear advice, ODOT and WSDOT are doing just the opposite: planning for elaborate and expensive reconstructions of each of the seven interchanges in the project area.
IBR project director Greg Johnson testified that the complex Marine Drive interchange would be the second most costly part of the project after the river crossing itself; Bike Portland reported that the vast majority of the project price tag is due to multi-lane interchanges.  And the cost of these interchanges could escalate dramatically, because the proposed crossing is designed with only a 116-foot clearance, far less than the 178-foot clearance called for by the US Coast Guard.  Raising the bridge and the interchanges would make the project even more costly.

Not just forgetful:  Arbitrary, capricious and a violation of the National Environmental Policy Act

As far as ODOT and WSDOT are concerned, the work of the Bridge Review Panel has simply gone down a memory hole.  A decade ago, Oregon and Washington spent about $1.5 million on these independent, expert, outside reviews of the Columbia River Crossing project.  Their own hand-picked national experts, looking at the proposed project with fresh eyes, said:  If your problem is too much weaving because of too many interchanges in too short a distance, then the obvious—and preferred—solution is to eliminate some of those interchanges.  The experts went further, saying that eliminating interchanges would make the project safer, perform better, look better, have fewer environmental impacts, and even cost less.  But here we are, a decade later, and the IBR project hasn’t seriously considered these recommendations.  It has completely ignored them.
Why?  We can’t know for sure.  But there’s strong evidence that the real reason ODOT and WSDOT want this project is not so much to replace the bridge, as to gin up support for spending billions to widen the freeway on either side of the river.  They know that freeway widening, if called out as a separate project, wouldn’t generate any public support.  By tying the intersection rebuilds and freeway widening to the “bridge replacement” they avoid any serious public scrutiny of that decision.  And make no mistake:  the wider roadway and rebuilt intersections are nearly twice as expensive as the bridge itself.
This also explains why the two states are wedded to a high fixed span as a replacement for the existing low-level crossing.  If they have to rebuild the bridge with a 116-foot (or, if the Coast Guard’s guidance prevails, a 178-foot) vertical navigation clearance, the project will require building long elevated approaches on both the north and south sides of the river.  Interchanges will have to be lifted high into the air to reach the elevated approaches.  Downtown Vancouver and Hayden Island will both have half-mile-long elevated roadways towering over their communities.  This is likely why the DOTs so adamantly oppose either a tunnel or a lower-level crossing with a moveable span:  those designs wouldn’t require rebuilding every interchange, and would demolish their case for wrapping the freeway widening costs into the bridge project.
The failure to consider eliminating or consolidating some interchanges is a plain violation of the National Environmental Policy Act.  NEPA requires that sponsoring agencies take a hard look at reasonable alternatives that could potentially meet the project’s purpose and need with fewer environmental impacts.  That’s exactly what this expert review panel—hired by the DOTs—said should be done a decade ago.  Willfully ignoring this information, and not including a serious appraisal of such alternatives in the project’s Environmental Impact Statement, rises to the level of an “arbitrary and capricious” decision by the DOTs.  Ordinarily, and for good reason, courts have been loath to second-guess agencies on technical matters.  But this is the kind of egregious and willful disdain for the facts that rises to a violation of the law.

ODOT’s “Fix-it first” fraud

ODOT claims that its policy is “fix-it first”:  maintaining the highway system before expanding it.

But it is spending vastly less on maintenance and restoration than is needed to keep roads and bridges from deteriorating.

It blames the Legislature for not prioritizing repair over new construction.

But it chooses to advance policies that prioritize spending money on new construction ahead of maintenance.

It diverts funds that could be used for maintenance to pay for cost overruns on capital construction projects.

ODOT invokes its maintenance backlog as a “bait and switch” to win more revenue that it then spends on capital construction rather than fixing roads.

A proclaimed “Fix-It First” policy.

The Oregon Transportation Commission (OTC), which directs the activities of the Oregon Department of Transportation, has clearly claimed to prioritize maintenance.  In its 2020 Investment Strategy, the OTC proclaims that it prioritizes maintenance of existing roads:

Oregon is a fix-it first state. The Oregon Transportation Plan and Oregon Highway Plan focus on preserving the system; highway improvements are focused on enhancing efficiency and the capacity of existing facilities rather than building new ones. . . . Funding to preserve state highway assets is not adequate, resulting in a triage approach to preservation, rehabilitation, and repair, and maintaining status quo conditions requires more than doubling current funding.

The Oregon Transportation Commission has adopted the Oregon Highway Plan’s policy 1G for Major Improvements which says it will prioritize maintaining the highway system over expanding capacity.

Since road construction is very expensive and funding is very limited, it is unlikely that many new highways will be built in the future. Instead, the emphasis will be on maintaining the current system and improving the efficiency of the highways the State already has. The Major Improvements Policy reflects this reality by directing ODOT and local jurisdictions to do everything possible to protect and improve the efficiency of the highway system before adding new highway facilities.

Policy 1G: Major Improvements

It is the policy of the State of Oregon to maintain highway performance and improve safety by improving system efficiency and management before adding capacity.

A huge and growing maintenance backlog

So how is Oregon doing in implementing this policy?  Every report and inventory from ODOT shows that we have a major maintenance gap, and it’s getting worse.

ODOT’s June 2022 federally required Transportation Asset Management Plan (TAMP) reports that Oregon is spending $329 million annually less than is needed to keep roads and bridges at their current state of repair.  The state is spending less than half of what it would need to ($156 million of an estimated $320 million) just to “maintain current conditions” of Oregon bridges.  It is also spending only about 40 percent of what it needs to retain existing conditions on Oregon roads ($112 million of an estimated annual need of $273 million).  Bridges would require an additional $164 million and roads an additional $165 million, each year, simply to maintain current conditions.

ODOT’s Investment Strategy, adopted in 2020, admits that the agency is dramatically underspending on maintenance, and that Oregon roads and bridges will deteriorate.  The state has manifold other needs that aren’t funded:

  • ODOT’s plans say we need to spend $5.1 billion seismically retrofitting hundreds of Oregon bridges; it currently has funding for just 30 of 183 high-priority “Phase I” bridges–the balance is unfunded.
  • ODOT says we need to be spending $50 million per year to achieve compliance with the Americans with Disabilities Act on Oregon highways.
  • ODOT says we need to be spending $53 million per year to provide or repair walking and biking facilities along state highways.

In the face of a tight budget, ODOT has chosen to cut its operations and maintenance spending, but still expects an even larger shortfall in the years ahead.  ODOT’s January 2022 Budget Outlook predicted a widening budget gap:

ODOT now projects that the funding gap has shrunk to $144 million in 2027, due to stronger revenue growth and larger fiscal year 2021 ending balances through budget discipline. However, revenues and expenditures remain out of alignment, and without additional revenue or expenditure reductions the gap will grow quickly. By 2029 the gap is projected to grow to $515 million.

In short, we’re not spending enough to maintain the current system, we’re cutting operations and maintenance budgets, and we face an even larger shortfall in maintenance funding in the years ahead.  And in the face of this, ODOT is marching forward with unfunded plans for huge construction projects that will plunge the state into debt for decades.

Blaming the Legislature

ODOT blames the Legislature for this policy choice.  In a 2020 memo to employees, published by the Oregonian, ODOT Director Kris Strickler says the reason the agency has to slash operating costs and maintenance is that the Legislature short-changed the agency.  Here’s the Oregonian’s coverage:

“Many will wonder how ODOT can face a shortfall of operating funding after the recent passage of the largest transportation investment package in the state’s history,” Kris Strickler, the agency’s director, said in a Wednesday email to employees, stakeholders and other groups, citing the 2017 Legislature’s historic $5.3 billion transportation bill. “The reality is that virtually all of the funding from HB 2017 and other recent transportation investment packages was directed by law to the transportation system rather than to cover the agency’s operating costs and maintenance.”

The public and likely the Legislature will be surprised to know that “directing money by law to the transportation system” somehow precludes ODOT from spending money to maintain those roads.  The truth is that ODOT’s deceptive cost estimates and discretionary reallocation of funds are really what’s short-changing operations and maintenance.

Constantly proposing new construction and under-estimating its cost

While ODOT blames the Legislature, it is the agency advancing hugely expensive new capacity projects, including the I-5 Rose Quarter ($1.45 billion), the I-5 Bridge Replacement/freeway widening ($5 billion+), the I-205 Abernethy Bridge ($700 million) and the Boone Bridge (cost not yet revealed).

The Legislature approves these projects based on cost estimates provided by ODOT, and ODOT then treats this approval as a mandate to pay whatever cost overruns the project incurs.  In the case of the I-5 Rose Quarter project, the Legislature was told in 2017 that it would cost $450 million; the price tag is now estimated at as much as $1.45 billion.  ODOT told the Legislature the I-205 Abernethy Bridge would cost $250 million; its price tag has doubled to nearly $500 million.  These cost overruns directly reduce funding available for maintenance.  By failing to correctly estimate costs, and by always paying for cost overruns, ODOT’s actual policy prioritizes new capacity construction over maintenance.

Diverting maintenance funds to new construction

ODOT routinely diverts funds allocated to and available for maintenance to fund capital construction projects.

It used interstate maintenance discretionary funds to pay for the planning of the failed Columbia River Crossing project.  It diverted funds that could otherwise be used for maintenance to pay for the Interstate Bridge Replacement project.  It routinely prioritizes capital construction in the use of “unanticipated federal funds” and “project savings.”  It cobbled together just these funding sources to pay for the initial work on the I-205 Abernethy Bridge before the Legislature authorized any funding for the project.  Each year it gets a tranche of what it calls “unexpected” federal funds (federal money unspent from nationally competitive programs that is allocated to the states).  At its July 2022 meeting, ODOT recommended (and the OTC approved) using $10 million of this money–which could have been applied to the maintenance backlog–for the Interstate Bridge Replacement project.

This bias toward highway expansion at the expense of maintenance will be amplified by ODOT plans to issue massive amounts of debt for new highway construction.  ODOT is pursuing a risky bonding strategy for billions of dollars of Portland-area freeway expansion projects that effectively pledges maintenance monies to repay bondholders.  HB 3055 allows ODOT to pledge all of its state and federal funds to the repayment of toll-backed bonds.  If toll revenues fall short of projections–which happens frequently–ODOT would be legally obligated to cut maintenance funding statewide to pay back bondholders.

Bait and switch

For years, ODOT has come to the Legislature pleading poverty:  it doesn’t have enough money to maintain our roads; therefore, we need to increase gas taxes, weight-mile taxes and registration fees.  Then, when the Legislature authorizes higher taxes, ODOT uses the money not to reduce the maintenance backlog, but to fund giant new construction projects.  When these projects go over budget, ODOT cannibalizes funds that could be used for maintenance, and comes back to the Legislature, again pointing to its self-created backlog of funding needs to fix potholes and preserve bridges.  In reality, Oregon is a “fix it last” or “fix it never” state:  the maintenance spending backlog is just a perpetual excuse to force the Legislature and taxpayers to give the agency more money, which it will then plow into expanding roadways.

A bridge too low . . . again

Ignoring the Coast Guard dooms the I-5 Bridge Project to yet another failure

The Oregon and Washington DOTs have again designed an I-5 bridge that’s too low for navigation

In their rush to recycle the failed plans for the Columbia River Crossing, the two state transportation departments have failed to address Coast Guard navigation concerns

State DOT PR efforts are misrepresenting the approval process:  the Coast Guard alone decides the allowable height for bridges, and it considers only the needs of navigation.

Make no mistake, the Coast Guard officially drew a line in the sand–actually, 178 feet in the air above the Columbia River–and has essentially said that the two state DOTs “shall not pass” with a river crossing that doesn’t provide that level of navigation clearance.

What the preliminary determination is intended to do is signal to the DOTs the kind of structure that the Coast Guard will likely approve.  But the Oregon and Washington DOTs aren’t taking the hint.  Instead, they’re pretending that this “determination” is really meaningless, and that if they just show that the height restriction would be inconvenient or expensive to comply with, they can somehow force the Coast Guard to let them build a bridge with a lower navigation clearance.  That’s a clearly wrong reading of the law, and, more importantly, it means the two state DOTs are embarking on a risky strategy that’s likely to doom the current effort to build a new Columbia River bridge.

Prologue:  The failure of the CRC

As we’ve pointed out, this is deja vu all over again.  The Columbia River Crossing project similarly ignored Coast Guard signals that a low bridge would be unacceptable.

More than a decade ago, the Oregon and Washington DOTs advanced a plan for a new fixed-span I-5 bridge with a navigation clearance of 95 feet.  The DOTs did their own analysis of shipping needs, and claimed that, in their opinion, 95 feet would meet the reasonable needs of river users.  The trouble is, that determination isn’t up to state DOTs:  it’s the exclusive legal province of the US Coast Guard, which is charged by Congress with protecting the nation’s navigable waterways.  (Despite the moniker “Department of Transportation,” state DOTs have essentially no legal or policy responsibility for commercial water traffic.)

Early in the bridge design process–in 2005–the Coast Guard signaled its likely objections to a mere 95-foot river clearance.  But state DOT officials blundered ahead, insisting that their own analysis was sufficient to justify the low design.  When the project’s record of decision was issued in December 2011, the Coast Guard filed a formal objection, noting that the two state DOTs had not provided sufficient information for the Coast Guard to determine the needed clearance.  The Coast Guard wrote:

. . . the Coast Guard’s concerns with the adequacy of the Final Environmental Impact Statement (FEIS) have not been resolved . . . As previously stated, the Coast Guard cannot determine if the preferred 95-foot bridge clearance will meet reasonable navigation requirements based on the information provided for review.

In addition, the Coast Guard noted that the FEIS failed to consider the environmental effects of different bridge heights:

The FEIS does not address current and future impacts to navigation/waterway users as a result of proposed decreased vertical clearance, nor does it study alternatives to a vertical clearance other than 95 feet.

As the bridge permitting agency, the Coast Guard determines the reasonable needs of navigation when acting upon a permit application.

Only after completing the FEIS and getting a ROD did the two state transportation departments start applying for the needed Coast Guard bridge permit.  In December 2012, the Coast Guard made it clear that the proposed 95-foot clearance would not be sufficient.  Ultimately, the Coast Guard insisted on at least a 116-foot river clearance.

Here we go again

Even though they’ve been working on reviving the Columbia River Crossing since 2018, the two state DOTs only submitted a new navigation report to the Coast Guard in November 2021.  For more than three years they’ve been operating under the assumption that the Coast Guard will go along with a 116-foot navigation clearance.  But in its “Preliminary Navigation Clearance Determination” the Coast Guard has said that won’t be nearly enough.

The Coast Guard is crystal clear about its approval standard:

“Generally the Coast Guard does not approve bridge proposal with vertical navigation clearances below the ‘present governing structure’ when the existing VNC has been and is currently needed unless there is a compelling navigational reason to do so.” (Harris to Goldstein, June 17, 2022, p. 2)

The real takeaway from the Coast Guard letter is that the I-5 bridge needs to provide 178 feet (or more) of vertical navigation clearance.

. . .  the Columbia River (specifically the section of the Columbia River immediately east of the existing I-5 twin bridges) has and needs to continue to provide VNC equal or greater than the existing I-5 twin bridges of 178 feet. Our PNCD concluded that the current proposed bridge with 116 feet VNC, as depicted in the NOPN, would create an unreasonable obstruction to navigation for vessels with a VNC greater than 116 feet and in fact would completely obstruct navigation for such vessels for the service life of the bridge which is approximately 100 years or longer. (Emphasis added)

The implication is that if the two DOTs can work out a financial deal with existing river users, they may be able to get the Coast Guard to approve a lower bridge clearance.  But the Coast Guard’s past comments and current review indicate that it is not merely looking out for the interests of current river traffic and industry; it is intent on protecting the current navigation channel for future industry and activities.

The reasons for the Coast Guard’s decision are clearly laid out in its June 17, 2022 letter:

  • Current users need to move structures and vessels with a clearance of between 130 and 178 feet.
  • Vessels and their cargos are growing larger over time.  Marine industries need the flexibility to accommodate larger structures in the future.
  • There are no alternative routes for waterborne traffic to reach areas east of the I-5 bridges; in contrast, there are many alternate routes for terrestrial traffic (cars, trucks and trains).
  • Water access to the area east of the I-5 bridges, including PDX airport and the Columbia Business Center marine industrial area in Vancouver, may be needed in the event of a natural or national emergency.
  • Historically, the Columbia Business Center has been a preferred site for shipyard activity (it housed the Kaiser Shipyard in World War II) and may be needed again for this purpose in the future.

The Coast Guard’s conclusion makes it clear that it is strongly committed to maintaining the existing river clearance, that it won’t approve a 116-foot bridge, and that the economic effects of restricting navigation would be unacceptable.  It also pointedly directs the two state DOTs to evaluate either a tunnel or a moveable span to meet its 178-foot requirement:

The Columbia River System is an extremely important interdependent-multimodal supporting national and international commerce critical to local, national and global economies. Reducing the capability and capacity of the Columbia River System would severely restrict navigation. IBR’s proposed bridge as depicted in Public Notice 02-22 with its 35% reduction of VNC from 178 feet to 116 feet is contradictory to the U.S. Coast Guard’s mandate from Congress to maintain freedom of navigation on the navigable waters of the U.S. and to prevent impairment to U.S. navigable waterways. As new structures are built, navigation clearances should be improved or at a minimum maintained. Any proposed new bridge should have a VNC of greater than or equal to that of the existing I-5 twin bridges of 178 feet or preferable, unlimited VNC, as well as a HNC as permitted during the final USACE 408 permit. There are alternative options to accomplish this VNC to include a tunnel or a high-level lift bridge or bascule bridge, which would provide an unlimited vertical clearance. A modern similar successful project is the Woodrow Wilson Bridge over the Potomac River in Washington, DC that was completed in 2009. It is a higher-level double bascule lift bridge on an interstate (I-95) with transit. The added height of the new bridge reduced the number of bascule bridge openings for vessel passage by 76%. (Emphasis added)

The DOT Strategy:  Maximum Risk

Once again, the state DOTs have delayed as long as possible confronting the issue of the navigation clearance.  This time, having learned from its prior experience, the Coast Guard has insisted that the navigation issue be addressed prior to the environmental impact statement.

Still, the DOTs are equivocating:  implying that the Coast Guard decision has no weight, arguing that the legal standard for review involves some kind of balancing against the DOTs’ interest in a cheaper, more convenient low-clearance bridge, and implying that the DOTs, not the Coast Guard, determine the minimum navigation clearance.

The best way to minimize risk is to advance a series of possible alternative solutions through the SEIS process.  At a minimum, these should include a lower-level bridge with a lift span, and some kind of tunnel.  That way, if the Coast Guard sticks to its preliminary determination–a strong possibility, if not a very high probability–the project will still be able to move forward.  The DOTs’ approach, moving forward with only a fixed span, runs the risk that the Coast Guard will hold firm to its announced intention to require a minimum of 178 feet of clearance, meaning that two or three years from now the project will be back to square one with no legally buildable, environmentally reviewed design.  All of the project sponsors’ professed concern with being able to compete for funding will be jeopardized by this reckless decision to look only at a fixed span.

USCG PNCD IBR 17June2022

Oregon and Washington DOTs plan too low a bridge–again.

The Coast Guard has told Oregon and Washington that a new I-5 bridge must have a 178-foot vertical clearance for river navigation–vastly higher than the 116-foot clearance the states have proposed

A fixed span with that clearance would be prohibitively expensive and would have to be huge:  nearly two miles long, with steep grades.

An easier solution would be a new bridge with a moveable span, such as the one built for I-95 in Washington, DC, yet IBR officials falsely claim an I-5 lift span would have to be “the world’s largest.”

Three Portland area bridges have bascule spans of comparable size to that needed for the I-5 bridge, and much larger bascule and vertical lift bridges have been built elsewhere.

Our story so far:  For the past three years the Oregon and Washington Departments of Transportation have been trying to revive their plans for the failed Columbia River Crossing, a massive freeway expansion project between Portland and Vancouver.  The project would require replacing the existing I-5 bridges over the Columbia River, and the height of the new bridges will be determined by the US Coast Guard.  In June, the Coast Guard issued its “Preliminary Navigation Clearance Determination” (PNCD), saying that any new bridge would have to at least preserve the current navigation clearance (178 feet of vertical space).  That immediately threw a wrench into the DOT plans to build a fixed span with just 116 feet of clearance.  The Coast Guard declared unequivocally:

Our PNCD concluded that the current proposed bridge with 116 feet VNC [vertical navigation clearance], as depicted in the NOPN [Navigation Only Public Notice], would create an unreasonable obstruction to navigation for vessels with a VNC greater than 116 feet and in fact would completely obstruct navigation for such vessels for the service life of the bridge which is approximately 100 years or longer.

B.J. Harris, US Coast Guard, to FHWA, June 17, 2022, emphasis added.

The Interstate Bridge Replacement Project is hoping to get the Coast Guard to back down, in part by asserting that it would be impossible to build a lift span to provide the Coast Guard’s requirements.

IBR Administrator Greg Johnson testified to the Joint Oregon-Washington I-5 Bridge Committee that if the Coast Guard required the I-5 bridge to be built with a lift span, it would be the largest such structure in the world.

Here’s a transcript of that meeting, from approximately minute 18 of the audio-video recording created by the Oregon Legislature.  IBR Administrator Greg Johnson describes the impact on the project of going to a lift span:
. . . a cost of putting what would in essence be the largest lift span in the world. We’re talking about an additional $400 million in that way. So, these are the trade offs that we have to look at the viability of a lift span that large which has never been operated at that size before . . .
(emphasis added)

That’s simply not true.  In fact, there are other lift spans in the Portland area that are as large as, or larger than, what would be needed for a lift span on the I-5 bridges, as we document below.

But Administrator Johnson’s claim is both ominous and vague.  How large would the navigation opening in a lift span need to be?   That determination will be made by the US Coast Guard.  The current bridge has a vertical navigation clearance of 178 feet. The 2012 Navigation Impact Report prepared for the Columbia River Crossing documented the existing navigation clearances of the I-5 bridges, which are 178 feet vertically and 263 feet horizontally.  The bridge also has a barge navigation channel with a maximum horizontal clearance of 511 feet, but this barge channel has a vertical clearance of from 46-70 feet.  The Coast Guard’s preliminary navigation report said that a new bridge should at least preserve both of the current horizontal and vertical clearances.

Needed Navigation Clearances:  178 feet high, 263 feet wide

Here’s the text of the Columbia River Crossing’s navigation report, showing existing bridge clearances.  Keeping the existing main shipping channel, shown on the left, would require a vertical clearance of 178 feet and a horizontal clearance of 263 feet.  The Army Corps of Engineers’ authorized navigation channel under the I-5 bridge is 300 feet wide, but the actual horizontal clearance under the bridge is 263 feet.

In addition, the maximum river channel on this stretch of the Columbia River is already constrained by the next downstream bridge, the Burlington Northern “9.6” bridge (less than a mile west of the I-5 bridges).  This railroad bridge has a swing span with an opening width of about 230 feet.  In order to provide a 263-foot-wide channel, the I-5 bridge would need two bascule leaves with a length of 135 feet each.
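The leaf-length arithmetic can be sketched as a quick check.  This is purely illustrative (the function name and margin note are ours, not from the project documents); the 135-foot leaves cited above presumably include structural margin beyond the bare geometric minimum:

```python
# Illustrative geometry check: a double-leaf bascule must have leaves
# long enough that, together, they span the full navigation channel.
# 263 ft is the existing horizontal clearance under the I-5 bridges.
CHANNEL_WIDTH_FT = 263

def min_leaf_length_ft(channel_width_ft: float, leaves: int = 2) -> float:
    """Bare geometric minimum length of each bascule leaf."""
    return channel_width_ft / leaves

# Each leaf must be at least 263 / 2 = 131.5 ft, so the 135-ft leaves
# cited above clear the channel with a few feet to spare.
print(min_leaf_length_ft(CHANNEL_WIDTH_FT))
```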

One does not have to look far in the Portland metro area to find such bridges.  There are three in the center of downtown Portland:  the Morrison, Burnside, and Broadway bridges.  The Morrison Bridge (1958) has two bascule lift sections, with an opening of 284 feet.  The Burnside Bridge (1926) also has two bascule lift sections, with an opening of 252 feet.  The Broadway Bridge (1911-12)–a slightly different kind of bascule–has an opening of 278 feet.

The Burnside Bridge, 1926:  252 foot wide opening.

The Morrison Bridge (1958):  284 foot wide opening

The Broadway Bridge (1911-12):  278 foot wide opening

Nor is the width of the needed roadway an obstacle.  The Morrison Bridge has a roadway width of 90 feet—exactly the same as the width proposed for each of the two aborted Columbia River Crossing bridges.

Woodrow Wilson Bridge:  A modern busy Interstate with a lift-span

But can we have lift spans on Interstate highways?  Actually, the answer is yes.  I-95, one of the nation’s busiest freeways, connecting the major metro areas on the East Coast, has a lift span in Washington, DC.  The Woodrow Wilson Bridge, opened in 2009, is a modern double-leaf bascule bridge that carries 12 travel lanes and 250,000 vehicles per day across the Potomac River.  As for the width of the roadway on the bascule bridge:  the Woodrow Wilson Bridge has two separate sets of “leaves” for the northbound and southbound sections of I-95 (i.e., it’s like two bascule bridges side by side).

Woodrow Wilson Bridge

The Woodrow Wilson Bridge allows for a relatively low level crossing of the Potomac River, minimizing the height and footprint of interchanges on either side of the river (shown above)

Rather than towering over the Vancouver waterfront, and requiring lengthy elevated roadway sections across downtown Vancouver and over Hayden Island, a bascule lift-span bridge could be built at a much lower level, eliminating the need to rebuild intersections high into the air to meet a fixed span high enough to clear 178 feet.

In contrast, a fixed-span high-level bridge violates both the pledge to respect the environment and the pledge to promote equity.  It hurts the environment because the high bridge requires vehicles to climb over a much higher elevation, leading them to consume more fuel and emit more pollutants than they would with a lower-elevation lift-span crossing.  This is especially true for heavy trucks, which will struggle to climb the high bridge’s steep grades and will create a safety hazard for faster-moving cars.  The high bridge is also inequitable for those who are not traveling by car:  those who travel by bike or on foot will find the steep grades associated with the high bridge far more taxing than will motorists, who simply have to press harder on the accelerator pedal.

Not the world’s largest lift bridge

Contrary to what IBR staff imply, there’s nothing unusual about the size of the possible lift span for the I-5 bridge.  Large bascule bridges are not uncommon.  The Rethe Bridge in Hamburg, Germany, built in 2016, has an opening of about 308 feet.  The Erie Avenue Bridge in Lorain, Ohio, built in 1940, has an opening width of 330 feet.  The Market Street Bridge in Chattanooga has an opening that is 358 feet wide.

Rethe Bridge, Hamburg:  308 foot opening

An I-5 lift span meeting the Coast Guard’s requirements would not only not be the largest lift span in the world; it wouldn’t even be the largest lift span in the neighborhood.  That particular honor belongs to the Burlington Northern Willamette River Bridge 5.9, which has a vertical clearance of over 200 feet and a moveable span that is more than 500 feet long–higher than, and almost twice as wide as, the needed opening for a new I-5 moveable span meeting Coast Guard requirements.

The Burlington Northern Willamette River Bridge: 200 feet high, 500 foot opening.

The Sauvie Island Bridge arch is barged under the BN Willamette River railroad bridge

This lift span, paid for by the federal government in 1989, is 516 feet long and provides a vertical clearance of 200 feet.  The lift span was added to the existing bridge in place of a swing span, at a cost of less than $40 million (about $125 million in today’s construction dollars).
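The conversion to present-day dollars can be sketched with the indexes listed in the references at the end of this section; the 1989-to-2003 index ratio used here is an assumed illustrative value, while the 2.19 factor is the FHWA construction cost index value for 2021Q4 (2003 = 1.0):

```python
# Rough inflation adjustment for the 1989 lift-span cost.
# The 1989->2003 ratio is an assumed illustrative value; the 2003->2021Q4
# factor is from the FHWA Highway Construction Cost Index (2003 = 1.0).
cost_1989 = 40.0              # $ millions, as built
ratio_1989_to_2003 = 1.43     # assumed bid-price index growth, 1989-2003
index_2021q4 = 2.19           # FHWA index, 2021Q4 (2003 = 1.0)

cost_today = cost_1989 * ratio_1989_to_2003 * index_2021q4
print(f"~${cost_today:.0f} million in 2021 dollars")   # ~ $125 million
```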

Repeating Past Mistakes:  Planning a Bridge Too Low

A decade ago, the Oregon and Washington Transportation Departments tried to force the Coast Guard to agree to a fixed Columbia River Crossing I-5 bridge with a height of just 95 feet over the river, arguing (exactly as they are now) that this lower level best balances the needs of different forms of transportation.  Balancing needs of road users, though, is not the legal standard applied by the Coast Guard, which following federal law, prioritizes the needs of river navigation.  As the Coast Guard said in its review, road users have many alternate routes for crossing the Columbia River; waterborne commerce has none.

 

The two DOTs’ attempt to force the Coast Guard to agree to a lower bridge height added more than a year of delays to the CRC process (which ultimately failed), as well as millions of dollars in added planning costs.  The IBR team has no plans to seek a bridge permit before 2025, and seems intent on repeating this mistake–moving forward with attempts to convince the Coast Guard to approve a lower navigation clearance, while spending tens of millions of dollars planning a bridge that may not meet the Coast Guard’s legal requirements.

References:

Price indexes for highway construction:
  • Federal Highway Administration (1989-2003).  https://www.fhwa.dot.gov/programadmin/pt2006q1.cfm
  • FHWA, Highway Construction Cost Index (2003 = 1.0); 2021Q4 = 2.19.
    https://explore.dot.gov/views/NHIInflationDashboard/
Coast Guard Preliminary Navigation Clearance Determination
https://www.interstatebridge.org/media/fi2b3xei/ibr_next_steps_bridge_permitting_june2022_remediated.pdf

ODOT’s Reign of Error: Chronic highway cost overruns

Nearly every major project undertaken by the Oregon Department of Transportation has ended up costing at least double its initial estimate

As ODOT proposes a multi-billion dollar series of highway expansions, its estimates pose huge financial risks for the state

ODOT refuses to acknowledge its long record of cost-overruns, and has no management strategy to address this chronic problem

Costs are escalating rapidly for more recent and larger projects, indicating this problem is getting worse

The Oregon Department of Transportation is proposing to move forward with a multi-billion dollar series of highway expansion projects in the Portland metropolitan area, including the $5 billion Interstate Bridge Replacement project, the $1.45 billion Rose Quarter freeway widening project, the likely $1 billion I-205 Abernethy Bridge widening project, and an as-yet-unpriced Boone Bridge project.  Collectively, these projects would be by far the most expensive infrastructure investment in the department’s history.  But the quoted prices for each project are just the tip of a looming financial iceberg.

A quick look at the agency’s history shows that it has invariably grossly underestimated the actual cost of the major projects it has undertaken in the past two decades.  Using data from ODOT’s own records and other public reports, we’ve compiled the initial project cost estimates (those quoted before construction commenced) and compared them with the latest estimates (either the actual final spending for completed projects, or the latest cost estimates for projects that have not yet been finished).  In every case, the ultimate price of a project was at least double the initial cost estimate.

This is important because ODOT is asking for permission to undertake a series of highway expansion projects which, once started, will create a huge financial liability for the state of Oregon.  For three projects (the I-5 Bridge Replacement, the Rose Quarter, and the Abernethy Bridge I-205 widening), ODOT is planning to sell toll-backed bonds to pay for part of the project costs.  But if toll revenues are insufficient to repay the bonds, or if costs escalate beyond current estimates, the state is fully liable for all of these costs and the debt service on the bonds, and those payments will take precedence over all other expenditures from state and federal transportation funds.  The failure to accurately forecast project costs for Portland freeway expansions, coupled with an unavoidable obligation to repay bondholders, means that all other state transportation priorities, including even routine maintenance, would be in jeopardy.

Here is a closer look at seven major ODOT construction projects undertaken in the past twenty years.  Every one has experienced enormous cost overruns.

The Interstate 5 Rose Quarter Freeway project would widen a 1.5 mile stretch of freeway in Portland and was originally represented to the 2017 Oregon Legislature as costing $450 million. The latest estimates from the Oregon Department of Transportation are that the project could cost as much as $1.45 billion.   

The Legislature directed ODOT to prepare a “cost to complete” report for the I-205 Abernethy Bridge project.  The bridge connects Oregon City and West Linn, and would be widened and seismically strengthened.  ODOT’s 2018 report said the bridge would cost $248 million.  When the agency put the project out to bid in 2022, the actual cost came in at $495 million–essentially double ODOT’s estimate.

ODOT estimated the 5 mile long Highway 20 Pioneer Mountain-Eddyville project would cost $110 million when the project completed its environmental reviews in 2003 (Federal Highway Administration and Oregon Department of Transportation. (2003). Pioneer Mountain to Eddyville US 20, Lincoln County, Oregon, Draft Environmental Impact Statement, Executive Summary).  After years of delay, during which a design-build contractor withdrew from the project and ODOT had to demolish bridge structures and redesign significant parts of the project, the total cost was $360 million.

 

The Newberg-Dundee Bypass has been under consideration for almost two decades; a portion of the project was completed five years ago.  The initial estimate of the project’s total cost was $222 million (Oregon Department of Transportation. (2005). Newberg-Dundee Transportation Improvement Project Location (Tier 1) Final Environmental Impact Statement, News Release 06-132-R2).  The latest estimate of the cost of completing that full bypass project is now $752 million (Federal Highway Administration and Oregon Department of Transportation. (2010). Newberg Dundee Bypass, Tier 2 Draft Environmental Impact Statement (FHWA-OR-EIS-10-0-1D). Salem: Oregon Department of Transportation).

In 2002, the Oregon Department of Transportation told the City of Portland that rebuilding the Grand Avenue Viaduct (Highway 99E) in Southeast Portland would cost about $31.2 million (Leeson, Fred, “Council Backs Long Bridge in Viaduct’s Spot,” Portland Oregonian, July 19, 2002).  The project was completed seven years later at a total cost almost three times higher:  $91.8 million (ODOT, ARRA Project Data for ODOT as of 8/31/2010).

When proposed in 1999, the I-5 South Medford Interchange was estimated to cost about $30 million (Rogue Valley Area Commission on Transportation meeting notes, September 13, 2005).  In 2013, after the project was completed, the agency said the cost was $96 million.

 

The original cost estimate for the I-5 Woodburn interchange project was $25 million in 2006 (FHWA & ODOT, Woodburn Interchange Project, Revised Environmental Assessment, November 2006).  The completed price was $68 million.

It’s always possible to make excuses for cost overruns on any single project.  And if cost overruns had happened only once, or maybe twice, it might make sense to dismiss them as aberrations.  But as the record of these seven projects makes abundantly clear, major ODOT highway projects almost invariably end up costing twice as much as the original price quoted at the time the project is approved.  Cost overruns are a systematic and predictable feature of ODOT’s approach to highway building, not an aberrant bug.
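A quick tabulation of the figures quoted above makes the pattern concrete (all values in millions of dollars, taken from the estimates cited in this article):

```python
# Initial vs. latest costs for the seven ODOT projects discussed above
# (all figures in $ millions, from the estimates cited in this article).
projects = {
    "I-5 Rose Quarter":              (450,  1450),
    "I-205 Abernethy Bridge":        (248,   495),
    "Hwy 20 Pioneer Mtn-Eddyville":  (110,   360),
    "Newberg-Dundee Bypass":         (222,   752),
    "Grand Avenue Viaduct":          (31.2,  91.8),
    "I-5 South Medford Interchange": (30,     96),
    "I-5 Woodburn Interchange":      (25,     68),
}

for name, (initial, latest) in projects.items():
    print(f"{name:30s} {latest / initial:.1f}x the initial estimate")
```

Every ratio comes out at roughly 2x or more, which is the "at least double" pattern the article describes.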

No Accountability for Cost Overruns

In an attempt to quell concerns about ODOT’s managerial competence, Governor Kate Brown in 2015 directed the agency to hire an outside auditor to examine its performance.  ODOT did nothing for the first five months of 2016, then said the audit could cost as much as half a million dollars.  Initially, ODOT awarded a $350,000 oversight contract to an insider who, as it turned out, was angling for then-ODOT director Matt Garrett’s job.  After this conflict of interest was exposed, the department rescinded the contract and instead gave a million-dollar contract to McKinsey & Co.  (So, without irony, ODOT had at least a 100 percent cost overrun on the contract to audit its own cost performance.)

McKinsey’s work consisted mostly of interviews with agency-identified “stakeholders” and a superficial analysis of ODOT data.  Its report focused on largely meaningless or trivial indicators, such as “average time needed to process purchase orders.”  One part of the report purportedly addressed the agency’s ability to bring projects in on time and under budget.  McKinsey presented this graphic, showing the variation between initial and final costs for a series of mostly small projects.

There’s a striking omission, revealed in a fine-print footnote:  McKinsey excluded data for the Highway 20 Pioneer Mountain-Eddyville project.  This project, the single most expensive ODOT had undertaken, ended up costing more than three times its original estimate–a fact the McKinsey report failed to report correctly, describing the project only as having “performed 27 percent higher.”

The Oregon Department of Transportation doesn’t accurately forecast the cost of its projects, and refuses to be held accountable for a consistent pattern of errors.  Relying on ODOT’s cost estimates exposes the state to enormous financial risk, something that is likely to be magnified as the department moves ahead relentlessly with plans for billions of dollars of freeway expansion projects in the Portland area.

 

 

How ODOT & WSDOT are hiding real plans for a 10- or 12-lane I-5 Bridge Project

Ignore the false claims that the Oregon and Washington highway departments are making about the number of lanes on their proposed I-5 project:  its footprint will be 164 feet—easily enough for a 10- or 12-lane roadway.

This commentary was originally published at Bike Portland, and is re-published here with permission.

If you followed Tuesday’s Portland City Council work session or have been reading press reports about the Interstate Bridge Replacement project, you’ve probably noticed claims that the size of the project has somehow been reduced to adding “just one auxiliary lane” in each direction to I-5. The implication is that they’re only building enough capacity to expand the existing I-5 bridge from its current six lanes (three in each direction) to eight lanes (three plus a so-called “auxiliary” lane in each direction).

This claim is false.

A close look at the materials prepared by the Oregon and Washington departments of transportation shows they plan to build a new I-5 bridge at least 164 feet wide — easily enough for ten or even twelve traffic lanes. While the glossy materials describing the project prominently talk about “one auxiliary lane” (in each direction), they almost completely omit any description of the actual width of the bridge. The IBR documents show only crude and misleading cartoon-like drawings of the bridge, without any actual measurements. That’s intentional: they don’t really want you to know how wide a structure they’re planning.

But in a cryptic note in their presentation, they do refer to the width: The so-called ten lane bridge (two auxiliary lanes each direction) is said to have the same “footprint” as the 2013 Locally Preferred Alternative (LPA, a step in the federal NEPA review process). For the record, that footprint is 180 feet. For the so-called eight lane bridge (one auxiliary lane in each direction), the footprint is described as “2013 LPA Minus 16 Feet” which works out to 164 feet wide.

The broader context is this: the so-called “bridge replacement” is really a five-mile long, ten or twelve lane wide highway widening project that will cost $5 billion, and potentially a lot more.

ODOT’s actual plans for a 180′ wide CRC obtained by public records request.

This is a repetition of the false claim made for the preceding project — the failed Columbia River Crossing (CRC). In 2010, in response to objections from the City of Portland and Metro, ODOT and WSDOT announced they were reducing the size of the CRC bridge from 12 lanes to 10 lanes. But in reality, all they did was change the references in the project documents to that number of lanes, while literally erasing from the Final Environmental Impact Statement every single reference to the actual widths of the bridges and other structures they intended to build. A public records request showed the actual plans for the bridges — which were not published — were exactly the same size (180 feet in width) as they were for the 12-lane version of the bridge.

 

The limited materials released by the IBR project to date make it clear that they are engaged in exactly the same deception.

With standard-width 12-foot freeway lanes, this 164-foot-wide bridge would accommodate ten traffic lanes (120 feet) with 11-foot shoulders on either side of the travel lanes, or as many as twelve travel lanes (144 feet) with five-foot shoulders on either side. (Alternatively, the 164-foot width would allow construction of 12 travel lanes with 2-foot-wide left shoulders and 8-foot-wide right shoulders, which would be common, if not generous, for an urban bridge.)
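The arithmetic above is easy to check. A minimal sketch, assuming standard 12-foot lanes and twin side-by-side structures (so four shoulders in total, left and right on each structure):

```python
# How a 164-foot footprint divides into lanes and shoulders, assuming
# standard 12-foot lanes and twin side-by-side structures (four
# shoulders in total: left and right on each structure).
FOOTPRINT = 164   # feet, "2013 LPA Minus 16 Feet"
LANE = 12         # feet, standard freeway lane width

for lanes in (10, 12):
    shoulder = (FOOTPRINT - lanes * LANE) / 4
    print(f"{lanes} lanes -> four {shoulder:.0f}-foot shoulders")
# 10 lanes -> four 11-foot shoulders
# 12 lanes -> four 5-foot shoulders
```

The same footprint supports either striping, which is the article's point: the lane count is a paint decision, not a structural one.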

While they’re calling it an eight-lane bridge, it’s really a 10 or 12 lane bridge.

ODOT and WSDOT will no doubt say they’re “only” adding two lanes, and point to the supposed safety benefits of wider shoulders; but nothing prevents them, after building a 164-foot wide bridge, from coming back with a paint truck and re-striping it for ten or twelve lanes. In fact, they’ll claim that they can do that without any further environmental analysis, under a “categorical exclusion” to the National Environmental Policy Act.

This isn’t an aberration or an accident; it’s an intentional strategy to evade environmental review. ODOT and WSDOT did this a decade ago on the failed Columbia River Crossing. ODOT did the same thing with the I-5 Rose Quarter project, again claiming it was merely adding one auxiliary lane in each direction. Meanwhile, its actual plans (which it kept secret and didn’t include in the Environmental Assessment) showed it planned to build the I-5 Rose Quarter project 160 feet wide, easily enough to accommodate 10 lanes of traffic.

The highway builders know — though they refuse to admit — that more lanes induce more traffic and more pollution. That’s why they’re engaging in this highly deceptive process of claiming they’re just adding a single “auxiliary” lane, when in fact, they’re engineering structures that can be repainted in a day to be vastly wider. This subterfuge enables them to claim minimal environmental impacts now, and then with no further review, create exactly the wider roadway they wanted all along.

Ten unanswered questions about the IBR Boondoggle

In the next month or two, regional leaders in Portland are going to be asked to approve the “modified locally preferred alternative” for the I-5 Bridge Replacement (IBR) Project, an intentionally misnamed, $5 billion, 5 mile long, 12-lane wide freeway widening project between Portland and Vancouver, Washington.

There’s a decided rush to judgment, with many of the most basic facts about the project being obscured, concealed, or ignored by the Oregon and Washington Departments of Transportation.  As with the failed Columbia River Crossing, they’re trying to pressure leaders into making a decision with incomplete information.  Here are ten questions that the IBR project has simply failed to answer.  We’ve offered our own insights on the real answers, but before the region’s leaders take another step, they should satisfy themselves that they know the real answers to each of these questions.

1. How much will it cost?

Conspicuously absent from IBR presentations is any clear statement of what the project is likely to cost.  Almost two years ago, the project released a warmed-over version of the cost estimates from the Columbia River Crossing, indicating the project could cost $4.8 billion.  But this estimate is based on an update of old CRC estimates rather than a new, bottom-up cost estimate of the current project.  Already, the IBR team has decided to rebuild the North Portland Harbor bridge, which will add an estimated $200 million to the project.  Moreover, construction inflation has accelerated in recent months; bids for the Abernethy Bridge project in Portland came in almost 40 percent higher than forecast.  Similar cost overruns on the IBR would add more than $2 billion to the price tag.

Real Answer:  The IBR is likely to be a $5-7 billion project

2. Who will pay for it?

Also missing from the IBR presentation is a definitive statement of the sources of funds to pay for the project.  For starters–and just for starters–the project says Oregon and Washington will each be expected to contribute $1 billion.  There’s a considerable amount of vague hand-waving about federal support, but most federal money in the Infrastructure bill is allocated by formula, and comes to the two states whether they build this project or not; and so spending this money on the IBR, rather than fixing the multi-billion dollar backlog of other bridge repairs, comes at a real cost to the states.  What is clear is that a third or more of the IBR’s costs will have to be recouped by charging tolls to bridge users, and that the two states, and no one else, will be on the hook for any cost overruns and any revenue shortfalls.  And cost overruns are hardly conjecture:  The I-5 Rose Quarter Freeway widening project, estimated to cost $450 million five years ago, is now likely to cost as much as $1.45 billion according to ODOT.

Real answer:  Oregon and Washington have unlimited liability for project costs including cost overruns and toll revenue shortfalls.

3.  How high will tolls be?

IBR staff have said next to nothing about what level of tolls will be charged to bridge users.  Studies prepared for the Columbia River Crossing showed that tolls would have to be a minimum of $2.60 for off-peak users and $3.25 for peak travel, plus surcharges for those who don’t buy transponders, which would push peak-period car tolls over $5.00 each way.  Trucks would pay five times as much as cars, with peak-period tolls topping $18.  Knowing what the toll levels will be is essential to understanding the economic impacts of the bridge, as well as to accurately forecasting future traffic levels.  Experience in other states has shown that even a $1 or $2 toll could permanently reduce traffic to half its current levels, eliminating the need to add any capacity to the I-5 crossing.  Before they move ahead with the project, shouldn’t the public and its leaders know how much will be charged in tolls?
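At those toll levels, the annual cost to a daily commuter is easy to sketch (assuming roughly 250 commuting round trips per year, an illustrative figure):

```python
# Back-of-the-envelope annual toll cost for a daily commuter.
# The 250 round trips/year is an assumed commuting figure; the toll is
# the midpoint of the $2-3 each-way range discussed in this article.
toll_each_way = 2.50
round_trips_per_year = 250

annual_cost = toll_each_way * 2 * round_trips_per_year
print(f"~${annual_cost:,.0f} per year")   # ~ $1,250
```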

Real answer:  Tolls will be $2-3 each way, and highest at peak hours, costing regular commuters more than $1,000 per year.

4. Will other bridges and highways be tolled to avoid gridlock?

If just the I-5 bridges are tolled, ODOT’s and WSDOT’s own consultants predict that this will produce gridlock on I-205.  IBR staff have made vague statements claiming to have looked at tolling other roadways at the same time.  But unless parallel routes like I-205 are also tolled, the traffic claims made for the IBR are simply invalid.  If the region is serious about tolling and avoiding gridlock, it needs to adopt a comprehensive tolling strategy before it commits to a multi-billion dollar freeway widening project.

Technical work done for the CRC project, reported on page one of the Oregonian in 2014, indicated that tolling I-5 would produce gridlock on I-205.

Tolling will dramatically affect the traffic levels on I-5 and I-205.  The best evidence is that tolling the region’s freeways would virtually eliminate the need for additional capacity expansion.  ODOT’s own congestion pricing consultants showed that a comprehensive system of road pricing would eliminate most metro area traffic congestion, without the need to spend billions on added capacity.  We know from experience in other cities that tolling after adding capacity simply leads to wasting billions of dollars on roadways that aren’t used because travelers don’t value them.

Real Answer:  Unless we toll the I-205 bridge as well, the I-5 bridge will be under-utilized, and I-205 will have gridlock. The region needs to decide on a toll system before it squanders billions on un-needed highway capacity and goes deeply into debt to repay bonds for capacity that isn’t used.

5. What will it look like?

Despite spending more than two and a half years and tens of millions of dollars on designing the project, the IBR has yet to produce any renderings showing what the project would look like to human beings standing on the ground in Vancouver or on Hayden Island.  The bridge will be 150 feet tall as it crosses the Columbia River, and will have lengthy approach ramps and extensive elevated freeway sections over Vancouver and Hayden Island, with substantial visual and noise impacts.  But you would never know it from the project’s presentations, which, if they show the bridge and freeway expansion at all, show it from an aerial view that could be seen only from flights over Portland International Airport.  The project’s presentation to a joint legislative committee in April contains no illustrations at all of what is to be built.

City Observatory has obtained, via public records request, the 3D models created by IBR to show the size and location of the proposed I-5 Bridge.  The following image shows what the proposed I-5 bridge would look like, compared to the existing bridge.  It would be dramatically taller and wider, and would loom over downtown Vancouver.  It’s relatively easy to produce images showing how the replacement bridge would affect Vancouver.  Why hasn’t the IBR with its extensive budget produced any such images?

Real Answer:  The I-5 replacement bridge and approaches will tower over downtown Vancouver and Hayden Island.

6. How long will the trains take?

A key part of the project is a plan to add light rail service between Portland’s Expo Center and downtown Vancouver.  The IBR project asserts that there will be huge demand for travel on light rail.  But light rail is relatively slow.  Unless light rail is faster than car travel or express buses, it’s unlikely to attract many riders.  Currently, TriMet’s Yellow Line takes 29 minutes to get from the Expo Center to downtown Portland.  The CRC FEIS projected that it would take light rail trains about 6 minutes to get from Mill Plain Boulevard across a new I-5 bridge to the Expo Center; together, this means it will take at least 35 minutes via light rail to reach downtown Portland from Vancouver.  That’s more than 10 minutes longer than it takes current C-Tran express buses, traveling in morning peak-hour traffic, to get from 15th and Broadway in Vancouver to SW 5th and Alder in Portland—a 7:56 AM bus leaving Vancouver reaches downtown Portland at 8:20.  Moreover, with added capacity on I-5 and tolling of I-5, future express buses would travel even faster than they do today, so light rail would likely be at an even greater time disadvantage than it is now.  The information provided by the IBR contains no explanation of how a slower train is going to attract more riders than a faster bus, or why BRT would perform worse than LRT in this corridor.

Real Answer:  The LRT extension to Vancouver will be considerably slower than today’s buses.

7. How can traffic models predict more no-build traffic on a bridge that is already at capacity?

The I-5 bridges reached capacity almost two decades ago and can’t handle additional traffic, but ODOT’s model apparently predicts that traffic will continue to grow across the bridge even though there’s no capacity for it.  This is a classic example of a broken model that, in the words of national modeling expert Norm Marshall, “forecasts the impossible.”  ODOT’s own consultants, CDM Smith, said in 2013 that the I-5 bridge could handle no more peak traffic due to capacity constraints:

Traffic under the existing toll-free operating condition on the I-5 bridge reached nominal capacity several years ago, especially considering the substandard widths of lanes and shoulders on the facility. The I-5 bridge has little or no room for additional growth in most peak periods, and capacity constraints have limited growth over the last decade.

The IBR’s own modelers admitted that traffic growth on I-5 has been limited due to the bridge being at capacity and congested.  Yet they’ve created a fictitious “no build” scenario in which traffic continues to increase, essentially because it has no meaningful feedback loops to adjust travel demand to reflect how humans actually respond in the face of congestion.

Real Answer:  ODOT is using flawed models that overstate no-build traffic and pollution, and conceal the true environmental impact of freeway expansion

8. How wide will the bridges be?

The IBR team describes the I-5 bridges as adding either two or four so-called “auxiliary lanes” to the existing six freeway lanes on I-5 through the project area.  But the project hasn’t revealed how wide the structures are that it’s actually building.  In the project’s last iteration, the Columbia River Crossing, the project claimed to have reduced the size of the bridge from twelve lanes to ten in response to objections to its width from local leaders; but in fact, public records requests showed that it didn’t reduce the physical size of the bridges (or other structures) at all.  The supposed “ten lane” bridge was 180 feet wide, just as the proposed “twelve lane” bridge was.

The cryptic information provided by the IBR says that its so-called 10-lane bridge would be just as wide as the CRC (180 feet), and the so-called 8-lane bridge (“one auxiliary lane”) would be just 16 feet narrower (“2013 LPA Minus 16 Feet”), which works out to 164 feet wide.  With standard-width 12-foot freeway lanes, this 164-foot-wide bridge would accommodate ten traffic lanes (120 feet) with 11-foot shoulders on either side of the travel lanes, or as many as twelve travel lanes (144 feet) with five-foot shoulders on either side.  (Alternatively, the 164-foot width would allow construction of 12 travel lanes with 2-foot-wide left shoulders and 8-foot-wide right shoulders, which would be common, if not generous, for an urban bridge.)

When it comes to bridges or freeway capacity, ignore how many “lanes” ODOT and WSDOT claim they’re building, and look at how wide the structures are.  They’ve repeatedly used this deceptive tactic to intentionally conceal the true width and environmental impact of their projects.

Real Answer:  Regardless of how many lanes IBR claims it’s building, its actual plans provide capacity for more: in this case, a 10- or 12-lane bridge.

9. How many cars will use the bridge?

The primary argument for the IBR is that it is needed to carry a growing number of vehicles crossing the Columbia River.  But completely absent from any of the project’s materials is any specification of the volume of traffic the bridge will carry.  The project makes claims about travel times and traffic delay, but it can’t possibly have come up with those figures without estimating the number of cars that will use the bridge.  It specifically suppressed this information to undercut the public’s ability to understand, question, and criticize the modeling.  And we know that the project’s earlier modeling, done for the Columbia River Crossing, was simply wrong.  It predicted that traffic would grow by 1.7 percent per year on I-5 between 2005 and 2030; in fact, through 2019, traffic grew by only 0.3 percent per year.  This chart shows the average daily traffic on I-5 as predicted by the CRC (blue: no-build; red: build) and actual traffic from ODOT’s own records (black).  We can’t see how IBR’s new modeling compares to these figures, because they’ve simply refused to publish any average daily traffic totals.
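The gap between those two growth rates compounds substantially over time; a quick calculation using the rates quoted above:

```python
# Compounding the CRC's forecast growth rate (1.7%/yr) against the
# observed rate (0.3%/yr) over 2005-2019 shows how far the forecast
# diverged from reality.
years = 2019 - 2005
forecast = 1.017 ** years   # predicted traffic relative to 2005
actual   = 1.003 ** years   # observed traffic relative to 2005

print(f"forecast: +{(forecast - 1) * 100:.0f}%, actual: +{(actual - 1) * 100:.0f}%")
```

Over 14 years, a 1.7 percent annual rate implies roughly a quarter more traffic, while the observed 0.3 percent rate implies only a few percent.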

The models used by IBR systematically over-estimate travel in the no-build scenario and underestimate, if not completely ignore, the additional traffic induced by adding more lanes.  It’s impossible to assess the project’s claims about traffic performance, environmental impacts, or financial viability without transparent and accurate estimates of the number of vehicles that will use the bridge.

Real Answer:  IBR uses flawed models which overstate the need for freeway capacity to justify un-needed and expensive freeway widening.

10. How will a wider freeway reduce carbon emissions?

The IBR materials make the specious claim that the project will result in lower emissions, based on the false premise that decreasing traffic congestion will reduce vehicle idling in traffic, and that the bridge will carry a higher share of transit passengers (something which it cannot explain; see #6 above).  The RMI SHIFT induced travel calculator estimates that adding lanes to the I-5 bridge could increase greenhouse gas emissions by hundreds of thousands of tons per year.
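Elasticity-style tools like RMI's SHIFT calculator work from a simple chain: added lane-miles induce additional driving, and each added vehicle-mile carries an emissions cost. Here is a minimal sketch of that chain; every input below is a purely illustrative assumption, not a project or SHIFT figure:

```python
# Illustrative induced-travel emissions chain, in the spirit of
# elasticity-based tools like RMI's SHIFT calculator.  All inputs
# are assumptions for illustration, not project numbers.
NEW_LANE_MILES = 10                  # e.g., one added lane each way on a 5-mile project
INDUCED_VMT_PER_LANE_MILE = 10_000   # assumed daily induced vehicle-miles per lane-mile
GRAMS_CO2_PER_MILE = 400             # assumed light-duty fleet average

daily_induced_vmt = NEW_LANE_MILES * INDUCED_VMT_PER_LANE_MILE
annual_tons = daily_induced_vmt * 365 * GRAMS_CO2_PER_MILE / 1_000_000  # grams -> metric tons
lifetime_tons = annual_tons * 50     # over an assumed 50-year facility life

print(f"Daily induced VMT: {daily_induced_vmt:,}")
print(f"Added CO2: {annual_tons:,.0f} tons/year; {lifetime_tons:,.0f} tons over 50 years")
```

The totals scale linearly with whichever induced-VMT and fleet-emissions assumptions one plugs in; the point is the structure of the calculation, which the project's own materials never show.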

Real Answer:  Expanded freeway capacity leads to more driving and more greenhouse gas emissions.

What are they hiding? Why highway builders won’t show their $7.5 billion freeway

Oregon and Washington are being asked to spend $7.5 billion on a giant bridge:  Why won’t anyone show pictures of what it would look like?

The Oregon and Washington highway departments are using an old Robert Moses trick to make their oversized bridge appear smaller than it really is.

The bridge will blot out much of the reviving waterfront and downtown in Vancouver, and put Hayden Island in the shadow of a half-mile long viaduct.

The IBR has distributed misleading and inaccurate images of the proposed bridge, attempting to make it look smaller.

The agency is spending $1.5 million to create a “digital twin” computer model of the IBR, but is keeping it secret to avoid public scrutiny of its design.

Computer visualizations, complete with human-scale animations, are cheap and common for construction projects, such as Vancouver’s proposed waterfront public market–but ODOT and WSDOT have steadfastly refused to provide such visualizations for the IBR.

The proposed Interstate Bridge Replacement Project would be the largest and most expensive public works project in the Portland metro area's history.  You'd think that if you were spending $7.5 billion, you'd be proud to show the public and elected officials what it will look like.  But in the case of the IBR, you'd be wrong.  What do the Oregon and Washington highway departments have to hide?

While the IBR project has only released distant aerial photos that make the project look tiny, we obtained a copy of a preliminary version of their 3D computer model, and used it to show how the view from Hayden Island changes with the construction of the new bridge.  (You can use the slider to compare the current bridge, on the left, with the proposed IBR, on the right.)

The striking difference in the height and scale of the two bridge images shown above contrasts sharply with the official image created by IBR from the same digital model.  They use the well-worn trick of showing the bridge, not from anywhere on the ground, or where humans are likely to see it, but from a point suspended in the sky, high above the project.

The very, very short and small Interstate Bridge Replacement.

You’d have to be several thousand feet in the air to get this view of the IBR.  This false perspective makes the bridge look tiny.  It’s simply impossible to compare the height of the bridge, for example, to the height of buildings in downtown Vancouver, or get a sense of how much taller the freeway viaducts across Hayden Island are than any of the other structures on the island.

Blotting out the Vancouver waterfront

The proposed bridge will have a river clearance of at least 116 feet—the Coast Guard is asking for 178 feet—and the structure itself is a double-decker that will be between 35 and 40 feet tall, making the overall structure roughly 150 feet tall over much of the river.  Because of that elevation, the bridge requires half-mile long viaduct approach ramps to get traffic from ground level north and south of the river, up to the high level of the crossing (the lengthy viaducts and elevated intersections are more costly than the bridge itself).  This giant structure will tower over the Vancouver waterfront, which in the past decade has been the site of a remarkable urban redevelopment, with offices, shops, housing, and hotels.

Yet ODOT and WSDOT, whose massive project will completely remake this part of the city, have yet to provide a single illustration showing how the city would be affected.  Again, using the IBR's crude digital model, we were able to produce this image showing how the view along Vancouver's riverfront will change if the IBR is built.

Just as a point of reference for local residents, the proposed IBR river crossing will be the size of three of Portland's I-5 Marquam Bridges side-by-side.  The massive new IBR bridge will tower over the waterfront, with associated noise and pollution.  In addition, the viaducts leading to the bridge will be as high as, and in some cases higher than, adjacent downtown and waterfront buildings.  Seattle just spent more than a decade and $3 billion to remove the Alaskan Way viaduct that blighted the city's waterfront for more than half a century.  Vancouver appears to be signing up to create the same kind of roadway-scarred landscape that Seattle is trying to fix.

Using manipulated drawings to make the new bridge look smaller

The IBR project has purposely avoided providing an elevation, or profile, view of the proposed bridge, in order to keep its height and bulk a secret.  A year ago, however, it did produce a profile drawing, but one that was purposely inaccurate.  In March 2022, as part of a navigation report filed in the US Coast Guard's bridge permitting process, IBR produced an intentionally misleading drawing comparing the existing bridge and the new IBR.  The image was dutifully published by the Vancouver Columbian (March 25, 2022):

Original drawing: IBR project, as published in the Columbian, March 25, 2022. Yellow annotations: Bob Ortblad.

The diagram of the navigation clearance of the new bridge and the old bridge, shown one over the other, uses different vertical scales to make the new bridge appear smaller and shorter than the old bridge (yellow markings and red text added by engineer Bob Ortblad).  The broad yellow band superimposed on the top diagram shows the true height and size of the new bridge.  Notice that the top panel says "not to scale": while the diagrams use the same horizontal scale, they use different vertical scales.  This is intentional distortion.

A $1.5 million “digital twin”

This agency has no need of crude, not-to-scale drawings.  It has detailed plans, and what's more, it is buying a state-of-the-art digital model of the bridge. IBR is spending $1.5 million to build a so-called "digital twin"—a deeply detailed computer model of every aspect of the proposed bridge, which will be used for design, construction, monitoring and maintenance.  What, you might reasonably ask, is a digital twin?  It's not a mere computer model; it's really much more complicated (and expensive) than that.  IBR explains:

A digital twin, as envisioned in this project, is a portal (a 3D model of the bridge and other associated visual dashboards), through which authoritative data and information about the bridge and related road network can be accessed efficiently and quickly by authorized users along its entire lifecycle—from early project planning to real time operations. It is expected to not only serve as a digital record of the physical structure but also as a process twin whereby future “what if” scenarios related to design decisions, constructability, construction or maintenance activities, emergency operations, etc. can be simulated to a very high degree of precision

The contractor IBR hired to build the “digital twin,” WSP, touts its modeling as being an example of the “metaverse,” essentially a digital alternate reality:

Also today, we can use IoT and artificial intelligence [AI] to add data to visualizations and make decisions across departments; we can put ourselves inside of the virtual model of a city, for example, and interact with it—a process now called the metaverse; we can better relate the design to the context of the world around us. We can test and validate elements of infrastructure—bridges, roadways, transit, and buildings—before construction; we can create dynamic models that simulate and predict how these assets will perform in real-life contexts. Three-dimensional reality models provide the basis for visualizing, collaboratively managing, and monitoring changes to infrastructure during the project and when the asset is in operation. (Emphasis added)

WSP has been working on the digital twin of the IBR for nearly three years, since at least June 2020, according to company documents.  WSP's software vendor, Bentley, also flogs the IBR digital twin work in its promotional material, saying modeling tasks that formerly took months can now be done instantly. In theory, the "digital twin" ought to be a way for the public to see exactly what the project will look like, from any angle.  It is fully possible with such a model to create realistic, on-the-ground images and "walk-throughs" of the project that convey exactly what it will look like.  But that's just a theory, because IBR has explicitly chosen not to create or share such images or visualizations with the public.

The digital twin is a secret

But IBR is doing its best to keep the “digital twin” and the images it would show of the IBR project a secret.  At a March 16, 2023 meeting of a construction industry group in Seattle, IBR’s consultant, WSP, admitted that they were being told to keep the project under wraps so as not to provoke public outcry about the design:

Last night at Construction Management Association of America NW Chapter meeting Kevin Gilson, Director of Design Visualization at WSP USA, presented 3D/4D modeling. When he was asked about a 3D model for the Interstate Bridge Replacement (IBR), he said “Yes, but it isn’t public yet. There is a model on the website. It’s being produced by the communications team. There is a very detail 3D model. I was going to try and show it, but I am not working on that project. It’s very, very, it’s kept under wraps quite a bit, and I think it’s because of their experience with the first round, trying to tread carefully.”

Personal communication from Bob Ortblad, who attended this meeting, March 17, 2023 (Emphasis added).

City Observatory has filed a public records request to obtain a copy of the model.  IBR officials have declined to provide that model until no earlier than the end of April, 2023.

Concealing images of the proposed giant bridges is a calculated PR strategy

The Interstate Bridge Project has contracted for nearly $10 million in public relations and communications consultants, and they’ve kept a tight lid on project images. A little over a year ago, the IBR project showed its first sketchy images of what the IBR project might look like.  At the time, one of their public relations consultants, Millicent Williams, described the project’s desire to control the dissemination and interpretation of the images:

Thank you, Commissioner Berkman.  I will share that the communications team has discussed,  first of all the fact that once images like this get shared, there will be the opportunity for people to develop a narrative and we’re working to manage that— making direct contact with media outlets to ensure that they have the accurate information and are clear about what these images represent.  Additionally we have asked that disclaimers drafts watermarks all the things that could make sure that folks know that this is not final.  This is concept.

We recognize that there is the possibility that someone might have taken a screenshot of what we just shared and so hopefully, um, we can manage that messaging as well,  and i’m sure that the team is prepared to do that but um that those are things that we’ve thought long and hard about because we want to make sure that we are not stymieing the process or the progress based on our failures to fully disclose where we are in the process and what these images represent.

IBR Executive Steering Group, January 20, 2022 (at 39:54)

Much smaller projects have sophisticated visualizations

Computer graphic simulations of new buildings and construction projects are commonplace—often as a sales and promotion tool.  Developers want to let potential investors and local governments know what a project will look like before it gets built.  That’s exactly what’s happening on the Vancouver waterfront—just not for the IBR project.  The Port of Vancouver is building a new public market building—its take on Seattle’s Pike Place Market—at Terminal One (site of the now demolished Red Lion Hotel).  The port’s architects prepared a detailed model of the public market building and the surrounding area, complete with a video “fly through” showing what the area will look like when the project is complete in a couple of years.  You can even see the current I-5 bridge in their computer video.

The Vancouver waterfront in a computer rendering showing a forthcoming public market–but not the proposed $7.5 billion IBR. (Youtube: Click image to view video)

The total cost of the public market is on the order of $10 million—roughly the same as what the IBR project has spent on public relations in the past few years.  Yet even though the IBR will cost about 750 times as much, and has such a copious budget for communication, it has not produced a comparable computer rendering, much less a human-level fly-through of the project.

Ironically, the public market modeling doesn’t include the new Interstate Bridge, which will slice through and tower over the Vancouver waterfront.  It will likely go right over the top of this particularly bucolic native garden:

Computer rendering of Vancouver’s forthcoming Public Market site: “Native plantings throughout bolster the project’s connection to the environment. . . The site’s design was guided by LEED-ND requirements.”  No mention that a 180-foot wide concrete freeway will tower ten stories above the garden.

The Columbia River Crossing images were hidden as well.

Hiding the images of what they are planning to build has been going on for more than a decade.  In 2010, Columbian reporter Erik Robinson wrote a front page story for the Vancouver paper, noting that the project had done virtually nothing to show the visual impact of the giant new bridge on downtown Vancouver and its waterfront.  He wrote:

Stand for a minute along Columbia Street near the railroad berm in downtown Vancouver.

Now look up.

A massive steel and concrete structure that today exists only in technical engineering schematics will materialize high above Vancouver’s riverfront within the decade if the proposed Columbia River Crossing sticks to its current schedule. The Interstate 5 bridge will deliver thousands of cars, heavy trucks and light rail trains into the city at roughly the height of an eight-story building.

Washington-based bridge architect Kevin Peterson is appalled.

“It looks like a big damn freeway crossing a railroad staging yard,” he said.

Vancouver Mayor Tim Leavitt acknowledged the new bridge will cast a long shadow.  . . . 

“It’ll be a monumental structure,” he said.

Yet, you’d scarcely know it by the tenor of public discussion.

Robinson, Erik, “Casting a long shadow.” Columbian; Vancouver, Wash. 01 Aug 2010: A.1.

City officials asked for a ground-level eye-view of the project, but were told it would be too expensive, and no renderings were produced. CRC official Carley Francis (now Southwest Washington WSDOT regional administrator) told Robinson that there was no guarantee they'd produce a street-level simulation of the project before construction began in 2012, chiefly because the project—which at the time had spent about $134 million on planning—had "limited resources" to produce such a rendering.

Jonathan Maus, writing at Bike Portland (in 2013), reported much the same when he tried to find realistic and detailed images of the multi-billion dollar bridge project, as Oregon and Washington were being asked to fund the project:

You’d think that with all the support for the Columbia River Crossing down in Salem, lawmakers and their constituents would have a good idea about what their votes — and their tax dollars — will be going toward. But for some reason, CRC and ODOT staff have hidden the project from public view. Despite spending nearly $170 million on consultants and planning thus far, detailed renderings and/or visualizations of key elements of the project are nowhere to be found.

This is not typical of other large infrastructure projects across the country and it begs the question of whether or not CRC and ODOT staff are purposefully pulling the wool over our eyes. (emphasis added)

Our own work at City Observatory shows that the two state DOTs have been going out of their way to conceal what they're planning to build, and to avoid showing how it will affect downtown Vancouver. For example, it took a public records request to learn that after promising to reduce the width of the CRC highway bridge from 12 lanes to 10 lanes, all the two DOTs did was erase all references to the actual physical width of the bridge from the project's environmental impact documents, while leaving in place plans to build a 180-foot-wide highway bridge–enough for 12, or even 14, travel lanes.

At one point, the Oregon and Washington highway departments actually built a physical 3D model of the bridge.  Photographs of the model were published by a local television station.  But there are no photographs or other evidence of this physical model on the IBR or CRC websites.

A physical model of the Columbia River Crossing (now disappeared). This is the original shorter 95′ vertical clearance version of the bridge, not the final 116′ clearance. (KGW)

The Long History of Using Misleading Images to Sell Urban Highways

Using this kind of illusion, and creatively misrepresenting the visual impact of new construction, has a long history in the world of selling highways. Robert Moses famously skewed the illustrations of his proposed Brooklyn Battery Bridge (which would have obliterated much of lower Manhattan and Battery Park); we turn the microphone over to Moses' biographer Robert Caro, from The Power Broker:

Moses' announcement had been accompanied by an "artist's rendering" of the bridge that created the impression that the mammoth span would have about as much impact on the lower Manhattan Landscape as an extra lamppost. This impression had been created by "rendering" the bridge from directly overhead—way overhead—as it might be seen by a high flying and myopic pigeon. From this bird's eye view, the bridge and its approaches, their height minimized and only their flat roadways really visible, blended inconspicuously into the landscape. But in asking for Board of Estimate approval, Moses had to submit to the board the actual plans for the bridge. . . .

The proposed bridge anchorage in Battery Park, barely visible on Moses’ rendering, would be a solid mass of stone and concrete equal in size to a ten-story office building. The approach ramp linking the bridge to the West Side Highway, a ramp depicted on the rendering as a narrow path through Battery Park, would actually be a road wider than Fifth Avenue, a road supported on immense concrete piers, and it would cross the entire park—the entire lower tip of Manhattan Island—and curve around the west side of the island almost to Rector Street at heights ranging up to a hundred feet in the air. Not only would anchorage and piers obliterate a considerable portion of Battery Park, they—and the approach road—would block off much of the light not only from what was left of the park but also from the lower floors of every large office building they passed; because the approach ramp was really an elevated highway that would dominate the entire tip of Manhattan, it would depress real estate values throughout the entire area.

Sprawl and Tax Evasion: Driving forces behind freeway widening

Sprawl and tax evasion are the real forces fueling the demand for wider freeways

Highway widening advocates offer up a kind of manifest destiny storyline: population and traffic are ever-increasing, and unless we accommodate them we'll be awash in cars, traffic and gridlock.  The rising tide of cars is treated as an irresistible force of nature.  But is it?  Look more closely and it's apparent that rising traffic levels aren't inevitable; they're the product of other forces.  And far from solving traffic problems, widening roads makes these problems worse.

In the case of Portland’s proposed $5 billion 5-mile long freeway widening project—the mis-named Interstate Bridge Replacement project—the real forces behind the project aren’t pre-destined levels of car traffic, but instead, are much more prosaic, and questionable:  sprawl and tax evasion.

Sprawl:  Cause and consequence of wider roads

While Oregon has some of the tightest land use controls in the nation, Washington State is still far more accommodating to rural and exurban residential development.  As many critics of the I-5 bridge project have noted, precious few commuters from Washington State to jobs in Oregon use transit, despite the fact that there are good express bus services from Vancouver to Oregon job centers.  (Prior to the pandemic, express buses carried only about 3,000 people per weekday between Oregon and Washington, compared to more than 250,000 vehicles per day crossing the river). A key reason for this auto-dominated travel pattern is that housing growth in Clark County has been driven by exurban sprawl, and workers commuting from these locations travel overwhelmingly by car.  Here's a map prepared by Seattle's Sightline Institute showing the comparative patterns of population growth in the Oregon and Washington portions of the metropolitan area between 1990 and 2000.  While Oregon has had little population growth outside its urban growth boundary–a testament to the policy's effectiveness–Washington has experienced a rash of exurban development.

Sightline Institute

This exurban sprawl is both the source of demands for expanded highway capacity on I-5 and elsewhere, and in turn, widening roads simply encourage more such sprawl—a pattern that is repeated in metropolitan areas across the country.  The technical analysis done for the proposed Columbia River Crossing (predecessor of the IBR) estimated that 93 percent of the growth in peak hour trips on I-5 between 2005 and 2030 would result from additional population growth in the suburban fringe of Clark County (i.e. even more purple dots).

Tax evasion fuels traffic growth

While sprawl is one contributor to traffic growth, a second is tax evasion.  Here’s the short story:  Oregon has no retail sales tax; Washington charges its residents one of the nation’s highest rates (over 8 percent).  As a result, Washington residents regularly drive across the Columbia River on one of two Interstate Bridges to shop tax-free in Oregon.  They spend over $1.5 billion per year in Oregon, and effectively evade more than $120 million in sales taxes by doing so.  The average Clark County family of four evades about $1,000 of sales tax each year.

But all this sales tax evasion produces a lot of traffic on the two bridges that cross state lines:  We estimate that between 10 and 20 percent of all the trips crossing the I-5 and I-205 Columbia River bridges are Southwest Washington households driving to shopping centers in Oregon to evade Washington sales tax.  Conveniently, there are major shopping centers at Jantzen Beach and Hayden Meadows (on I-5) and on Airport Way (I-205), all just across the Columbia River in Oregon.   The parking lots of these retail centers are chock-a-block with Washington vehicles.
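The tax-evasion arithmetic above is easy to check. A minimal sketch, assuming a round 500,000 figure for Clark County's population (an assumption for the per-capita step, not a cited figure):

```python
# Rough check of the sales-tax arithmetic: roughly $1.5 billion/year of
# Clark County spending in sales-tax-free Oregon, at Washington's ~8% rate.
# The 500,000 county population is an assumed round figure for illustration.
OREGON_SPENDING = 1_500_000_000   # dollars per year
WA_SALES_TAX = 0.08
CLARK_COUNTY_POP = 500_000        # assumption

tax_avoided = OREGON_SPENDING * WA_SALES_TAX
per_person = tax_avoided / CLARK_COUNTY_POP
family_of_four = per_person * 4

print(f"Sales tax avoided: ${tax_avoided:,.0f}/year")
print(f"Per family of four: ${family_of_four:,.0f}/year")
```

The result is about $120 million a year in avoided tax, and on the order of $1,000 a year for a family of four, consistent with the figures quoted above.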

Jantzen Beach Home Depot parking lot (City Observatory)

Far from being inexorable and inevitable forces of nature, the factors driving the growth of traffic between Portland and Vancouver are actually symptomatic of dysfunctional and environmentally destructive trends.  Rather than accommodating them, and encouraging more sprawl and tax evasion, we should be making choices that are consistent with our stated values.

A Universal Basic Income . . . for Cars

California is the first in the nation to establish a Universal Basic Income . . . for cars

One of the most widely discussed alternatives for tackling poverty and inequality head-on is the idea of a “Universal Basic Income”—a payment made to every household to assure it has enough for basic living expenses.  While there have been a few experiments and a lot of political hyperbole, it hasn’t really been tried at scale.  But now, California is on the verge of enacting a Universal Basic Income, but instead of being for people, it’s for cars.

It’s a symptom of our deep car dependence that, faced with somewhat higher gas prices (still lower, in inflation-adjusted terms, than a decade ago), politicians are falling all over themselves to insulate cars and driving from their real costs.  It speaks volumes that we’re so quick to allocate resources to cars and so reluctant to bring similar energy to tackling poverty.

High gas prices are a potent political issue for car-dependent Americans, and that’s prompted elected officials to scramble to come up with ways to ease the pain.  California Governor Gavin Newsom has proposed giving California car-owners a $400 debit card for each car they own, at a total cost of an estimated $9 billion.  It’s effectively a universal basic income (UBI), but for cars.

In an ironic parallel, the City of Oakland is reporting the results of its own recent experiment with a kind of UBI for transportation.  Oakland gave 500 households $300 debit cards that they could spend on a range of transportation services, like bus travel, bikes, scooters and ride-hailed trips.  It then surveyed participants to see how their travel patterns changed.  Overall, about 40 percent of participating households reported reducing their single-occupancy car trips.  The idea of a flexible transportation allowance is a great way to directly address the equity concerns of our transportation system, especially as we begin using road pricing as a way to make the transportation system function more efficiently.  But it’s striking that while a universal basic mobility allowance merits only a tiny and tentative $150,000 experiment, a universal car allowance worth nearly $10 billion is likely to move forward with little, if any, consideration of its social and environmental effects.

Other states have taken a different approach to reducing transport costs, with a similar car bias.  New York Governor Kathy Hochul is proposing a gas tax holiday (which may or may not save motorists money, depending on whether oil companies pass along the savings to customers).  Of course, the cost of maintaining the state’s roads will just be shifted to others, so the savings are mostly an illusion.

There’s a good argument that Newsom’s debit cards directly undermine the state’s climate goals, especially by handing out money based on the number of cars a household owns. Both the California and New York plans give fiscal relief to car owners.   You have to own a car to get a California debit card, and somewhat perversely, households with two cars (who tend to have higher incomes) get twice as much relief as families with a single car.   But the incentive effects of a gas tax cut are even worse than California’s debit card approach:  people save in proportion to how much gas they buy.  Those who don’t drive much, drive fuel-efficient vehicles, or don’t own or drive cars at all will get no relief.  The big winners will be those who own fuel-inefficient vehicles and drive a lot.  At least with the California debit card approach, families don’t have to buy more gasoline to get more relief.  They can spend the $400 on anything else they like, including, for example, a bus pass or part of the purchase price of a new bike.

Gas tax holidays and California’s universal basic income policy for cars are emblematic of the fundamental inequity of our current transportation policy.  Measures, like a universal basic mobility allowance, which would help those most in need and incentivize more sustainable transportation are subject to protracted experimentation at trivial scale.  Meanwhile, rising gas prices prompt sweeping and ill-considered policies that will send most benefits to those who drive the most, and which will further incentivize more driving and environmental destruction.

Flying blind: Why public leaders need an investment grade analysis

Portland and Oregon leaders shouldn’t commit to a $5 billion project without an investment grade analysis (IGA) of toll revenues

Not preparing an IGA exposes the state to huge financial risk: It will have to make up toll revenue shortfalls.

The difference between an IGA and ODOT forecasts is huge:  half the traffic, double the toll rate.

There’s no reason to delay preparing the investment grade analysis:  The federal government and financial markets require it, and all of the needed information is available

If you don’t prepare an IGA before making a commitment to this project, you are flying blind


Portland area leaders are being asked to greenlight the so-called Interstate Bridge Replacement Project, which is projected by its proponents to cost as much as $5 billion.  But they’re being asked to give the project a go-ahead with only the sketchiest financial information.  The project’s cost estimates are slightly warmed-over versions of decade-old estimates prepared for the failed Columbia River Crossing.  Ominously, the details of where the money will come from—who will pay and how much—are superficial and vague.

One thing project advocates grudgingly admit is that the I-5 bridge replacement can’t be financed without tolls.  Program administrator Greg Johnson and Oregon Transportation Commission Chair Bob Van Brocklin have repeatedly said as much.  But how much money tolls will produce and how high tolls will be are never clearly mentioned.  Johnson has said tolls will provide “about a third of project costs.”

Knowing how much money tolls will produce, and how high tolls will have to be to produce that revenue is the central financial question.

Currently the I-5 bridge carries about 130,000 vehicles per day.  But that volume is predicated on the bridge being free.  Charging people to use the bridge would dramatically reduce the number of crossings.  As we’ve documented at City Observatory, when tolls were added to a similar crossing, the I-65 bridges across the Ohio River in Louisville, traffic levels fell by half.

Because tolling depresses traffic, you can’t accurately estimate how much toll revenue a bridge will produce without a detailed model that accounts for this traffic depressing effect.

The models routinely used by state highway departments don’t accurately account for the effect of tolling on traffic volumes.  They tend to dramatically over-predict the amount of traffic on tolled roadways, which has led to over-built facilities that don’t generate enough toll-paying traffic to cover their costs.

Financial markets and the federal government, which are asked to lend money up-front (with a promise of repayment from future tolls), simply refuse to believe state highway department traffic forecasts.  Instead, they insist that states pay for an “investment grade” traffic and revenue forecast.  You can’t sell toll-backed bonds on private financial markets, and you can’t even apply for federal TIFIA loans, without first getting an investment grade forecast.  In January, Portland’s Metro Council adopted a statement of Values, Outcomes and Actions governing the I-5 project, directing the Oregon Department of Transportation to prepare an Investment Grade Analysis of the project:

As the part of the finance plan, engage professionals with expertise in financing massive complex transportation infrastructure construction projects to conduct and deliver the results of an investment-grade traffic and revenue study of the design options.

That’s a critical step to making an informed decision.

What is an investment grade analysis?

Investment grade forecasts are generally prepared by one of a handful of financial consulting firms.  These studies start with the traffic models used by state highway departments, but make much more realistic assumptions about future population and employment growth, the likelihood of economic cycles, and critically, the effect of tolling on levels of traffic.  As a result, investment grade analyses invariably predict lower levels of traffic than the models used by state highway departments.  Because traffic levels are lower, tolls have to be higher to produce any given amount of revenue.

And the differences between investment grade analysis and highway department forecasts are not trivial:  they are huge.  The Oregon and Washington highway departments prepared traffic and toll estimates for the Columbia River Crossing’s Final Environmental Impact Statement published in 2011.  Those estimates were that the I-5 bridges would carry 178,000 vehicles per day in 2030, and that minimum tolls would be $1.34 to pay for about one-third of the cost of the project.  The Investment Grade Analysis for this project, prepared by CDM Smith on behalf of the two agencies in 2013, estimated that in 2030 the I-5 bridges would carry just 95,000 vehicles per day, and that tolls would have to be a minimum of $2.60 each way in order to cover a third of project costs.  In short, the initial highway department estimates overstated future traffic levels by nearly double, and understated needed tolls by nearly half.
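Those figures can be checked with a few lines of arithmetic, using only the numbers quoted above:

```python
# The two 2030 forecasts quoted above: highway-department FEIS vs. the
# CDM Smith investment grade analysis (IGA).
feis_volume, feis_toll = 178_000, 1.34  # vehicles/day, minimum toll ($)
iga_volume, iga_toll = 95_000, 2.60

print(f"Traffic overstated by {feis_volume / iga_volume:.2f}x")          # 1.87x
print(f"FEIS toll was {feis_toll / iga_toll:.0%} of the IGA minimum")    # 52%

# The revenue target (about a third of project cost) is roughly fixed, so
# daily revenue is similar in both cases -- lower traffic forces higher tolls:
print(f"FEIS daily revenue: ${feis_volume * feis_toll:,.0f}")  # $238,520
print(f"IGA daily revenue:  ${iga_volume * iga_toll:,.0f}")    # $247,000
```

The near-identical revenue figures illustrate the core tradeoff: with half the toll-paying traffic, tolls must nearly double to hit the same revenue target.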

The starkly different figures in the investment grade analysis called into question the size of the project, which was predicated on the exaggerated highway department forecasts.  If a tolled bridge would carry dramatically fewer vehicles than the existing bridge, there was no justification for building an expensive wider structure and approaches.  The money spent expanding capacity on the bridge would be wasted because fewer vehicles would use it.  The dramatically different traffic figures also meant that the environmental analysis contained in the FEIS was simply wrong.

Investment Grade Analyses are required for financial prudence

The reason the federal government and financial markets insist on the preparation of an investment grade analysis is so that they don’t get stuck holding the bag when traffic levels and toll revenues fall short of the excessively optimistic expectations of state highway departments.  Around the country, dozens of toll roads and bridges have failed to produce expected revenues, leading to delinquencies, defaults, and bankruptcies.

If anything, state lawmakers have an even larger financial stake in the IBR project than do financial markets or the federal government.  Financial markets, for example, will insist on additional state guarantees beyond repayment from the stream of toll revenues alone.  They’ll require states to pledge other revenues to repay bonds, in addition to insisting on the investment grade analysis.  The 2021 Oregon Legislature passed HB 3055, which authorizes ODOT to pledge state gas tax revenues and future federal grant monies to repay holders of state-issued toll bonds.

Because the state is ultimately liable for any toll-revenue shortfalls, it has an even higher stake than private lenders or the federal government in knowing the true level of future toll revenues, as would be disclosed in an investment grade analysis.

Why ODOT doesn’t want the public to see the IGA first

ODOT and WSDOT are strongly resisting calls to prepare an investment grade analysis.  Their current project schedule doesn’t call for conducting the analysis until 2024 or 2025–well after the design of the bridge is settled and too late to consider a smaller or cheaper alternative.  The highway departments variously claim that it’s “too expensive” or “premature” to carry out the IGA.

There’s no technical reason it can’t be prepared now.  The base transportation data have been gathered, and the regional model exists.  The agencies say the IGA is expensive, but it’s far less costly than what they have already spent on public relations, and the money has to be spent anyhow.  The IGA will remain valid for several years, and can easily be updated once it is complete, if that becomes necessary.  You can’t save any money by delaying.

The only real reason to put off preparing an IGA is that it will show that the IBR will carry vastly less traffic than the DOTs predict, and that tolls will have to be much higher than they’re implying.  In short, the DOTs don’t want the IGA because it will present a definitive case against the over-sized project that they’re building.  Financial markets and the federal government will insist on the IGA before they make their decisions:  the only ones being denied access to this vital financial information are the local leaders and state lawmakers who will have to pay for the project.  According to DOT plans, they’ll find out the results of the IGA only after it’s too late to do any good.

Their plan is clearly to convince local and state leaders to irrevocably commit to the construction of a much larger project than could possibly be justified if anyone saw the results of the investment grade analysis.  It’s obvious from the project’s unwillingness to advance anything other than a single alternative (a 164-foot wide bridge, enough for ten or twelve lanes of traffic) into the next environmental analysis that they don’t want the results of an investment grade analysis to undercut their contrived case for a massive structure.

State Highway Department Forecasts are Flawed

As we’ve written before, the IBR project is a scene-for-scene remake of the Columbia River Crossing debacle.  Just as they are doing now, the state highway departments published grossly inflated traffic forecasts.  In 2010, the Oregon State Treasurer hired Rob Bain, an internationally recognized expert on toll revenue financing and author of “Toll Road Traffic and Revenue Forecasts: An Interpreter’s Guide,” to assist in the financial analysis of the CRC.  Bain reviewed the CRC traffic and revenue forecasts prepared for the project’s environmental impact statement, and found numerous flaws and biases–which prompted calls for the investment grade analysis that produced dramatically different results than the highway department projections.  He stated:

  • The traffic and revenue (T&R) reports fall short when compared with typical ‘investment grade’ traffic studies. As they stand they are not suitable for an audience focussed on detailed financial or credit analysis.
  • The traffic modelling activities described in the reports are confusing and much of the work now appears to be dated. Although a number of the technical approaches described appear to be reasonable, many of the modelling-related activities seem to ‘look backwards’; justifying model inputs and outputs produced some years ago. There is a clear need for a new, updated, forward-looking, comprehensive, ‘investment grade’ traffic and revenue study.
  • No mention is made in the reports of historical traffic patterns in the area or volumes using the bridges. This is a strange omission. Traffic forecasts need to be placed in the context of what has happened in the past. If there is a disconnect (between the past and the future) – as appears to be the case here – a commentary should be provided which takes the reader from the past, through any transition period, to the future. No such commentary is provided in the material reviewed to date.
  • Traffic volumes using the I-5 Bridge have flattened-off over the last 15-20 years; well before the current recessionary period. . . . the flattening-off is a long-term traffic trend; not simply a manifestation of recent circumstances. The CAGR for the period 1999 – 2006 reduces to 0.6%

An investment grade analysis is the bare minimum that’s needed to make a responsible and informed decision about a multi-billion dollar project.  The only reason not to ask these questions now, and to get clear answers, is because the two state DOTs know that the financial risks will prompt legislators and the public to seriously question this massive boondoggle.

A note on nomenclature:  Level 1, Level 2, Level 3

Highway departments frequently label traffic forecasts as being one of three levels, ranging from a rough sketch level (Level 1), to a somewhat more detailed Level 2, and up to the financial gold standard, Level 3, an investment grade analysis.  As noted, neither the federal government nor private bond markets will make loans based on Level 1 or Level 2 studies:  they are inadequate to accurately forecast traffic for making financial decisions.  This chart from Penn State University describes the general differences between these three levels of analysis:

Editor’s Note:  Nomenclature section added August 4, 2022

Which metros are vulnerable to gas price hikes?

Green cities will be less hurt by higher gas prices; sprawling cities are much more vulnerable to gas price hikes.

In sprawling metros like Atlanta, Dallas, Orlando, Nashville and Oklahoma City, higher gas prices will cost the average household twice as much as households living in compact metros like San Francisco, Boston, Portland and Seattle.

Rising gas prices are a pain, but they hurt most if you live in a sprawling metro where you have to drive long distances to work, shopping, schools and social activities.  Some US metros are far less vulnerable to the negative effects of rising gas prices because they have dense neighborhoods, compact urban development, good transit, and bikeable, walkable streets.  Among the 50 largest metro areas, the best performers enable their residents to drive less than half as much as the most car-dependent metros.  Those who live in metro areas where you have to drive, on average, 50 miles or more per day (places like Oklahoma City, Nashville and Jacksonville) will be hit twice as hard by higher fuel prices as the typical household in a place like San Francisco, Boston or Portland, where people drive, on average, fewer than 25 miles per day.  When gas prices go up, it’s easy being green:  these compact, less car-dependent metros and their residents will experience far less economic dislocation than metros where long daily car trips are built into urban form.

Gasoline prices have shot up in recent days, thanks to the Russian invasion of Ukraine.  A year ago, average gas prices nationally were under $3 a gallon.  In February, they averaged around $3.30 per gallon.  After the Russian invasion began, oil prices and gas prices jumped.  On March 14, the national average was $4.30, and rising rapidly, with much higher prices in some markets.

There’s the usual barrage of media hand-wringing about the impact of high gas prices, notwithstanding the widespread support for backing Ukraine, even if it means higher oil prices.  Some 71 percent of Americans favored banning Russian oil imports even at the cost of higher gas prices.  As high as they seem, gas prices today are just now approaching the levels recorded in 2008, when gas prices peaked at $5.09 per gallon (in 2022 dollars).

In our largely car-dependent nation, higher gas prices feel painful, but some Americans feel the pain far more deeply than others, and some feel it not at all.  There’s been more than a little bike advocate schadenfreude on Twitter, pointing out that those who travel by bike or on foot aren’t feeling the pain of higher gas prices.

But this isn’t just about individual choices and behavior:  whole communities can be more or less vulnerable to gas price shocks, depending on how much land use patterns effectively necessitate driving.

Some metro areas are vastly more car-dependent than others, and as a result, are more vulnerable to gas price hikes.  We can get a good idea of which metros will be most affected by price hikes by looking at data on average travel distances in different cities.  The big data firm Streetlight Data published its estimates of the amount of daily driving per person for large US metros.  We’ve tabulated their publicly released data for the period just before the advent of the Coronavirus pandemic, to get a reasonable baseline for comparing travel patterns.

On average, the residents of the typical large metro area in the US drive about 30 miles per person per day (a somewhat higher estimate than the one provided by the US Department of Transportation).  But there are extremely wide variations in average driving among metro areas.  In general, older, denser metros with more extensive transit systems have dramatically less driving per person than newer, sprawling Sunbelt metros with weak transit.

The metros least likely to feel the pain of higher gas prices include Buffalo, San Francisco, Boston, New York, Portland and Seattle, where metro residents drive about 25 percent less than average.

On the other hand, the metros most vulnerable to higher gas prices are those where, due to job and population sprawl, people tend to drive much further.  These highly vulnerable metros include Oklahoma City, Orlando, Nashville, Dallas, Charlotte and Atlanta, where the typical resident drives 50 or more miles per day, according to the Streetlight estimates, nearly double the large-metro average.

Average Miles Driven Per Person Per Day Prior to Covid-19 Pandemic (Streetlight Data)

 

As we’ve pointed out before, residents of more compact metro areas, with better transit and closer destinations earn the equivalent of a huge green dividend, even when gas is cheap, because they spend far less on cars and gasoline.  Meanwhile, their counterparts in decentralized metros pay a “sprawl tax.”  When oil prices rise, the pain falls disproportionately on those who live in metros where they have to drive a lot.

The differences are significant.  The households living in metros where people drive 50 miles per person per day are conservatively buying twice as much fuel as those living in metros where people drive only 25 miles per day.  So while a family in a compact metro area would be buying say 100 gallons or so of fuel a month, its counterpart in a sprawling metro would be buying 200 gallons.  So a $1 increase in the price of gas would hit about $1,200 harder over the course of a year in a sprawling metro than in a compact one.
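That back-of-envelope calculation is easy to verify, using the article’s own illustrative 100- and 200-gallon figures:

```python
# The article's illustration: monthly fuel purchases in a compact vs. a
# sprawling metro, and the annual impact of a $1/gallon price increase.
compact_gal, sprawl_gal = 100, 200  # gallons per household per month
price_hike = 1.00                   # dollars per gallon

extra_per_month = (sprawl_gal - compact_gal) * price_hike  # $100
extra_per_year = extra_per_month * 12                      # $1,200
print(f"Extra annual cost in the sprawling metro: ${extra_per_year:,.0f}")
```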

In the face of rising fuel prices—whether from a war, or from the long overdue need to reflect the true social and environmental costs associated with fossil fuels—communities where people don’t have to drive as much, or drive as far, have a real economic advantage over more car-dependent places.  That’s a consideration that ought to play a larger role in local, state and national policies going forward.


A reporter’s guide to congestion cost studies

Reporters:  read this before you write a “cost of congestion” story.

Congestion cost studies are a classic example of pseudo-science:  Big data and bad assumptions produce meaningless results

Using this absurd methodology, you can show:

Waiting at traffic signals costs us $8 billion a year—ignoring what it would cost in time and money to have roads with no traffic lights.

Our lack of flying cars costs us hundreds of billions of dollars of travel time—never mind that putting everyone in a flying car would be financially and physically impossible.

Something is actually a “cost” only if there’s a cheaper and physically possible alternative

There’s a robust literature debunking the congestion cost studies from Texas Transportation Institute, Inrix, and Tom-Tom.

Every year or so, one or more traffic-counting organizations trots out a report claiming that congestion is costing us tens of billions of dollars each year.  Despite the “big data” and elaborate estimates, the results are simply bunk, because they’re based on a flawed premise.  Each of these reports calculates the “cost” of congestion as how much longer a trip takes at peak hours compared to off-peak hours, but never specifies what actions or policies could eliminate that difference, or how much they would cost.

Traffic Lights Cost Billions

You can apply this idea of computing a “cost” to any kind of waiting.  We’ve done it, tongue-in-cheek, but calculator in hand, for cappuccino.  Others take this notion seriously.  For example, crack statisticians at the University of Maryland have sifted through reams, nay gigabytes, of big data, and have produced a comprehensive, nationwide estimate of the amount of time lost when we sit, waiting for red lights to turn green.

According to these University of Maryland estimates, time lost sitting at traffic signals amounts to 329 million vehicle hours of delay, and costs us $8.6 billion dollars per year.  They estimate that time spent waiting at traffic signals is roughly three-fifths as great as the 561 million vehicle hours of delay associated with routine “recurring” traffic congestion.

This University of Maryland study calculates that roughly 19 percent of all traffic congestion is due to waiting at traffic signals.  Those traffic lights do get in your way and slow you down.
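A quick back-of-envelope check, using only the figures quoted above, shows the study’s internal arithmetic, including the implied dollar value placed on an hour of travel time:

```python
# University of Maryland figures as reported above.
signal_hours = 329e6     # vehicle-hours/year of delay at traffic signals
signal_cost = 8.6e9      # dollars/year
recurring_hours = 561e6  # vehicle-hours/year of "recurring" congestion delay

# Implied value of travel time embedded in the cost estimate:
print(f"Implied value of time: ${signal_cost / signal_hours:.2f}/vehicle-hour")  # ~$26.14
# "roughly three-fifths as great" as recurring congestion:
print(f"Signal delay vs. recurring delay: {signal_hours / recurring_hours:.2f}")  # ~0.59
```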

Traffic signals cause delays as vehicles queue at intersections. In 10 states, traffic signals are the top cause of traffic congestion, though congestion levels overall remain relatively low in those states. For example, even though Alaska ranked highest in the country in percentage of delay caused by signals at 53%, it ranked 42nd in terms of total hours of delay caused by signals.

As an accounting exercise, there’s little reason to doubt these calculations. But whether they constitute a “loss” is highly doubtful, because there’s no question that we’d all collectively lose more time in travel if there were no traffic lights.  The policy implication of this finding is not that we should be tearing out or turning off traffic signals.  That would be absurd, of course.  The claim that time spent waiting at traffic lights constitutes an actual “loss” rests on the assumption that there’s some other, traffic-light-free way of managing the flow of traffic at intersections that would involve less total travel time for those now waiting.  Simply getting rid of traffic lights—say, replacing them with stop signs—would likely decrease the throughput of many intersections and actually increase delays (though it might beneficially reduce traffic speeds and improve safety for vulnerable road users). In theory, one might replace every single traffic light in the US with a fully grade-separated interchange with no stops.

Let’s suppose, for a moment, that you could instantly replace all of the 330,000 or so traffic signals in the US with grade-separated interchanges that eliminated traffic signals.  That might eliminate all the time “lost” by vehicles waiting at traffic lights, but it would come at a cost.  At say, $10 million per intersection (which is probably a conservative estimate) that would cost about $3.3 trillion, all that to save maybe $8.6 billion per year.  Time spent waiting at traffic lights is costly, only if you ignore the vastly greater cost of doing anything to try to reduce it.
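The arithmetic of that thought experiment is straightforward (the $10 million per interchange figure is, as noted, a deliberately conservative assumption):

```python
# Replace every US traffic signal with a grade-separated interchange:
signals = 330_000            # approximate count of US traffic signals
cost_per_interchange = 10e6  # dollars; a conservative assumption
annual_savings = 8.6e9       # dollars/year "lost" at signals, per the UMD study

total_cost = signals * cost_per_interchange
print(f"Total cost: ${total_cost / 1e12:.1f} trillion")                   # $3.3 trillion
print(f"Simple payback period: {total_cost / annual_savings:.0f} years")  # ~384 years
```

Even ignoring maintenance, land costs, and induced traffic, the “savings” would take nearly four centuries to repay the construction bill.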

It’s easy to point out that the theory about the “time loss” due to traffic lights is pretty silly.  But what’s true of the elaborate (but fundamentally wrong-headed) estimates of the time “lost” to traffic signals is that it also holds for all the other estimates of supposed congestion costs.  For years, a range of highly numerate charlatans have been purporting to compute the value of time lost to traffic congestion. The congestion cost studies generated by the Texas Transportation Institute, Inrix, Tom-Tom and others invariably conclude that traffic congestion costs us billions of dollars a year.  Their copious data creates the illusion of statistical precision without providing any actually useful knowledge.  They generate heat, but don’t shed any light: The congestion cost estimates are part of the propaganda effort of the road-builders, who assert we need to spend even more billions to widen roads to recoup these losses.

It’s an example of a measurement that’s literally true, but quite meaningless.  It’s true in the sense that people probably do spend millions of hours, collectively, sitting at traffic lights or traveling more slowly because of congestion.  It’s meaningless, because there’s no real-world alternative in which you could build enough road capacity to eliminate these delays.  So, as an elaborate accounting exercise, you can use big data and computing power to produce this estimate, but the result is a factoid that conveys no useful, actionable information—just as we’ve shown with our Cappuccino Congestion Index, which totes up the billions of dollars Americans “lose” waiting in line at coffee shops.

Where are my flying cars?  Think of all the congestion costs they’ll save!

The sky’s the limit if you want to generate large estimates of the supposed time “lost” due to slower than imaginable travel.  Consider for example flying cars, which according to this year’s Consumer Electronics Show (CES), are just about to darken our skies.  One company, ASKA, is showing a four-seat prototype that can whisk you and a friend at speeds of up to 150 miles per hour, land in the space of a helipad, and park in an area no larger than a conventional parking space.

An Aska-A5 flying car ($789,000) at CES. (CNET)

Unsurprisingly, the flying car advocates are pitching it as a solution to traffic congestion (move over Elon Musk):

. . . who doesn’t want to hop over the traffic? The Aska A5 can fly at a maximum speed of 150 mph and travel 250 miles on a single charge. That could cut a 100-mile car trip down to just 30 minutes. Aska’s Kaplinsky sees the A5 flying car tackling long commutes, allowing them to move to more affordable communities further away from big cities and reduce the number of regular cars they own, he said, adding that most people would probably use them when needed through a ride-sharing service

If you could travel by flying car to all your destinations, it would shave hours a day off your total travel time.  Imagine all the time we could save if everybody had a flying car, and what those savings would be worth.  With a spreadsheet and some travel data, you could work out an estimate of how many million hours might be saved and how many tens or hundreds of billions of dollars that saved travel time would be worth.  You could produce a report arguing that the personal flying car shortage costs us in lost time and money.  It would be a large but meaningless number, because there’s no world where it’s financially feasible, much less physically possible, for everyone to take every trip by flying car.  The price per flying car is a cool $789,000 (plus operating costs), and there aren’t enough heliports or heliport-adjacent landing spots to accommodate everyone; not to mention that there’s no air traffic control system for thousands of such vehicles moving over cities.  The only way to make meaning of such numbers is in the context of plausible, real-world alternatives.  And that’s exactly what these cost of congestion studies almost invariably fail to consider.  Something is only a “cost” if there’s an actual, practical alternative that would save the lost time without incurring even greater monetary costs.  Imaginary savings from an impossible, or impossibly expensive, alternative aren’t savings at all.

It’s tempting to believe that more data will make the answers to our vexing problems, like traffic congestion, clearer.  But the reverse is often true:  an avalanche of big data obscures a fundamental truth.  That’s what’s going on here.

More Background on Congestion Cost Reports

City Observatory has written extensively on the flaws of past congestion cost studies.  Here are some of our commentaries:

Want to know more?

Really want to wonk out on all the methodological, conceptual and data flaws in these congestion cost reports?  Here are two key resources:  First, our own Measuring Urban Transportation Performance report:

Cortright_Measuring_Urban_Transportation_Performance_2010

And second, Todd Litman of Victoria Transportation Policy Institute’s critique of the urban mobility report.

 

More Congestion Pseudo Science

A new study calculates that twenty percent of all time “lost” in travel is due to traffic lights

Finally, proof for the Lachner Theorem:  Traffic signals are a major cause of traffic delay

Another classic example of pseudo-science:  Big data and bad assumptions produce meaningless results

When I was in graduate school, I shared a house in Berkeley with five roommates.  Once a week we’d pool our food dollars, pile into Archie Lachner’s ’67 Falcon, and drive across town to Lucky, Safeway or the Co-Op for a group shopping expedition.  This was in the late 70s, just after Berkeley had installed a series of traffic diverters to stop cut-through driving in residential neighborhoods.

Our driver, Archie, repeatedly chose routes that were blocked by one diverter and then another.  He cursed at the inconvenience:  “These traffic diverters, they get in your way, they slow you down.”  That prompted a heated debate about the merits of diverters.  Archie defended the inherent right of drivers to go wherever they wanted.  Others in the car said they could see how people who lived on these streets might appreciate the diverters cutting down on, or at least slowing, traffic.

Archie had to turn around at least twice to avoid diverters, and as we finally got near the grocery store, we came to a stop at a red traffic signal.  From the back seat, someone said:  “These traffic lights, they get in your way, they slow you down.”  Offended, Archie spun the wheel and drove home–“if you can’t respect the driver, you won’t get a ride.”  Despite the protests, Archie drove the couple of miles back home, and the five other roommates had to repeat the trip in another car.

 

Traffic signals cause 20 percent of all time lost to congestion!

Thus was born the Lachner theory of traffic congestion:  Traffic lights get in your way and slow you down.  For decades the theory has been wanting for actual quantification, but at last, we have it.  Crack statisticians at the University of Maryland have sifted through reams, nay gigabytes, of big data, and have produced a comprehensive, nationwide estimate of the amount of time lost when we sit, waiting for red lights to turn green.

According to these University of Maryland estimates, time lost sitting at traffic signals amounts to 329 million vehicle hours of delay, and costs us $8.6 billion dollars per year.  Time spent waiting at traffic signals is roughly three-fifths as great as the 561 million vehicle hours of delay associated with routine “recurring” traffic congestion.

This new study from the University of Maryland finally vindicates the Lachner theorem.  By their reckoning, roughly 19 percent of all traffic congestion is due to waiting at traffic signals.  Those traffic lights do get in your way and slow you down.

Traffic signals cause delays as vehicles queue at intersections. In 10 states, traffic signals are the top cause of traffic congestion, though congestion levels overall remain relatively low in those states. For example, even though Alaska ranked highest in the country in percentage of delay caused by signals at 53%, it ranked 42nd in terms of total hours of delay caused by signals.

As an accounting exercise, there’s little reason to doubt these calculations. But whether they constitute a “loss” is highly doubtful, because there’s no question that we’d all collectively lose more time in travel if there were no traffic lights.  The policy implication of this finding is not that we should be tearing out or turning off traffic signals.  That would be absurd, of course.  The claim that time spent waiting at traffic lights constitutes an actual “loss” rests on the assumption that there’s some other, traffic-light-free way of managing the flow of traffic at intersections that would involve less total travel time for those now waiting.  Simply getting rid of traffic lights—say, replacing them with stop signs—would likely decrease the throughput of many intersections and actually increase delays (though it might beneficially reduce traffic speeds and improve safety for vulnerable road users). In theory, one might replace every single traffic light in the US with a fully grade-separated interchange with no stops.

Let’s suppose, for a moment, that you could instantly replace all of the 330,000 or so traffic signals in the US with grade-separated interchanges that eliminated traffic signals.  That might eliminate all the time “lost” by vehicles waiting at traffic lights, but it would come at a cost.  At say, $10 million per intersection (which is probably a conservative estimate) that would cost about $3.3 trillion, all that to save maybe $8.6 billion per year.  Time spent waiting at traffic lights is costly, only if you ignore the vastly greater cost of doing anything to try to reduce it.

It’s easy to point out that the Lachner Theorem about the “time loss” due to traffic lights is pretty silly.  But what’s true of the elaborate (but fundamentally wrong-headed) estimates of the time “lost” to traffic signals is that it also holds for all the other estimates of supposed congestion costs.  For years, a range of highly numerate charlatans have been purporting to compute the value of time lost to traffic congestion. The congestion cost studies generated by the Texas Transportation Institute, Inrix, Tom-Tom and others invariably conclude that traffic congestion costs us billions of dollars a year.  Their copious data creates the illusion of statistical precision without providing any actually useful knowledge.  They generate heat, but don’t shed any light: The congestion cost estimates are part of the propaganda effort of the road-builders, who assert we need to spend even more billions to widen roads to recoup these losses.

It’s an example of a measurement that’s literally true, but quite meaningless.  It’s true in the sense that people probably do spend millions of hours, collectively, sitting at traffic lights or traveling more slowly because of congestion.  It’s meaningless, because there’s no real-world alternative in which you could build enough road capacity to eliminate these delays.  So, as an elaborate accounting exercise, you can use big data and computing power to produce this estimate, but the result is a factoid that conveys no useful, actionable information—just as we’ve shown with our Cappuccino Congestion Index, which totes up the billions of dollars Americans “lose” waiting in line at coffee shops.

The sky’s the limit if you want to generate large estimates of the supposed time “lost” to slower-than-imaginable travel.  Consider, for example, flying cars or helicopters.  If you could travel by helicopter to all your destinations, it would shave hours a day off your total travel time.  With a spreadsheet and some travel data, you could work out an estimate of how many million hours might be saved and how many billions of dollars that saved travel time would be worth.  You could produce a report arguing that the personal helicopter shortage costs us in lost time and money.  It would be a large but meaningless number, because there’s no world where it’s financially feasible, much less physically possible, for everyone to take every trip by helicopter.

The only way to make meaning of such numbers is in the context of plausible, real-world alternatives.  And that’s exactly what these cost of congestion studies almost invariably fail to consider.  Something is only a “cost” if there’s an actual, practical alternative that would save the lost time without incurring even greater monetary costs.  Imaginary savings from an impossible, or impossibly expensive, alternative aren’t savings at all.  All of the evidence about induced travel shows that expanding capacity to try to reduce time “lost” to congestion is ultimately futile:  more capacity encourages more travel, induces more sprawl, and does nothing to reduce congestion and delay.

It’s a welcome sign that one recent report acknowledged this fundamental fact.  To their credit, at least TomTom acknowledges that adding capacity is futile, or even counterproductive:

Developing road infrastructures and increasing the capacity isn’t the solution. “When a new road is built, it is only a matter of time before more vehicles are added to the road, offsetting this initial easing: it’s called the traffic demand dilemma”, Ralf-Peter Schäfer said. Change behaviours and traffic patterns can make a significant difference. Congestion is non-linear: once traffic goes beyond a certain threshold, congestion increases exponentially. Discouraging drivers to drive during peak rush hour can lead to big improvements, as proven during the pandemic.

And the purveyors of congestion cost estimates almost never point to the only solution that’s been proven to reduce congestion:  road pricing.  Even a modest system of time-based user fees could dramatically reduce congestion.

It’s tempting to believe that more data will make the answers to our vexing problems, like traffic congestion, clearer.  But the reverse is often true:  an avalanche of big data obscures a fundamental truth.  That’s what’s going on here.

 

Freeway widening for whomst?

Widening freeways is no way to promote equity.  The proposed $5 billion widening of I-5 between Portland and Vancouver is purportedly being undertaken with “an equity lens,” but widening Portland’s I-5 freeway serves higher income, predominantly white workers commuting from Washington suburbs to jobs in Oregon.

The median income of peak hour, drive alone commuters to Oregon from Clark County is $106,000; significantly higher than for the region as a whole (about $78,000).  

More than 53 percent of peak hour drive alone commuters are from households with incomes over $100,000; fewer than 15 percent of these peak hour car commuters have incomes under $50,000 annually.

Some 86 percent of peak hour, drive-alone commuters are non-Hispanic whites, according to the 2019 American Community Survey; only 14 percent of these peak hour car commuters are persons of color.  Peak hour drivers are half as likely to be people of color (14 percent) as are residents of the region (28 percent).

Clark County is less diverse than the rest of the Portland metro area; its residents of color are vastly more likely to work at jobs in Clark County than to commute to jobs in Oregon.

The proposal to spend $5 billion to widen a 5-mile stretch of I-5 between Portland and Vancouver is being marketed with a generous dose of equity washing.  While it is branded the “Interstate Bridge Replacement” or IBR, replacing the bridge is less than a quarter of the total cost; most of the expense involves plans to double the width of the freeway to handle more peak hour traffic. The project has gone to some lengths to portray suburban Clark County’s population as increasingly diverse, to create the illusion that the freeway widening project is primarily about helping low and moderate income households and people of color travel through the region.  A quick look at Census data shows these equity claims are simply false.  Peak hour freeway travelers commuting from homes in Washington to jobs in Oregon are overwhelmingly wealthy and white compared to the region’s average resident.

 

Equity? A proposed super-sized $5 billion freeway would mostly serve peak hour commuters with incomes over $100,000, 86 percent of whom are non-Hispanic whites.

What this project would do is widen from 6 lanes, to as many as 14 lanes, five miles of Interstate 5 between Portland and Vancouver.  The principal reason for the project is a claim that traffic volumes on I-5 cause the road to be congested.  But congestion is primarily a peak hour problem, and is caused by a large and largely uni-directional flow of daily commuter traffic.  About 60,000 Clark County residents work at jobs in Oregon, and they commute across either the I-5 or I-205 bridges.  Fewer than a third that many Oregonians work in Clark County, with the result being that the principal traffic tie-ups coincide with workers driving from Clark County in the morning, and back to Clark County in the evening.  Plainly, this is a project justified largely by the goal of providing additional capacity for these commuters.  That being the case, who are they?

Census data show that the beneficiaries of the IBR project would overwhelmingly be whiter and higher income than the residents of the Portland metro area.  As with most suburbs in the United States, Clark County’s residents, who are those most likely to use the IBR project, are statistically whiter and wealthier than the residents of the rest of the metropolitan area.  In addition, the most regular users of the I-5 and I-205 bridges are much more likely to be white and higher income than the average Clark County resident.  This is especially true of peak hour work commuting from Clark County Washington to jobs in Oregon, which is disproportionately composed of higher income, non-Hispanic white residents.

Peak hour, drive-alone commuters are overwhelmingly white and wealthy

Data from the American Community Survey enable us to identify the demographic characteristics of peak hour, drive-alone commuters going from Clark County Washington to jobs in Oregon on a daily basis. Here are the demographics of the nearly 20,000 workers who drive themselves from Clark County to jobs in Oregon, and who leave their homes between 6:30 AM and 8:30 AM daily.  Some 53 percent of peak hour, drive-alone commuters from Clark County to Oregon jobs lived in households with annual incomes of more than $100,000.  The median income of these peak hour drivers was $106,000 in 2019, well above the averages for Clark County and the region.

Fully 86 percent of the peak hour, drive-alone commuters from Clark County to Oregon jobs were non-Hispanic whites.  Only about 14 percent of these peak hour drivers were persons of color.  The racial/ethnic composition of these peak hour car commuters is far less diverse than that of Clark County, or the region.  Clark County workers who work in Clark County are about 50 percent more likely to be people of color than those who commute to jobs in Oregon.

Clark County is whiter and wealthier than the region and Portland

Suburban Clark County, Washington is whiter and wealthier than the rest of the Portland metropolitan area, and the City of Portland. Clark County may be more racially and ethnically diverse than it once was, but so is the entire nation.  And it’s still disproportionately whiter and wealthier than the rest of the region.  Only about 23 percent of its residents are people of color, compared to about 38 percent for the region as a whole, and about 30 percent for Portland, according to the 2019 American Community Survey. Clark County’s median household income of $80,500 is higher than for the region ($78,400) and for the City of Portland ($76,200).

Few low income and workers of color commute to Oregon from Clark County

Not only is Clark County less diverse than the rest of the Portland region, only a small fraction of its low income workers and workers of color commute to jobs in Oregon at the peak hour.  More than ten times as many low income workers and workers of color who live in Clark County work at jobs in Clark County than commute to jobs in Oregon.  About 38,000 Clark County workers in households with incomes of $50,000 or less work at jobs in Clark County; only about 2,800 are peak hour, drive-alone commuters to jobs in Oregon.  About 31,000 Clark County workers of color work at jobs in Clark County.  If we’re concerned about addressing the transportation needs of low income workers and workers of color in Clark County, we should probably focus our attention on the vast majority of them who are working at jobs in the county, not the comparatively small number commuting to Oregon.

Middle and upper income households are far more likely to commute to jobs in Oregon

In general, for Clark County residents, the higher your income, the more likely you are to commute to a job in Oregon.  Only about 1 in 5 workers in households with incomes less than $40,000 in Clark County commute to jobs in Oregon.  About 30 percent of workers in middle and upper income families in Clark County commute to Oregon jobs, meaning that these higher income households are about 50 percent more likely to commute to jobs in Oregon than lower income households.
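The “50 percent more likely” figure follows directly from the two commute shares reported above; a quick sketch, taking those shares as given:

```python
# Shares of Clark County workers commuting to Oregon jobs, in percent,
# as reported above (taken as given, not independently verified here).
low_income_commute_share = 20     # households under $40,000: about 1 in 5
higher_income_commute_share = 30  # middle and upper income households

relative_increase = higher_income_commute_share / low_income_commute_share - 1
print(relative_increase)  # 0.5, i.e. 50 percent more likely
```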

 

Data notes

Data for this post are from the 2019 American Community Survey, via the indispensable University of Minnesota IPUMS project:

Steven Ruggles, Sarah Flood, Ronald Goeken, Josiah Grover, Erin Meyer, Jose Pacas and Matthew Sobek. IPUMS USA: Version 10.0 [dataset]. Minneapolis, MN: IPUMS, 2021. https://doi.org/10.18128/D010.V10.0.

Biased statistics: Woke-washing the I-5 Boondoggle

The Oregon and Washington transportation departments are using a biased, unscientific survey to market their $5 billion I-5 freeway widening project.

The survey over-represents daily bridge users by a factor of 10 compared to the general population.

The IBR survey undercounts lower and middle income households and people of color, and overstates the opinions of white non-Hispanics, higher income households, and Clark County residents.

As we’ve noted, highway builders are increasingly engaging in woke-washing, claiming—after decades of experience in which freeway projects have devastated communities of color and destroyed city neighborhoods across the country—that wider freeways will somehow be a good thing for low income people and people of color.

The latest example of this comes from the sales campaign to promote the $5 billion I-5 freeway widening between Portland and Vancouver Washington, misleadingly branded as the “Interstate Bridge Replacement” (IBR) project.  The reality is pretty simple:  the primary beneficiaries of a wider roadway would be higher income, overwhelmingly white commuters who drive daily from suburbs in Washington State to jobs in Oregon.  As we documented last month, the peak hour drive-alone car commuters who cross the I-5 and I-205 bridges from Washington State to jobs in Oregon are whiter and wealthier than the region’s population, with median incomes of $106,000, and 86 percent non-Hispanic whites.

But the IBR project has carefully constructed an alternate reality in which this car-centric freeway widening project is really something that benefits low income people and people of color.  The project’s promotional materials—which actually don’t show the project, or acknowledge its price tag, or the fact that it will charge tolls to bridge users—prominently feature stock images of people of color.

Here’s what we mean by “woke-washing.”  The project’s home page featured this image . . .

See our commitment to equity: We bought this stock photo! (Source: Interstate Bridge Replacement Project, March 7, 2022).

. . . which is a stock photograph used by hundreds of websites, mostly those focusing on women’s health.  (Just an aside: A true health-oriented and equity focused project wouldn’t build a 12-lane wide, 5 mile long freeway guaranteed to increase air pollution and with a long history of destroying neighborhoods.)

In addition to its woke imagery, the IBR project supplements this messaging with a pseudo-scientific web-based survey which purports to show that the project is really for lower income people of color.

Selling a $5 billion freeway widening with a woke-washed fable

The IBR staff have developed a fictional “just so” story of how the freeway widening project is needed to help low income households and people of color, who’ve moved to Clark County for cheaper housing, but have to travel to jobs and other opportunities in Oregon.  This story is grounded, not in actual scientific data, but in the project’s own unscientific and biased web-based survey.

Here is IBR staff person Jake Warr, making this false claim to the January 20, 2022 Executive Steering Committee meeting:

One thing that really came out through this survey that I want to highlight is when we . . . asked how often people drive across the bridge, we found a higher percentage of folks who identified with a race or ethnicity besides white or or in addition to white/Caucasian, the non-white respondents really reported more frequently traveling across the bridge.

So that 53 percent‑that’s listed there, 53 percent‑of our of our BIPOC survey respondents reported traveling across the bridge either daily or a few times a week. That’s compared to closer to 40 percent for the white respondents.

IBR’s unscientific and biased web-based survey.

So just something that that really kind of drives home a point that we’ve suspected. It provides further data that you, we’ve seen a trend in our region of folks of color being pushed to further areas of the region, being pushed north of the river, or seeking out more affordable housing north of the Columbia River, but still relying on services jobs etc, in Multnomah County.

And so, there’s that piece that I think this speaks to. We also suspect that related to Covid, as people were answering this question in the context this pandemic, there might be some explanation there, as we know that BIPOC individuals tend to be, disproportionately rely needing to work still in a location and not be able to work from home.

That might have contributed to this but just something that we really found was was a poignant data piece to point out.

The trouble is, this claim is easily disproved by referring to valid survey data from the Census Bureau which shows that commuters across the I-5 and I-205 bridges are actually disproportionately white, and higher income.  Low income workers, and those of color, are dramatically under-represented among bridge commuters.

A biased, unscientific survey from the IBR

The trouble with web-based surveys is that they suffer from self-selection bias.  Only highly motivated people take such surveys, and the opinions, experience and demographics of these people differ substantially, and systematically, from the general population.  As a result, it’s simply invalid to make statistical claims (such as that people of color are more likely to use the bridge frequently).  That’s especially true when there’s valid scientific data from the American Community Survey, which shows exactly the opposite: peak hour users (for whom the bridge is being expanded) are 86 percent non-Hispanic white and have median incomes of $106,000.

To see just how biased the unscientific IBR web-survey is, we can compare it to other surveys conducted with more valid methodologies.  The correct way to do surveys is with a random selection methodology; the IBR actually commissioned such a survey in 2020.  In its random survey of more than 900 Portland area voters, 13 percent of respondents reported never crossing the I-5 bridge over the Columbia, compared to just 1 percent in the unscientific online survey.  The random survey of voters showed only 5 percent of respondents crossed the I-5 bridge every day, compared to 19 percent in the unscientific online survey.  As a result, the unscientific online survey implies the ratio of daily users to non-users is 19 to 1 (there are 19 times as many daily users as never-users), while the random survey shows that there are two and a half times as many non-users as daily users of the I-5 bridge.  That means that the unscientific survey overweights the role—and opinions—of daily users relative to non-users by more than an order of magnitude relative to their share of the overall population of the Portland metropolitan area.
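The order-of-magnitude claim is just arithmetic on the two surveys’ reported shares; a quick sketch, with the percentages taken from the surveys as reported above:

```python
# Percent of respondents reporting each crossing frequency.
random_daily, random_never = 5, 13   # random-sample survey (~900 voters)
web_daily, web_never = 19, 1         # self-selected online survey

web_ratio = web_daily / web_never            # 19 daily users per never-user
random_ratio = random_never / random_daily   # 2.6 never-users per daily user

# How much the web survey overweights daily users relative to the
# random sample's daily-to-never mix: about 49-fold.
overweight = web_ratio / (random_daily / random_never)
print(web_ratio, round(random_ratio, 1), round(overweight))
```

A roughly 49-fold overweighting of daily users: well over an order of magnitude.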

Source: IBR Community Opinion Survey, 2020

Demographic bias in the IBR unscientific web survey

A quick look at the American Community Survey, which is conducted annually by the Census Bureau, shows that the demographics of the IBR’s unscientific web-based survey are dramatically different from the metro area.

One essential for surveys is that participants should be randomly selected.  If they’re not randomly selected, there’s little guarantee that the results will be representative of the larger population. One of the sure tells of a non-random survey is that the characteristics of survey participants don’t match up well with the characteristics of the overall population of the area being surveyed.  That’s the case here.  The IBR survey systematically over-represents some groups, and systematically underrepresents others, which should cast doubt on the validity of its results.  The survey systematically over-represents white, non-Hispanic people, higher income households, and residents of Washington State, and systematically under-represents people of color, low and moderate income households, and Oregon residents.  Here are the details.

Income:  Higher incomes over-represented.  The respondents to the unscientific web-survey are much higher income than the overall population.  Some 44 percent of survey respondents had household incomes over $100,000; only 38 percent of the region’s households had incomes that high.

Race and Ethnicity:  People of color under-represented.   The respondents to the unscientific web-based survey are much more likely to be non-Hispanic white than the overall population; some 85 percent of survey respondents were non-Hispanic white compared to 72 percent of the region’s population.  People of color were 28 percent of the region’s population, but only 15 percent of survey respondents.  People of color were undercounted by almost half in this unscientific survey.

Residence:  Clark County over-represented.  The respondents to the unscientific web-based survey are disproportionately residents of Clark County.  Clark County accounts for less than 20 percent of the region’s population (488,000 of 2.5 million residents), but 43 percent of those taking the survey.  Clark County residents’ views are thus given more than double the weight of other residents’ views in this unscientific survey.

Age:  Young people significantly under-represented. There’s also a strong generational bias:  only 5 percent of survey respondents are under 25, compared to nearly 30 percent of the population.  And these people will be the ones who have to live with the environmental consequences of the project.

No doubt the highway agencies will point with pride to the large number of completed surveys–more than 9,000 to date.  But large numbers are irrelevant if you don’t have a random sample.  For a metropolitan area the size of Portland, you need only about 400 to 800 survey participants to come up with statistically valid results,  if you have a random sample.  If you don’t have a random sample, then even very large numbers (and IBR surveyed only about one-third of one percent of the region’s residents) just aren’t meaningful.  The underlying problem that invalidates the survey is called “Self-Selection Bias.” Because this isn’t a true random survey, and because respondents choose whether to participate, there’s no guarantee that the survey data reflect the views (and experiences) of the larger population.  Because those who are predisposed to care about this issue are likely to differ systematically from the rest of the population, the survey produces results that are biased.
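The sample-size point can be made concrete with the standard margin-of-error formula, which only applies to random samples in the first place; a sketch (the 95 percent z-value and worst-case p=0.5 are the conventional assumptions):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error (worst case, p=0.5) for a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# A few hundred *random* respondents already pin answers down to a few
# percentage points; the formula is simply invalid for a self-selected
# sample, no matter how many thousands respond.
for n in (400, 800, 9000):
    print(n, round(margin_of_error(n) * 100, 1), "percentage points")
```

At n=400 the margin is about ±4.9 points, and at n=800 about ±3.5; the 9,000 self-selected responses buy no such guarantee at all.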

Not asking the most important question:  Who wants to pay a toll?

There’s a lot more to dislike about the survey beyond its poor quality sampling strategy and biased sample.  The questions posed in the survey don’t get at the real issues raised by the freeway widening project.  The project’s financial plan shows that it won’t be built without tolls—something you’d be hard-pressed to learn from any of the “public information” work.  The last estimates prepared for the Columbia River Crossing showed I-5 tolls would be a minimum of $2.30 during off-peak hours, rising to $3.25 during rush hour, with additional surcharges for those who didn’t buy transponders for their cars in advance.  The survey didn’t reveal these toll rates, or ask people whether they might prefer a smaller, less expensive bridge with lower tolls, to a larger one with these high tolls, or whether they’d really rather keep the existing bridge if it meant they could avoid tolling altogether.  Despite the fact that the survey avoided talking about tolls, many survey respondents raised the question in answering open-ended questions.

It’s rather like a taste test survey that asks people whether they’d prefer filet mignon to a hot dog, without revealing the price tag of either alternative.  For a project that claims so prominently to care about “centering equity,” failing to reveal that people might have to pay on the order of $1,600 per year to commute daily across this bridge is a monumental omission.  But it’s no accident:  the project’s “public information” campaign is designed in an intentionally misleading way to manufacture consent, not to accurately measure public attitudes.
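Under the CRC toll estimates quoted above, the commuting bill is easy to sketch (the rates and workday count are assumptions for illustration; actual IBR toll schedules have not been published):

```python
# Annual toll bill for a daily peak-hour, drive-alone commuter at the
# last published CRC rates (assumed here; surcharges for drivers
# without transponders would push this higher).
peak_toll_each_way = 3.25     # dollars, rush hour
workdays_per_year = 250       # roughly 5 days a week, 50 weeks

annual_tolls = peak_toll_each_way * 2 * workdays_per_year
print(annual_tolls)  # 1625.0 -> on the order of $1,600 a year
```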

Surveys can be a useful way to gauge public opinion, if they’re undertaken in a scientifically valid fashion.  But if you aren’t careful, you end up with a classic, garbage-in, garbage-out exercise.  That appears to be the case with survey work commissioned by the “Interstate Bridge Replacement” project, a thinly veiled marketing campaign for freeway widening funded by the Oregon and Washington transportation departments—with “communications” consultants reaping more than $4 million for their services in the past few years.

 

 

 

The I-5 bridge “replacement” con

Oregon and Washington highway builders have re-branded the failed Columbia River Crossing as a “bridge replacement” project:  It’s not.

Less than 30 percent of the cost of the nearly $5 billion project is actually for replacing the existing highway bridge, according to independent accountants.

Most of the cost is for widening the freeway and rebuilding interchanges for miles north and south of the bridge crossing; replacing the current bridge would cost somewhere between $500 million and $1 billion.

Calling a $5 billion, 5-mile-long freeway a “replacement bridge” is like buying a new $55,000 truck and calling it a “tire replacement.”

Nearly a decade ago, the “Columbia River Crossing—the multi-billion dollar plan to build a wider I-5 freeway between Portland and Vancouver—collapsed of its own fiscal weight, after both the Oregon and Washington Legislatures refused to pony up an estimated $450 million each (as well as signing a blank check to cover future cost overruns and revenue shortfalls). Project advocates delayed for as long as they could revealing the project’s true price tag and actually asking for the money, and when they finally did, legislators balked.

Promoters of the newly re-christened “Interstate Bridge Replacement (IBR) Program” have been assiduous in their efforts not to talk about the scale or cost of the project. In two years, they’ve yet to produce a single, new comprehensive illustration of the project—something that’s standard fare in megaprojects.

That new name is part of the sales pitch.  Ever since attempting to breathe life back into the failed Columbia River Crossing project, the Oregon and Washington Departments of Transportation and their coterie of consultants have been engaged in an extensive effort to rebrand the project to make it more salable. (According to Clark County Today, over the past two years, $5.3 million—more than a quarter of the project’s $21 million spending—has been for “communications.”)

It’s no longer ever referred to as the “Columbia River Crossing”—although the project’s expensive PR consultants failed to get that talking point to the White House, as President Biden recently referred to it by its obsolete moniker.  Instead, it’s the far more modest “I-5 bridge replacement program”.  The project’s public materials talk mostly about the existing bridge, and as we’ve noted, almost never reveal that the total project is 5 miles long, that it contemplates widening this stretch of freeway to 12 (or more) lanes, will cost upwards of $5 billion, and will require minimum tolls of $5 for every round trip across the river.  Project staff are even leery of letting anyone look at computer renderings of the project.

The drawings of the Columbia River Crossing hint at just how massive this project would be.  The following animated GIF shows the design for the CRC as it crosses Hayden Island, superimposed on an aerial view of the existing freeway.  And none of what’s shown in this particular illustration includes the actual bridge structure crossing the Columbia River (which would be out of frame to the left).

The plans for Hayden Island show that much of the area would be paved over in a complex web of on- and off-ramps, flyovers, and multi-lane arterials.  Little wonder the residents of the island are strongly opposed to the project, saying:  “the massive footprint over Hayden Island .  .  . will destroy our community.”  (Hi-Noon Newsletter, January 26, 2022).

On and off ramps for the Columbia River Crossing on Hayden Island, south of the Columbia River.

Calling it just a “replacement” is a PR gimmick to conceal all these elements of the project.  But it also conceals where the real money is going:  the reality is that the “replacement” of the two existing I-5 bridges is just a small part of the project’s total costs—less than 30 percent according to independent estimates.

The “bridge” part of the IBR is less than 30 percent of total costs

In 2012, forensic accountant Tiffany Couch undertook a detailed audit of the CRC cost estimates.  Her analysis showed that the portion of project costs attributable to the bridge structure was $796.5 million—just a shade under $800 million.  These costs represented just 23 percent of the total $3.49 billion price tag for the entire project.

Acuity Group, Inc., Report #6 Columbia River Crossing – Cost Allocation Discrepancies, April 8, 2013

The estimates by Acuity Group differ from the summary level budget breakdowns publicly distributed at the time by the CRC project staff.   According to Acuity, CRC transferred a portion of the costs associated with interchange overpass construction to the “bridge” portion of the project, effectively understating the cost of the freeway widening on either side of the river, and overstating the cost of the river crossing itself:

According to the CRC’s own detailed budgets, the costs to build the interchanges in Oregon and Washington are expected to cost hundreds of millions more than what is being reported to legislators, public officials, and the citizens of Oregon and Washington. Conversely, the CRC’s own detailed budget shows that the cost to tear down and rebuild the interstate bridge is hundreds of millions less than what is being reported.

According to the forensic accountants, ODOT and WSDOT shifted a portion of the cost of reconstructing interchanges north and south of the bridge by allocating all of the costs associated with overpass structures for these interchanges to the category “interstate bridge”:

. . . we found that when we allocated the cost of the overpasses associated with each interchange to the cost of the interstate bridge, we were able to reconcile to the CRC’s public communications and maps.

Replacing the existing bridge capacity might be only $500 million

Even at $800 million, this price estimate is too high to count as a “replacement” cost, because  much of the cost is associated with increasing the bridge’s capacity to 12 lanes, rather than simply replacing the existing 6 traffic lanes.  Inasmuch as the CRC plan calls for building two side-by-side bridges (each about 90 feet wide), the cost of “replacing” the existing structure with a new one is just the cost of one of these two bridges.  That means the cost of a like-for-like bridge replacement would be less than $500 million.
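The like-for-like arithmetic is simple; here is a sketch using the Acuity figure (attributing exactly half the cost to each of the two identical spans is our simplifying assumption):

```python
# Acuity Group's estimate for the bridge structures, $ millions (both spans).
total_bridge_cost = 796.5

# The CRC design called for two side-by-side ~90-foot bridges; only one
# replaces the existing lanes, so attribute half the cost to "replacement."
replacement_cost = total_bridge_cost / 2
print(replacement_cost)  # 398.25 -> under $500 million
```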

The CRC and IBR projects are proposing two new bridges: only one is a “replacement;” the other is an expansion.

It also now appears that the revived IBR project will be even larger and more expensive than the CRC.  For example, it has at a minimum added in some expenses that were cut out of the final CRC design, such as the North Portland Harbor Bridge, spanning a slough south of the Columbia River (which would add about $200 million to the project’s cost).

What this means is that, if the “IBR” were just about replacing the I-5 Columbia River bridges, its cost would be far smaller—in all likelihood less than $1 billion.  A right-sized bridge would be much more affordable, and wouldn’t raise the strong environmental objections that are associated with the DOTs’ freeway widening plans.

The IBR Project is still hiding the cost

The epic failure of the Columbia River Crossing had everything to do with the project’s unwillingness to talk frankly about finances, and the same mistake is being repeated this time as well.  It’s fair to ask, why should we rely on ten-year old cost estimates in sussing out the actual cost of “replacing” the current bridges?

The reason is that, so far, after more than two years of work to revive the project, ODOT and WSDOT have yet to produce any new cost estimates.  Their “draft” financial plan, released in November 2020, is based on the old CRC budget, with some adjustments for inflation.  In the past year, none of the meetings of the “Executive Steering Group” supposedly charged with overseeing the project has discussed project costs or financing.

The fact that the project hasn’t done new, ground-up cost estimates isn’t an oversight—it’s a conscious strategy, to avoid revealing the true cost and scale of the project—and subjecting themselves to the kind of scrutiny offered in the Acuity forensic analysis of the CRC budget.

It’s a bit like going to the car dealership to get a new set of radials for your fifteen-year-old F-150, coming back home in a new $50,000 pickup truck, and telling your spouse that it’s a “tire replacement” program.

It’s always been a bloated boondoggle

In less guarded moments, influential local politicians have been outspoken about the excessive costs generated by ODOT and WSDOT.   Congressman Peter DeFazio famously declared the Columbia River Crossing project to be a gold-plated monstrosity.  In the Oregonian on August 14, 2011, Representative DeFazio said:

“I kept on telling the project to keep the costs down, don’t build a gold-plated project,” a clearly frustrated DeFazio said. “How can you have a $4 billion project? They let the engineers loose, told them to solve all the region’s infrastructure problems in one fell swoop… They need to get it all straight and come up with a viable project, a viable financing plan that can withstand a vigorous review.”
(Manning, Jeff. “Columbia River Crossing could be a casualty of the federal budget crunch”, The Oregonian, August 14, 2011).
Later, Representative DeFazio told Oregon Public Broadcasting:
“I said, how can it cost three or four billion bucks to go across the Columbia River?  . . . The Columbia River Crossing problem was thrown out to engineers, it wasn’t overseen: they said solve all the problems in this twelve-mile corridor and they did it in a big engineering way, and not in an appropriate way.”
“Think Out Loud,” Oregon Public Broadcasting, August 18, 2011.

The irony is that if this project were just about replacing the bridge, rather than building a massive freeway, not only would the project be vastly cheaper, there’d almost surely be less public opposition.  The objection isn’t to having a safe, functional bridge, it’s to building a giant highway that will worsen pollution and bankrupt taxpayers and commuters.

Portland: Don’t move or close schools to widen freeways

Adah Crandall is a sophomore at Grant High School. She is the co-lead of Portland Youth Climate Strike and an organizer with Sunrise PDX’s Youth Vs ODOT campaign, a biweekly series of rallies fighting for the decarbonization of Oregon’s transportation systems.

 

City Observatory is pleased to publish this commentary by Adah Crandall on a proposal currently being considered to move Harriet Tubman Middle School to facilitate the $1.25 billion widening of the Interstate 5 freeway through Portland’s Rose Quarter.  Crandall’s advocacy was recently profiled in a report by Bloomberg CityLab.  Portland Public Schools (PPS) is considering an option that would close another predominantly Black school (Martin Luther King, Jr., Elementary) to provide a new site for Tubman.

Crandall gave this testimony to the Portland School Board on January 25, 2022.  A full video of her testimony is here:

 

 

Good evening board members, my name is Adah Crandall and I’m a sophomore at Grant High School.

I’m here tonight because I am extremely concerned about your proposed relocation of Harriet Tubman Middle School. It’s finals week right now, and I should be studying for my algebra test tomorrow morning. But instead, here I am at a school board meeting begging you to do what is right and not displace students to accommodate the expansion of fossil fuel infrastructure in the middle of a climate crisis.

In preparation for this, I spent some time looking into PPS’s bullying policy, because here’s the thing: I think the Oregon Department of Transportation is a bully, and that you all are bystanders doing nothing about it. And I don’t know what you all were taught, but what I learned in your school system is that when you see someone being picked on, you’re supposed to stand up for them.

So why is it that when ODOT’s proposed freeway expansion is literally cutting into Tubman’s backyard and threatening to displace hundreds of students, your response is to just give in and let it happen? The PPS website says bullying is “strictly prohibited and shall not be tolerated,” and to me it seems like you’re breaking your own rule. Why aren’t you modeling to students what it means to be an active ally and stand up against injustice?

As a former Tubman student, I know the pollution at Tubman is dangerous: no students should have to worry about whether the air they’re breathing at recess will one day cause asthma or lung cancer. But the decision to move the school rather than fight the freeway expansion follows the same short-sighted line of thinking that started the climate crisis in the first place. Yes, you can move students away from the direct threat of pollution, but you cannot move them away from the life of climate disasters they’re inheriting as a result of your decision to support fueling this crisis without making ODOT even study the alternatives.

ODOT has bullied you into thinking this freeway expansion is inevitable, but it’s not. PPS could avoid all the community disruption associated with displacing Tubman and potentially King Elementary by simply forcing ODOT to consider “not building the freeway”. The project just lost a key federal approval last week, remains tied up in multiple lawsuits, and is currently $500 million short. These recent updates are a massive step forward for efforts to stop the expansion, efforts that for some reason, PPS seems to be completely ignoring.

I urge you to join in with the community groups demanding ODOT fully study the environmental impact of the Rose Quarter freeway expansion, which would include studying congestion pricing, an alternative that would reduce congestion and pollution rather than increasing it.

At the last board meeting I attended, I asked each of you to raise your hand if climate justice was important to you, and as I remember with striking clarity, everyone had their hand up. This is your chance to follow through on that promise. Don’t just raise your hands, raise your voices, and raise your standards. If you truly value climate justice, you will not settle for the displacement of students to accommodate expansion of fossil fuel infrastructure into the backyard of a middle school.

If you truly care about climate justice, you will not let ODOT get away with this and destroy my generation’s future. Tonight I urge you to stand true to the values you teach students, and dare to imagine a better world. Stand up for us.

 

Editor’s Note (March 29, 2022):  Portland Public Schools subsequently decided not to relocate the Harriet Tubman School to the King school location.  It is exploring other locations in Portland’s Albina neighborhood.

Transportation trends and disparities

If you aren’t talking about our two-caste transportation system, you’re not really addressing equity.

Portland’s regional government is looking forward at trends in the transportation system and their implications for equity.  In December, City Observatory submitted its analysis of these trends for Metro’s consideration.

Local and regional leaders are increasingly promoting concerns of equity in transportation, as well they should.  But many analyses of equity leave out the most fundamental inequity in the structure of transportation:  our explicit two-caste system that privileges those who can afford and can operate cars, and systematically disadvantages everyone else:  those too young, too old, too infirm or too poor to own and operate a motor vehicle.  Those in the lower caste are condemned to lives of impaired access to the economy and society, and greater risk of death and injury when they do travel. Many of the other observed inequities in transportation flow directly from this two caste system.

If governments are serious about rectifying inequities in transportation they have to look past symptoms and superficial manifestations to underlying causes.  A careful consideration of these trends will take them in this direction.

 

Trends and disparities

Portland will continue to have a two-caste transportation system, with priority for those who can afford, and are legally and physically able, to operate a car (the upper caste), and lower priority for those too poor, too young, or too old to operate a car (the lower caste). Most of the other inequities (safety, pollution, lack of access and discrimination) flow from this two-caste system. Low income people, people of color, and the old and the young are disproportionately consigned to the lower caste by our car-dependent transportation system.

Portland area transportation greenhouse gas emissions have increased by 1,000 pounds per person annually (14 percent) over the past few years, and show no signs of declining, despite state, regional and local plans calling for a reduction in GHGs. The region will have to take much bolder action than any laid out in the RTP to comply with adopted laws. The GHG emissions that cause climate change come disproportionately from higher income households and lower density, sprawling neighborhoods, while climate change disproportionately affects low income neighborhoods.

ODOT plans to spend billions of dollars widening area freeways, which will induce additional travel. Gas taxes from road use don’t cover anything approaching the cost of building and maintaining freeways, meaning that their costs are subsidized by non-users. Freeways are only usable by people who can afford the roughly $5,000 annual cost of owning and operating a car. Car ownership is much lower among low income populations and people of color. A car-dependent transportation system doesn’t work for those who can’t afford to own a car and those who can’t or shouldn’t drive.

The number of persons killed on Portland area streets and roads has increased steadily. Pedestrians and other vulnerable road users account for half of deaths. Most transportation spending is devoted to enabling vehicles to move faster, making roads more dangerous for non-car travelers. People of color, low income people, and the young and old are disproportionately likely to be pedestrians, cyclists and other vulnerable road users. Spending most transportation dollars on freeways, which are the least deadly roadways, is inequitable.

Gasoline prices and gas taxes don’t cover the fiscal, social or environmental costs caused by driving. These costs, which range into the billions of dollars annually, are shifted to non-users. Under-charging users for the costs of driving results in more driving, and more social costs than would otherwise occur, and unfairly imposes these damages and costs on non-users, who tend to be disproportionately low income and people of color.

Public policies will continue to allow unpriced use of public roads by cars while charging prices for use of transit. Congestion on public streets from unpriced private automobiles diminishes the speed and efficiency of public transit, which lowers its productivity, decreases its service levels and competitiveness, which in turn lowers ridership and increases costs. Low income people and people of color, as well as the very young and very old, are more likely to be transit-dependent than the overall population. They disproportionately bear the costs of worse bus service caused by the unpriced use of public streets by private cars.

Public policies will continue to subsidize free on-street parking for most car owners at a cost of tens or hundreds of millions of dollars a year. Free and subsidized parking only benefits those who own cars, and disproportionately benefits higher income and whiter populations.

Roads and streets continue to contribute 50 percent or more of stormwater runoff, which causes pollution and is expensive to fix. Yet streets and roads, and their users, pay nothing toward the costs of stormwater collection and treatment. These costs are largely shifted to water users, especially households, many of whom don’t own or drive cars. Low income populations and people of color are disproportionately likely to be responsible for paying the costs of stormwater shifted onto residences.

Adjacency is not a good measure of equity

Currently Metro relies on measures of adjacency (i.e., the demographic composition of census tracts adjacent to transportation infrastructure) to determine whether projects are equitable. This approach ignores the negative effects of proximity to many types of infrastructure, particularly highways.

Accessibility measures should be used, rather than mobility measures.

The performance of the transportation system should be judged by accessibility (the number of destinations one can easily reach), rather than by mobility (distance and speed traveled). Maximizing accessibility is consistent with the region’s environmental, social and land use objectives; maximizing mobility undercuts key objectives and is more expensive.

Equity is best served by direct payments rather than more spending to increase supply.

Measures such as Portland’s transportation wallet can promote equity by giving more purchasing power and a wider array of options to low income households and targeted populations.

Target VMT reductions.

Reduced VMT is needed to achieve the state’s and region’s legislatively mandated GHG reduction goals. Portland decreased VMT 1.5 percent per year between 2005 and 2013. VMT reduction saves money and stimulates the local economy, which benefits disadvantaged populations. The 1.5-mile-per-day decrease in average daily travel between 2005 and 2013 saved the region $600 million per year in transportation expense, which benefited the local economy.

Transportation spending targets peak hour car trips.

Peak hour car commuters have vastly higher incomes than the general population, and than those who commute by transit, bike or walking.

Green dividend: Measures that reduce transportation costs have, in the past, created a “green dividend” for local households. Failure to continue to decrease VMT and transportation expense would be a missed opportunity to improve the region’s economy.

Transportation is costly: the average household spends 15 percent of its income on transportation. Policies that reduce the amount of travel that households need to make, as measured by average VMT, reduce household expenses and increase household disposable income. Transportation expenditures are particularly burdensome for lower income households.

Demand for walkability: Walkable neighborhoods are in high demand and short supply. More housing in dense, high demand locations results in fewer VMT, lower GHG emissions, and higher use of transit, biking and walking.

More and more people are interested in living in walkable urban neighborhoods, which are in short supply. The failure to build enough housing in walkable neighborhoods drives up housing prices, and makes it more difficult for low income households to live in walkable neighborhoods, where transportation costs are lower.
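The green dividend arithmetic above can be checked with a back-of-envelope sketch. The 1.5-mile-per-day reduction comes from the text; the regional population and per-mile cost of driving below are illustrative assumptions (not figures from the article), chosen to show how a small daily VMT reduction compounds into hundreds of millions of dollars a year.

```python
# Back-of-envelope check of the green dividend claim.
# POPULATION and COST_PER_MILE are assumed values for illustration;
# MILES_SAVED_PER_DAY is the figure cited in the text.
POPULATION = 2_200_000      # assumed Portland-region population
COST_PER_MILE = 0.50        # assumed all-in cost per vehicle mile, in dollars
MILES_SAVED_PER_DAY = 1.5   # decrease in average daily travel per person (from the text)

# Daily savings per person, scaled to a year and to the whole region
annual_savings = MILES_SAVED_PER_DAY * 365 * POPULATION * COST_PER_MILE
print(f"${annual_savings / 1e6:.0f} million per year")
```

With these assumed inputs the sketch lands near the $600 million annual figure cited above; different population or per-mile cost assumptions would move the result proportionally.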

Metro’s “Don’t Look Up” Climate Policy

Metro, Portland’s regional government, says it has a plan to reduce transportation greenhouse gases

But in the 8 years since adopting the plan, the agency hasn’t bothered to look at data on GHGs—which have increased 22 percent, or more than one million tons annually.

Metro’s Climate Plan is “Don’t Look Up” 

In the new movie “Don’t Look Up,” Jennifer Lawrence and Leonardo DiCaprio play two scientists who identify a planet-killing comet headed for earth.  Their warnings go largely ignored, and by the end of the movie, there’s an active anti-scientific movement, which as the comet becomes visible in the sky, tells its adherents to simply “Don’t look up.”

The movie is an allegory for our climate peril:  faced with mounting scientific evidence about the trajectory of climate change, and the increasingly evident manifestation of heat waves, storms, flooding and fires, too many of our leaders are simply looking away.

And in Portland, which prides itself as being a green leader, the regional government has, effectively been pursuing a “Don’t Look Up” climate policy.

Noble intentions, soaring rhetoric

Here’s the background.  In 2007, the State Legislature set a goal of reducing Oregon greenhouse gas emissions by 75 percent by 2050.  And in 2014, Metro, Portland’s regional government, adopted what it called a “Climate Smart Strategy” to reduce greenhouse gases.

On paper, seems good.

The Metro plan had a few policy ideas for reducing greenhouse gas emissions, for example by expanding transit and promoting more compact land uses, which would enable more cycling and walking.  But for the most part, it relied on expectations that federal and state regulations and car makers would figure out a way to quickly make cars non-polluting.  Recognizing—at the time, at least—that there was a lot of uncertainty in the efficacy of these policies and the evolution of technology, Metro promised that if its efforts weren’t reducing greenhouse gasses, it would revisit the plan and take even tougher measures.

Here it is, eight years later.  How is that “Climate Smart Strategy” working out?

Well, you might read through Metro planning documents, but nowhere in them will you find any data on the change in transportation-related greenhouse gases in Metro’s planning area in the years since 2014.  In essence, after adopting its plan, Metro hasn’t looked up.

But just like in the movie, scientists are looking up.  And what they see, specifically in Portland, is that the Metro strategy is failing—greenhouse gas emissions are increasing, not decreasing, as called for in Metro’s plan.

Here, the parts of Leonardo DiCaprio and Jennifer Lawrence are played by real-life Boston University physicists Conor Gately, Lucy Hutyra and Ian Sue Wing.  Their research was sponsored by NASA, published by the National Academy of Science, and their database is maintained by the Oak Ridge National Laboratory.  What they’ve done is to create a nearly four-decade long, very high resolution map of greenhouse gas emissions from on-road transportation in the US.  They’ve mapped emissions down to a 1 kilometer (0.6 Mile) square grid for the entire nation, for each year from 1980 through 2017.  (There are more details about the project below). Their data is the best evidence we have on the trajectory of this comet.  And for Portland, the news is not good.

Here’s what their data show for the tri-county Portland metro area:

The green line on the chart is the actual amount of greenhouse gas emissions from transportation in Clackamas, Multnomah and Washington Counties from 1990 through 2017.  The blue line shows the trajectory of emissions needed to achieve the greenhouse gas reduction goals spelled out in Metro’s 2014 climate action plan.  In 2013, the year before Metro adopted its plan, emissions were about 6 million tons.  The plan envisioned the emissions levels going down by roughly a million tons by 2017.  But instead, as the green line shows, transportation greenhouse gas emissions in the Portland area increased by nearly 1 million tons a year after 2013, to 7 million tons.
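The size of the miss can be read directly off those round numbers. A minimal sketch, using only the figures quoted in the paragraph above (millions of metric tons of transportation CO2):

```python
# The gap between Metro's planned and actual emissions, using the
# round numbers quoted in the text (millions of metric tons).
baseline_2013 = 6.0   # emissions the year before the plan was adopted
plan_2017 = 5.0       # the plan envisioned roughly a 1M-ton reduction
actual_2017 = 7.0     # observed emissions, per the DARTE data

gap_vs_plan = actual_2017 - plan_2017                               # shortfall against the target
pct_increase = (actual_2017 - baseline_2013) / baseline_2013 * 100  # change since the baseline year
print(f"Gap vs. plan: {gap_vs_plan:.0f}M tons; change since 2013: +{pct_increase:.0f}%")
```

On these figures, the region is about two million tons a year above where the plan said it should be, and emissions rose rather than fell.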

Metro’s “Climate Smart Strategy” isn’t just somehow behind schedule.  It is failing.  Emissions are increasing, not decreasing.  The comet is accelerating towards earth. So what are the leaders doing?

Not looking up

Metro’s climate plan promised to track emissions.  To be sure, Metro has published annual sustainability reports since 2014.  And they proudly mention the adoption of the Climate Smart Strategy.  But the only thing Metro tracks in these reports is greenhouse gas emissions (and other environmental effects) of its own internal business operations.  There’s absolutely no mention of overall regional trends from the transportation system Metro is charged with planning.  Neither does the 2018 Regional Transportation Plan provide a time series of data showing the trend in regional transportation greenhouse gas emissions.

Metro’s plan also promised to take additional and tougher measures if those in the Climate Smart Strategy weren’t working fast enough.  On page 1 of the 2014 strategy document, Metro committed to periodically assessing its progress and said:

If the assessment finds the region is deviating significantly from the Climate Smart Strategy performance monitoring target, then Metro will work with local, regional and state partners to consider the revision or replacement of policies, strategies and actions to ensure the region remains on track with meeting adopted targets for reducing greenhouse gas emissions.

But if you don’t track your progress, you don’t have to admit you’re failing and you don’t have to  bother with considering more serious steps to reduce greenhouse gases.  Don’t. Look. Up.  It’s a recipe for disaster, and it’s the approach Metro is taking.

The science behind the DARTE database.

The tragedy here is that we have sound scientific data that tell us what is happening.  The research, undertaken over a period of years and sponsored by NASA, gives us a very granular, long-term picture of how our climate efforts are faring.  You can’t claim to be taking climate change seriously if you aren’t paying attention to this kind of data.

Gately, C., L.R. Hutyra, and I.S. Wing. 2019. DARTE Annual On-road CO2 Emissions on a 1-km Grid, Conterminous USA, V2, 1980-2017. ORNL DAAC, Oak Ridge, Tennessee, USA. https://doi.org/10.3334/ORNLDAAC/1735

Their results were featured in the New York Times in October 2019.  We alerted Metro staff to the availability and importance of this data in October 2019 (Cortright to Kloster, October 16, 2019).

ODOT’s forecasting double standard

Oregon’s highway agency rigs its projections to maximize revenue and downplay its culpability for climate challenge

ODOT has two different standards for forecasting:  When it forecasts revenue, it says it will ignore adopted policies–especially ones that will reduce its revenue.  When it forecasts greenhouse gas emissions, it assumes policies that don’t exist–especially ones that will magically make greenhouse gas emissions decline.

Revenue forecasts are “purely based on historical data” and don’t include adopted policies.  Greenhouse gas emission forecasts are based on “goals” and “wishes” and are explicitly not an extrapolation of past trends.

The inflated revenue forecasts are used to justify (and help fund) highway widening; the greenhouse gas emission forecasts are used to absolve the agency from any responsibility to reduce driving related greenhouse gas emissions.

As we’ve pointed out, the Oregon Department of Transportation keeps two sets of books when it comes to climate emissions.  It tells the public that it cares about climate and greenhouse gas emissions in its largely performative “Climate Action Plan,” but when it comes to the agency’s budget, it tells financial markets it’s counting on Oregonians burning just as much gas—and creating just as much carbon pollution—a decade from now as they do today.

ODOT’s officials have defended their revenue forecasts as being merely passive representations of current trends, unaffected and unfiltered by state policy objectives.  Somehow these actions that produce revenue are beyond either their control or responsibility.

But when it comes to the agency’s climate plan, they’ve gone out of their way to make highly speculative assumptions that all kinds of other actors—consumers, automobile manufacturers, the federal government and other state agencies—will make radically different decisions or implement entirely new policies that lead to reductions in greenhouse gases.

ODOT has a double-standard for forecasting—when it comes to forecasting climate, and especially establishing its responsibility for greenhouse gas emissions—it will make elaborate and speculative assumptions about other people doing things that will make the problem go away.  When it comes to estimating its own revenue (which it then uses to justify building new roadways and borrowing for more), it assumes that nothing will change and that it can safely ignore already adopted legal requirements to implement congestion pricing and limit greenhouse gases—both of which will reduce gas tax revenue.  It’s a deceitful, inconsistent and self-serving approach to forecasting.

ODOT Revenue Forecasts:  We assume nothing will change and ignore our own adopted laws.

Earlier, we pointed out that ODOT’s revenue forecasts are utterly at odds with its claims that it will reduce transportation greenhouse gas emissions, as mandated by state law and directed by the Governor’s executive order.  ODOT representatives defended their forecasts in the media by saying that the agency’s forecasting approach was merely to extrapolate existing trends, and that its forecasts were in no way a reflection of its policy objectives.

Here’s ODOT spokesman Don Hamilton responding to Willamette Week.

“ODOT revenue forecasts are based purely on consumer patterns and historical data,” says ODOT spokesman Don Hamilton. “They are not based on what we want to see.”

The forecasts also don’t take into account the reductions in driving that may come with “congestion pricing” or other ODOT initiatives, Hamilton says.

“As Oregon executes many of its climate-focused programs, we expect gas sales to decline, and we will revise our gas sales forecasts to reflect those changes as they occur.”

Oregon Public Broadcasting’s Dave Miller pushed the agency’s top planner, Amanda Pietz to explain the discrepancy:

Dave Miller: . . .  I want to focus on a new critique that I’m sure you’re aware of. It came about a week and a half ago from the frequent ODOT critic Joe Cortright, the economist. He put out a report digging into the agency’s estimates given to financial markets about expected gasoline tax revenues through the end of this decade. This was his summary: “What ODOT official revenue forecasts are telling us is that the agency fully expects us to be generating just as much greenhouse gasses from driving in 2030 as we are today. Indeed,” he wrote, “the agency is counting on it to pay its bills.” Amanda Pietz, how do you explain this?

Amanda Pietz: I think it goes back to the earlier statement I was making. When we do our revenue forecasts it’s often looking back at the trends then and projecting those forward without necessarily seeing some of the interventions take hold and create those changes.[Emphasis added].

Dave Miller: I’m slightly confused by that, and that jibes with what an ODOT spokesman said when there was an article about it this week in Willamette Week. But aren’t you supposed to give bond markets a projection that is as accurate as possible? If the whole point is [to say] “trust us, we’ve got revenue coming in, we can back these bonds and here’s our estimate for how the money is going to be coming in,” why don’t you factor in all the things you say you’re going to be doing so economic markets can know to trust you?

Amanda Pietz: Part of what is done when we look at things is [that] we have to rely on something very solid – a clear policy change, a solidified investment that’s been amended into our investment strategy in a way that’s very clear, it’s solid. I think what you’re seeing is an agency that’s recognized that we’re a contributor to the problem in the last year and [is] starting to make some changes and modifications. Now when those take hold and the degree to which they’re solidified [so] that we can roll them into our financial assumptions, my guess is another six months to a year before you start to see some of those. Another key example of that is DEQ has its Climate Protection program which will set limits on fuel sales that will have a big impact on that revenue forecast. That’s in draft form, not finalized. When that’s finalized, becomes implemented, and there’s clarity around what that looks like, that’s when it gets rolled into the financial assumptions. Similar things for us, too. I mentioned we’re investing over $50 million dollars in transportation electrification. We should see fuel sales drop as a result of that. Until we figure out exactly where we’re placing that, how we’re going to leverage with our private partners to put those in the right locations, [that’s when] we can factor that into our revenue forecast.[Emphasis added].

ODOT Climate Forecasts:  Wishes and speculation, including magical policies that don’t exist

When it comes to making forecasts about future automobile emissions, and whether the agency will need to do anything to curtail the growth of driving in order to achieve the state’s statutory greenhouse gas reduction goals, ODOT has an entirely different approach to forecasting.  It makes heroic assumptions about things that might happen, if somebody else does them.  It pretends that policies that don’t exist will be adopted and aggressively implemented. And all of these assumptions are skewed in a very particular way, i.e., to reduce or eliminate any need for ODOT to take responsibility for cutting greenhouse gases from cars and driving in Oregon.

These assumptions are built into the State Transportation Strategy (STS), developed by ODOT to sketch out how Oregon might reduce transportation greenhouse gases in the decades ahead.  In a memo prepared for the Land Conservation and Development Commission, explaining the STS modeling, Brian Gregor, who was ODOT’s modeler, explained ODOT’s approach to estimating future greenhouse gas emissions from cars.

The members on the Core Tech Team from the Departments of Environmental Quality and Energy agreed that the STS “trend line” is a reasonable reflection of goals that California, Oregon, and other states participating in the multi-state ZEV standards wish to achieve. They caution, however, that this planning trend does not reflect recent trends in vehicle fuel economy. Substantial efforts on the part of states and the federal government will be necessary to make this planning trend a reality. [Emphasis added].

A footnote on page 30 of the LCDC report makes this point even more clearly:

It is important to note that these ‘trend lines’ represent the trend in the model results given the vehicle assumptions in the STS recommended scenario. They do not represent an extrapolation of past trend. [Emphasis added].

The contrast couldn’t be sharper:  when it comes to estimating an elevated level of future revenue, ODOT discounts anything that will reduce driving or pollution, and won’t even consider the impact of policies, like congestion pricing, which were approved by the Legislature in 2017.  But when it comes to optimistic speculation about technologies or policies that might lower future vehicle emissions—absolving ODOT of the need to act—the agency will definitely count on policies that haven’t been adopted by anyone.  It’s a clear and calculated strategy to avoid responsibility for doing anything to address climate change.

Clearly, ODOT’s current revenue forecasts are counting on the failure of the state’s climate efforts.  They’re assuring financial markets that Oregon will collect hundreds of millions of dollars in motor fuel tax revenues with which to repay bonds it will use to expand the state’s highways, encouraging and subsidizing more driving and greenhouse gas emissions.  It may seem like an arcane detail, but it’s the kind of technocratic climate arson that’s routinely practiced by state highway departments.

 

Why the proposed $5 billion I-5 bridge is a climate disaster

The plan to spend $5 billion widening the I-5 bridge over the Columbia River would produce 100,000 additional metric tons of greenhouse gases per year, according to the induced travel calculator

Metro’s 2020 transportation package would have cut greenhouse gases by 5,200 tons per year– 20 times less than the additional greenhouse gases created by freeway widening.

Widening freeways induces additional travel. It’s an established scientific fact:  widening urban freeways prompts more miles of travel and, consequently, more greenhouse gas emissions.  The effect is so well documented that it’s referred to as the “fundamental law of road congestion.”

Based on a synthesis of the latest award-winning peer-reviewed scientific research and work by scholars at the University of California, Davis‘s National Transportation Center, the Natural Resources Defense Council developed the induced travel calculator that computes the additional amount of greenhouse gases produced by an additional lane-mile of freeway capacity in each of the nation’s metro areas.

The proposed Columbia River Crossing, now re-branded as the “I-5 bridge replacement project”, contemplates a 12-lane wide, 5 mile long freeway between Portland and Vancouver, effectively doubling the size of the existing I-5 freeway and adding 30 lane miles of freeway (3 lanes in each of 2 directions for 5 miles).  Freeway advocates have claimed that the bridge might only be 10 lanes, but as public records requests revealed, the bridge structure is designed to carry twelve lanes, and in many places the proposed roadway is 14 lanes wide.

The Induced Travel Calculator shows that this increase in roadway capacity in Portland would produce an additional 155 to 233 million miles of travel annually, leading to the burning of an additional 11 million gallons of gas.  That in turn would translate into additional annual greenhouse gas emissions of about 100,000 tons (at roughly 20 pounds of CO2e per gallon of gas).
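The conversion from induced miles to tons of CO2e is simple arithmetic. Here is a sketch using the figures above; the induced VMT range and the 20 pounds of CO2e per gallon come from the text, while the 20 mpg average fleet fuel economy is an assumption introduced for illustration.

```python
# Rough reproduction of the induced-travel emissions arithmetic.
# The VMT range and CO2e-per-gallon figure come from the text;
# the fleet fuel economy (MPG) is an assumed value.
LBS_PER_METRIC_TON = 2204.6
CO2E_LBS_PER_GALLON = 20     # from the text
MPG = 20                     # assumed average fleet fuel economy

for vmt in (155e6, 233e6):   # induced annual vehicle miles traveled
    gallons = vmt / MPG
    tons_co2e = gallons * CO2E_LBS_PER_GALLON / LBS_PER_METRIC_TON
    print(f"{vmt / 1e6:.0f}M miles -> {gallons / 1e6:.2f}M gallons "
          f"-> {tons_co2e:,.0f} metric tons CO2e")
```

Under these assumptions the range works out to roughly 70,000 to 106,000 metric tons a year, consistent with the article’s figure of about 100,000 tons.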

How big is that amount?  Well, to put it in perspective, let’s compare it to the expected greenhouse gas reductions from other possible transportation investments.  In 2020, Metro advanced a multi-billion dollar transportation spending package including light rail, bus lanes, pedestrian and safety improvements and other projects.  Metro estimated that this package of investments would reduce greenhouse gases by about 5,200 metric tons per year.

State, regional and local government officials all recognize that we’re in the midst of a climate crisis.  It should be apparent to any casual observer that widening freeways takes us in the opposite direction of our stated commitments to reduce greenhouse gases.  In fact, the best scientific estimates of the emissions from added freeway capacity suggest that widening I-5 would generate 20 times more greenhouse gas emissions than would have been saved by the multi-billion dollar package of projects proposed by Metro last year.  Given the cost and difficulty of reducing greenhouse gas emissions, the last thing we should be doing is making the problem worse.

How to solve traffic congestion: A miracle in Louisville?

Louisville charges a modest $1 to $2 toll for driving across the Ohio River on I-65.

After Kentucky and Indiana doubled the I-65 bridges from six lanes to 12, tolls cut traffic in half, from about 130,000 cars per day to fewer than 65,000.

Kentucky and Indiana wasted a billion dollars on highway capacity that people don’t use or value.

If asked to pay for even a fraction of the cost of providing a road, half of all road users say, “No thanks, I’ll go somewhere else,” or don’t take the trip at all.

The fact that highway engineers aren’t celebrating and copying tolling as a proven means to reduce congestion shows they actually don’t give a damn about congestion, but simply want more money to build things.

Picture this.  A major interstate freeway that connects the downtown of one of the nation’s 50 largest metro areas to its largest suburbs.  It’s a little after 5 pm on a typical weekday.  And on this 12-lane freeway there are roughly two dozen cars sprinkled across acres of concrete.

I-65 in Southern Indiana (Trimarc)

 

I-65 crossing the Ohio River at Louisville (Trimarc)

These pictures were taken by traffic cameras pointed in opposite directions on the I-65 bridges across the Ohio River at Louisville, Kentucky, on Wednesday, November 3, at about 5:30 pm.  Traffic engineers have a term for this amount of traffic:  They call it “Level of Service A”—meaning that there’s so little traffic on a roadway that drivers can go pretty much as fast as they want.  Highway engineers grade traffic on a scale from LOS A (free flowing, almost empty) to LOS F (bumper-to-bumper stop and go).  Most of the time, they’re happy to have roads manage LOS “D”.

Somebody finally figured out how to reduce traffic congestion!  Usually, as we know, simply widening highways, to as many as 23 lanes as is the case with Houston’s Katy Freeway, simply generates more traffic and even longer delays and travel times.  And, with no sense of irony, highway boosters even tout the Katy Freeway as a “success story,”  despite the fact it made traffic congestion worse. In contrast, Louisville’s I-65  is an extraordinarily rare case where traffic congestion went away after a state highway department did something.

You’d think that the Kentucky Transportation Cabinet and the Indiana Department of Transportation would be getting a special award, and holding seminars at AASHTO to explain how to eliminate traffic congestion.  The fact that they aren’t tells you all you need to know about the real priorities of state highway departments–they really only care about building things, not about whether congestion goes away or not.

So how did they do it?  Let’s go back a few years.  In 2010, I-65 consisted of a single six-lane bridge over the Ohio River, which carried about 120,000 vehicles per day.  The two states decided this was getting too crowded (and predicted worsening delay due to ever-expanding traffic volumes), and so spent about $1 billion building a second six-lane bridge (the Lincoln) next to the existing Kennedy Bridge.  After it opened in 2017, the two states implemented a toll to pay part of the cost of construction.  Tolls started at $2 for a single crossing (with a transponder), with a discount for regular commuters, who pay just over $1 per crossing.  Today the toll for a one-way crossing with a transponder (and 450,000 area vehicles have one) is $2.21.  But if you cross the bridge 40 times a month (back and forth daily for 20 work days), your toll for each trip is cut in half, to $1.10.
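The commuter discount arithmetic works out as follows; a quick sketch using the figures quoted above:

```python
# Louisville I-65 toll arithmetic, using the figures quoted above.
base_toll = 2.21           # one-way crossing with a transponder
crossings_per_month = 40   # a daily round trip over 20 work days

commuter_rate = base_toll / 2            # frequent-user discount halves the rate
monthly_cost = commuter_rate * crossings_per_month

print(f"commuter rate: ${commuter_rate:.2f} per crossing")   # $1.10
print(f"monthly outlay: ${monthly_cost:.2f}")                # $44.20
```

So a daily commuter pays a bit over $44 a month for a congestion-free crossing, which is the price that, as the traffic counts show, half of former users decline to pay.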

And after the tolls went into effect, traffic on I-65 fell by half.  Here’s the average daily traffic count on I-65, according to data tabulated by the Indiana Department of Transportation.  In the years just prior to the tolling, traffic was in the 135,000 to 140,000 vehicles per day level.  But as soon as tolling went into effect, traffic dropped to barely 60,000 vehicles per day (with a very slight further decline due to Covid-19 in 2020).

 

The two states spent a billion dollars doubling the size of I-65, only to have half as many people use the bridge.  That money was wasted.  Nothing more clearly illustrates the utter folly of highway expansions.  As we’ve pointed out, highway engineers size roadways based on the assumption that the users will pay nothing for each trip.  Just as with Ben and Jerry’s “Free Ice Cream Day,” when you charge a zero price for your product, people will line up around the block.  But ask people to pay, and you’ll get fewer takers.

The fact that Louisville residents would rather drive miles out of their way, or sit in traffic for an extra 10 or 15 minutes, to travel on a “free” road, rather than spend a dollar or two for a faster, more direct trip, tells you the very low value that highway users attach to these extremely expensive roadways.  In fact, they’ll only drive on them if somebody else pays for the cost of the roadway.  This is also powerful evidence of what economists call induced demand:  people taking trips only because the roadway exists and someone else is paying for it.

The Louisville traffic experiment shows us that there’s one surefire fix for traffic congestion:  road pricing.  Even a very modest toll (one that asks road users to pay only a third or so, at most, of the costs of the roads they’re using) will cause traffic congestion to disappear.  This traffic experiment shows the folly and waste of building additional capacity.  Kentucky and Indiana spent over $1 billion for a bridge to carry as many as 250,000 vehicles per day, and today barely a quarter of that number are using it.

If state DOTs really cared about congestion, they’d be implementing congestion pricing.  A small toll, probably less than a dollar per crossing, would be sufficient to get regular free-flow conditions on the I-65 bridge—without having to spend a billion dollars.  But the truth is, state DOTs don’t care about congestion, except as a talking point to get money to build giant projects. The next time you hear someone lamenting traffic congestion, ask them why they aren’t trying the one method that’s been shown to work.


Louisville’s financial disaster: Deep in debt for road capacity that will never be used

Louisville’s I-65 bridges:  A huge under-used roadway and hundreds of millions in debt for their kids—who will also have to cope with a climate crisis.

Their financial plan kicked the can down the road, saddling future generations with the cost of paying for unneeded roads.

The two states mortgaged future federal grant money and borrowed against toll revenues, which are falling dramatically short of projections.

Louisville’s Ohio River Bridges are a monument to the epic policy, financial and generational failure that is the US highway system.  Kentucky and Indiana spent more than a billion dollars doubling an interstate highway bridge that, thanks to very modest tolls, is utilized at less than one-fourth of its capacity.  Meanwhile, through a series of “creative” financial maneuvers, they passed the bill for the highway on to future generations, who, as it turns out, will have to actually pay for the bridge at the same time the climate crisis hits in full force.  The almost empty freeway bridges show the folly of “asphalt socialism”—wasting vast amounts of public resources on roads that their users don’t value enough to pay even a fraction of their cost.

Earlier, we wrote about one aspect of the I-65 bridge project in Louisville.  It turns out that just by charging a $1 to $2 toll, Kentucky and Indiana were able to entirely eliminate traffic congestion on I-65.  Traffic plummeted from around 130,000 vehicles per day to about 60,000.  Now, even at rush hour, I-65 is almost empty.

 

I-65 crossing the Ohio River at Louisville

 

And after the tolls went into effect, traffic on I-65 fell by half.  Here’s the average daily traffic count on I-65, according to data tabulated by the Indiana Department of Transportation.  In the years just prior to the tolling, traffic was in the 135,000 to 140,000 vehicles per day level.  But as soon as tolling went into effect, traffic dropped to barely 60,000 vehicles per day (with a very slight further decline due to Covid-19 in 2020).

 

While there’s a hopeful lesson here—one that highway engineers are studiously avoiding—that road pricing can eliminate congestion, there’s a financial horror story that should be a warning to everyone thinking about highway expansion projects.  The two states spent over a billion dollars on doubling bridge capacity in downtown Louisville, and their financial plans show how, through a combination of cynicism and incompetence, they saddled future generations with the cost of this boondoggle.

Ostensibly, the justification for widening the bridges was the notion that traffic was already too congested and was growing rapidly.  The project’s environmental impact statement claimed that the I-65 bridges were “over capacity” in 2012, and predicted that traffic would grow from 120,000 vehicles per day to more than 180,000 by 2025, leading to hours and hours of traffic delay.

The trouble with these forecasts is that they were both simple-minded and wrong, especially given the need to pay for this project—in part—by actually charging the users for a portion of its construction cost.  Like so many state highway department predictions, this one was flat out wrong—traffic on I-65 was essentially flat even before the project was built.

State DOT traffic forecasts were wrong

Surely, you must be thinking, the state DOTs knew that charging a toll would reduce traffic.  Before the project was completed, Kentucky and Indiana hired consultants—CDM Smith and Steer Davies Gleave—to estimate future toll revenues from the project.  CDM Smith predicted that future traffic levels on the newly expanded and tolled I-65 bridges would be 92,000 vehicles per day in 2020, growing to 102,600 in 2030.  That was a dramatic over-estimate.  Actual traffic levels in 2019 (i.e., the year prior to the pandemic) were just 62,000 vehicles per day.  Whereas CDM Smith predicted that tolling would produce about a 27 percent decline in traffic from pre-tolling levels, the imposition of tolls actually led to a 50 percent decline.  CDM Smith overestimated the amount of toll-paying traffic that would cross the I-65 bridge by nearly 50 percent.  The direction and magnitude of that error is all too common in toll traffic forecasts, and has led to defaults and bankruptcies for other tolled projects.
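The size of the forecasting miss can be checked with a few lines of arithmetic.  The figures are the ones given above; the exact ratios differ slightly from the rounded percentages in the text (which presumably used a slightly different pre-tolling baseline):

```python
# How far off were the toll-traffic forecasts?  Figures from the text.
pre_toll = 130000        # approximate pre-tolling daily traffic
forecast_2020 = 92000    # CDM Smith forecast for the tolled bridges
actual_2019 = 62000      # actual daily traffic, last pre-pandemic year

predicted_decline = 1 - forecast_2020 / pre_toll   # tolling's predicted effect
actual_decline = 1 - actual_2019 / pre_toll        # tolling's observed effect
overestimate = forecast_2020 / actual_2019 - 1     # forecast vs. reality

print(f"predicted decline: {predicted_decline:.0%}")              # ~29%
print(f"actual decline:    {actual_decline:.0%}")                 # ~52%
print(f"forecast exceeded actual traffic by {overestimate:.0%}")  # ~48%
```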

Creative Finance:  Sending the bills to future generations.

Tolls are only paying for a small fraction of the costs of the project.  Both states borrowed substantial sums.  Kentucky borrowed deeply to pay for its share of the project, using “Garvee” bonds—essentially mortgaging future federal grant money—to the tune of $300 million in principal repayment and $138 million in interest payments.  In addition, Kentucky’s tolls are pledged to pay off both Kentucky revenue bonds of $272 million and a federal TIFIA loan of $452 million.  The debt service for these two borrowings is back-loaded, i.e., very low in the first few years of the project, but then steadily escalating.

This back-loading means the financial plan for the I-65 bridges project essentially sends the bill to the region’s future residents.  Essentially, Kentucky has borrowed most of the money to construct the project and arranged for loans with a series of “balloon payments” in later years.  Kentucky “back-loaded” the repayment of principal on both its own revenue bonds and its borrowings from the federal government’s TIFIA program.  It pays interest only, or just a token amount of the principal, for these loans in the first few years, and then required payments steadily escalate in later years.  Debt-service obligations start at less than $10 million per year, and then balloon to more than $80 million annually in the early 2020s.

In theory, the escalating repayment can be met by growing toll revenues, from some combination of toll rate increases and growing traffic.  Toll rates have increased steadily at 2.5 percent per year, but as the traffic counts show, volume has flat-lined.  The artificially low repayments in the first few years of the project create the illusion that toll revenues are sufficient to cover debt service payments, but as required payments steadily escalate over the next few years, Kentucky will find it increasingly difficult to meet its repayment obligations.

This looming mountain of debt service obligations has already prompted Kentucky to refinance part of its debt, essentially kicking the can further down the road for repayment of the cost of the I-65 bridges.  The refinancing plan essentially doubles down on the earlier back-loading strategy, borrowing more money now to make these payments, and extending the period for repayment further.  Instead of paying off its “first tier revenue bonds” in 2045, Kentucky is extending the term of the repayment by 8 years, to 2053.  And like the initial borrowing, these refunding bonds are mostly “interest-only” for the next 25 years, with nearly all of the principal being repaid after 2045.  As a result, the borrowers will pay almost as much in interest charges ($182 million over the life of the loan) as they pay in principal ($192 million).
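The imbalance in the refinancing is easy to quantify from the figures above:

```python
# Refinancing terms described above: interest nearly equals principal.
interest = 182e6      # lifetime interest on the refunding bonds
principal = 192e6     # principal repaid
original_maturity, extended_maturity = 2045, 2053

print(f"interest = {interest / principal:.0%} of principal repaid")      # 95%
print(f"repayment extended by {extended_maturity - original_maturity} years")
```

In other words, for every dollar of bridge the refinancing pays off, it adds roughly 95 cents of interest, and pushes the final bill out to 2053.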

Traffic on I-65 will never get back to pre-tolling levels

Whether toll revenues will be sufficient to repay these bonds hinges on whether you believe the latest forecast from Kentucky’s consultant, Steer & Company, prepared as part of the latest re-financing plan.  This new forecast predicts that total annual transactions (the number of vehicles using the project) will increase from about 30 million today to about 48 million by 2053.  What this means is that the I-65 bridges will effectively never recover to the level of traffic they had before the crossing was widened.  Currently, as noted above, the bridges are carrying about 60,000 vehicles per day.  The latest Steer forecast is that traffic will increase from that level by about 60 percent over the next three decades (48 million = 1.6 x 30 million).  This means that in 2053, the bridges will be carrying about 96,000 vehicles per day (which, ironically, is about the same level that the other toll consultant, CDM Smith, predicted for 2020 in the projections that were originally used to justify project financing).  On the following chart, the red line is the FEIS traffic forecast for I-65, the black line is the actual level of traffic according to INDOT, and the blue line is the growth in traffic forecast by Steer & Company for the refinancing plan.
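The growth arithmetic in the Steer forecast can be laid out explicitly, using the figures from the text:

```python
# Growth implied by the Steer refinancing forecast.
annual_now = 30e6       # annual toll transactions today
annual_2053 = 48e6      # Steer forecast for 2053
daily_now = 60000       # current daily traffic on the bridges

growth = annual_2053 / annual_now      # 1.6x over roughly three decades
daily_2053 = daily_now * growth        # implied daily traffic in 2053

print(f"growth factor: {growth:.1f}x")
print(f"implied 2053 traffic: {daily_2053:,.0f} vehicles/day")  # 96,000
```

That implied 2053 figure, about 96,000 vehicles per day, is still well below the roughly 130,000 the old six-lane bridge carried before tolling.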

 

Comparing these new refinancing estimates with the rosy projections used to justify the project in the first place shows the profound gap between highway boosters and reality.  The project was sold on an environmental impact statement forecast that in the “no action” scenario (with just the single six-lane Kennedy Bridge), traffic on I-65 across the Ohio River would grow to 178,600 vehicles per day, reaching 142 percent of its “capacity.”  This estimate implies that the capacity of the six-lane Kennedy Bridge is 125,000 vehicles per day.  (As traffic expert Norm Marshall has shown, these “over-capacity” estimates amount to forecasting the impossible, but neatly serve the interests of highway advocates.)

Doubling the size of the I-65 crossing was needed, according to the project’s supplemental EIS, in order to assure that the project could carry about 185,000 vehicles per day when completed:

Specifically, the combination of new bridges in the Downtown and Far East corridors would result in the Kennedy Bridge operating at 74 percent of capacity in 2025.

With a capacity of 250,000 vehicles per day (double the 6-lane Kennedy Bridge), the 74 percent of capacity implies a forecast level of travel of 185,000 vehicles per day in 2025.  Now, based on the impact of tolling, it’s doubtful that the bridges will ever carry more than a third of their designed capacity, and the much lower level of traffic on I-65 predicted for the 2050s shows that the project was a colossal blunder, wasting a billion dollars.  It’s a blunder that future Louisville area residents will be paying for—for decades to come.
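The capacity claims quoted above imply some simple arithmetic worth making explicit; a sketch using the EIS figures:

```python
# Capacity arithmetic implied by the EIS figures quoted above.
kennedy_capacity = 125000                 # implied capacity of the 6-lane bridge
doubled_capacity = 2 * kennedy_capacity   # 12 lanes after the second span

# "74 percent of capacity in 2025" implies this forecast volume:
forecast_2025 = 0.74 * doubled_capacity
print(f"implied 2025 forecast: {forecast_2025:,.0f} vehicles/day")   # 185,000

# The no-build forecast of 178,600 vehicles/day against 6-lane capacity:
no_build = 178600
print(f"no-build traffic: {no_build / kennedy_capacity:.0%} of capacity")
```

The no-build ratio comes out to about 143 percent (the EIS rounds it to 142); either way, it is the physically impossible forecast that sold the project.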


The opposite of planning: Why Metro should stop the I-5 bridge con

Portland’s Metro regional government would be committing planning malpractice and enabling lasting fiscal and environmental damage if it goes along with state highway department freeway widening plans

  • The proposed $5 billion, 5-mile long, 12-lane freeway I-5 bridge project is being advanced based on outdated traffic projections using 2005 data.
  • ODOT is pushing freeway plans piecemeal, with no acknowledgement that they are creating new bottlenecks.
  • Freeway plans fail to address climate change and don’t acknowledge that new capacity will produce additional travel and increased greenhouse gases.
  • I-5 bridge plans are inconsistent with adopted state, regional and city commitments to use road pricing to manage demand, which would obviate the need for expensive capacity.
  • ODOT and WSDOT have not produced a viable financing plan for the project, which would be the region’s most expensive, and which has a $3.4 billion financial hole.

In theory, Portland has a smart approach to regional planning.  It has a directly elected regional government, with strong planning authority over transportation and land use.  That government claims to care deeply about the climate crisis, and regularly touts the sophistication of its transportation modeling team.  And it says it’s looking at how the whole system works to make Portland a greener more just place.

But when it comes to the single largest transportation investment in the region, a proposed $5 billion 5-mile long, 12-lane wide freeway project across the Columbia River, it’s simply abdicating its responsibility and betraying its stated principles.

Next month, the Metro government is being asked to approve $36 million in additional funds for further planning of this massive freeway project.  It should say no.

Supersize Me: The planned $5 billion widening of I-5 (Courtesy:  Bike Portland)

 

This approval is one more brick in the wall of an even larger freeway building plan.  The Oregon Department of Transportation is pushing an entire series of freeway widening efforts, including the $1.2 billion Rose Quarter project, $5 billion for the mis-named “I5 Bridge Program” and billions more for widening I-205 and I-5 at the Boone Bridge in Wilsonville.

In theory, a regional planning agency would be guided by current, accurate data and scientifically based models.  It would insist on knowing how each project fits into a larger, long-term vision of how roads and the transportation system would work.  It would insist on knowing what a project will cost, how it will be paid for, and who will pay for it.  And if it has committed itself to pricing roadways, it should know how pricing will affect demand before it commits billions on capacity that may not be needed or valued.  And if it is serious about its oft-repeated commitments to tackling climate change, it should insist that its investments actually result in fewer vehicle miles traveled and less greenhouse gases.

In practice, Portland Metro has done none of these essential things as it has considered the I-5 Bridge.

No forecasts. Most fundamentally and technocratically, ODOT has not prepared, and Metro has not reviewed or analyzed, current traffic forecasts that show the actual and projected demand for the I-5 bridge. The foundation of any road project is an estimate of the future level of traffic the roadway is expected to carry.  Just last week, the staff working on the bridge project admitted that after more than two years of work to revive the failed CRC, they have no new traffic forecasts, and won’t have any for at least another year.  That hasn’t stopped them from claiming that they know just how big the project should be (they say “ten lanes”) and from claiming that other alternatives won’t meet the project’s purpose and need.  (As we’ve noted before, the two DOTs may claim it’s a “ten lane” project, but they’re planning on building a structure that would easily accommodate a dozen freeway lanes).

The last traffic projections prepared for the I-5 bridge as part of the project’s environmental impact statement date back more than a decade, and are based on data from 2005.

Ruling out alternatives and deciding on the size and design of a highway project before preparing and carefully vetting traffic forecasts is the opposite of planning.

No comprehensive look:  building a badder bottleneck for billions.  As noted earlier, the I-5 bridge project is just one of a series of Portland-area freeway widenings.  Metro should be asking what the region, its environment, and its transportation system would look like with and without all these projects.  Instead, it is considering them only piecemeal.

In effect, this approach amounts to approving the creation of new bottlenecks on the freeway system that will undoubtedly trigger efforts to widen freeways yet again in the future.  The I-5 bridge project would widen I-5 from six to as many as twelve lanes from Vancouver to Victory Boulevard (the project claims it’s just ten lanes, but in the past it has lied about the actual physical width of the project it plans to build).  ODOT is also planning to widen I-5 at the Rose Quarter to as many as ten lanes.  Once these two I-5 projects are complete, a new bottleneck will be formed between them in the three-mile-long, six-lane-wide section of I-5 between the Fremont Bridge and Victory Boulevard, with 12 lanes feeding into six at the north, and 14 lanes (I-5 plus I-405) feeding into this stretch of freeway from the south.  ODOT will then no doubt call for the construction of further “auxiliary” lanes to carry traffic between exits on this newly bottlenecked segment of I-5.  In essence, ODOT is building very large funnels at either end of this six-lane stretch of I-5 North, which will predictably lead to further traffic congestion, more pollution, and additional demands to waste billions of dollars widening roads to accommodate this traffic.

As Metro staff noted in their comments on the I-5 Rose Quarter project, the Oregon Department of Transportation routinely lies about the fact that it is expanding freeway capacity.  It wrote of ODOT’s claim that it was not expanding I-5:

This statement is not objectively true and is potentially misleading; auxiliary lanes clearly add capacity.

Piecemeal reviews that approve segments on an ad hoc basis, and don’t consider the long-term effects of encouraging even more car traffic are the opposite of planning.

Not following through on fighting climate change.  The original CRC was conceived with no notion of the seriousness of the climate challenge.  The proponents of the new I-5 bridge have steadfastly opposed incorporating climate considerations in the project’s purpose and need statement.  It’s clear from their choice of alternatives (every one includes at least ten lanes of freeway), and their claims that the inclusion of sidewalks, bike paths and transit somehow makes the project “climate friendly,” that nothing has changed with this new iteration of the project.  Never mind that the authoritative Induced Travel Calculator, based on research from the University of California Davis, shows that expanding I-5 to 10 or 12 lanes for five miles would add 155 to 233 million miles of driving and 800,000 to 2.5 million tons of greenhouse gases.  Freeway widening would worsen the climate crisis.

Of course, these calculations don’t include the effects of congestion pricing.  Tolling I-5, which will be needed to pay for this project, would likely reduce and divert traffic (as we explain below), and ODOT’s own consultants show that tolling would reduce I-5 traffic by enough to entirely eliminate the need for widening I-5 at all.  If the project manages somehow not to be tolled (as many in Clark County want) it would tend to produce vastly more traffic and pollution, as estimated by this calculator.

At $5 billion, the proposed I-5 bridge project is the largest single spending item in the Regional Transportation Plan.  If Metro isn’t going to undertake a serious appraisal of the greenhouse gas impacts of building or not building this freeway then it doesn’t really have a climate strategy.

Metro is officially on record as supporting efforts to address climate change.  Metro has said it wants to reduce greenhouse gases by 20 percent by 2035.  But so far, its efforts have yielded no decline in emissions.  And greenhouse gas emissions from transportation  in metro Portland have actually increased by 1,000 pounds per person in the past five years.  Metro has so far done nothing, and this and other freeway widening projects will make pollution worse.

At best, the I-5 bridge advocates pay lip service to climate issues, completely ignoring the effects of added road capacity on likely travel volumes and greenhouse gases, and instead making vague and unquantified claims that pedestrian and bike facilities on the bridge, plus transit improvements will somehow ameliorate the climate damage done by doubling freeway capacity.

Approving funding for a climate polluting freeway widening project, and failing to insist on developing a more climate friendly alternative way of spending $5 billion is the opposite of planning, and a betrayal of Metro’s stated climate commitments.

A failure to plan for road-pricing.  The State Legislature, ODOT, the City of Portland and Metro have all said that road pricing will be a key component of the region’s future transportation system.  Pricing can help better manage roadways, reduce peak hour traffic, lower the need for additional capacity, and provide funding for maintenance and equitable alternatives.  Metro should not approve a $5 billion freeway project without a clear idea of how the project integrates with a system of road pricing—and yet ODOT and WSDOT have done essentially nothing to integrate these two concepts.

ODOT faces a profound dilemma with regard to road pricing.  Its financial analysis counts on at least $1.4 billion in revenue from tolling the I-5 bridge.  But the project is being sized and designed as if it needs to handle 180,000 vehicles per day, based on traffic projections—outdated, using 2005 data, and built using a model that ODOT concedes can’t address the effect of tolling.

But imposing tolls will profoundly reduce traffic growth.  ODOT’s own consultants, in work completed after the CRC FEIS, have said that the proposed tolls on the I-5 bridges would reduce traffic levels on the bridge from their current level of approximately 130,000 trips per day to only 85,000.  (And this is a firm that routinely over-estimates traffic on toll roads.)  Road pricing could dramatically reduce the need for expensive infrastructure.  Yet ODOT has not incorporated the traffic-reducing effects of tolling into its design or alternatives analysis.  It is treating tolling purely as a financial afterthought:  a way to pay for an over-sized roadway after borrowing billions of dollars to build it.  That’s exactly what Louisville did with a remarkably similar project (widening I-65 from 6 to 12 lanes across the Ohio River); there, $1 tolls caused traffic to fall by almost half.

Louisville’s I-65 bridges at rush hour: $1 tolls eliminated tens of thousands of daily trips

If Metro were to demand that road pricing be implemented before squandering billions on this project, it would likely find that the region had more than adequate transportation capacity across the Columbia River. A region that says it is going to implement road pricing doesn’t commit to a multi-billion dollar freeway project based on outdated projections, and subsidize expensive freeway capacity that won’t be needed in a world with pricing. Going deeply into debt for a megaproject and failing to consider how paying for it will reduce traffic is the opposite of planning.

No financial viability.  At $5 billion or more, this will be the most expensive transportation project in this region for the next couple of decades.  In theory, the project should be part of the region’s “financially constrained” regional transportation plan, but the budget documents prepared by the state DOT staffs show that they don’t know the actual cost of the project, and that there is a massive $3.4 billion hole in the project’s budget.  Moving ahead with no clear idea of how the project would be paid for is the opposite of planning.

The original CRC effort foundered a decade ago because there was no stomach for its excessive costs in either Oregon or Washington.  Congressman Peter DeFazio famously declared the project to be a gold-plated monstrosity.   In the Oregonian on August 14, 2011, Representative DeFazio said:

“I kept on telling the project to keep the costs down, don’t build a gold-plated project,” a clearly frustrated DeFazio said. “How can you have a $4 billion project? They let the engineers loose, told them to solve all the region’s infrastructure problems in one fell swoop… They need to get it all straight and come up with a viable project, a viable financing plan that can withstand a vigorous review.”
(Manning, Jeff. “Columbia River Crossing could be a casualty of the federal budget crunch”, The Oregonian, August 14, 2011).
Later, Representative DeFazio told Oregon Public Broadcasting:
“I said, how can it cost three or four billion bucks to go across the Columbia River?  . . . The Columbia River Crossing problem was thrown out to engineers, it wasn’t overseen: they said solve all the problems in this twelve-mile corridor and they did it in a big engineering way, and not in an appropriate way.”
“Think Out Loud,” Oregon Public Broadcasting, August 18, 2011.

Ten years ago, the two state DOTs squandered nearly $200 million on planning without first securing the needed funds for the project, and they are repeating this exact failed strategy today.  Now, after two years of work to revive the project, they have not developed a definitive financial plan, and their estimates of Oregon’s needed contribution have inexplicably jumped by more than $150 million in a month.  ODOT and WSDOT are spending millions—$200 million is planned for staff and consultants before this project breaks ground—with no clear idea of how this will be paid for.

This amendment adds $71 million to the preliminary engineering (PE) phase of the IBR Program. With this change, the total available budget will change to $80 million ($45M from Oregon and $35M from Washington). The estimated PE cost to complete NEPA for the IBR program is approximately $135 million based on a completion of a supplemental environmental impact statement (SEIS) in mid-2024. Following NEPA completion, the IBR program will develop a program delivery plan and progress with right-of-way acquisitions and final design to prepare for the start construction in late 2025. The estimated PE cost for progressing final design to start the first phase of construction is estimated at approximately $70 million. In summary, the total estimate of PE to begin the first phase of construction is estimated to be approximately $205 million. This estimate is contingent on the scope of the IBR solution, as agreed to by program partners, that will be evaluated through the SEIS along with the scope of the program’s first construction phase. Right-of-way costs and construction costs are not included in this budget estimate.

[Chris Ford, Memo to Metro TPAC, “I-5: Columbia River (Interstate) Bridge: Requested Amendment to the 2021-24 Metropolitan Transportation Improvement Program,” Oregon Department of Transportation, September 24, 2021 (hereafter ODOT/Ford Memo), page 6. Emphasis added.]

The prospect for Build Back Better and a national infrastructure funding package is no reason to move ahead with a misguided, environmentally destructive bridge project.  Oregon and Washington will get their share of these monies whether they build this project, or whether they choose to use these funds more wisely.  A regional government that cared about the future would ask “what is the smartest possible use of $5 billion” rather than approving this project.

Cannibalizing maintenance to pay for megaprojects.  This project is a particularly egregious example of how state DOTs beg for money by complaining that they don’t have enough to fix potholes, but then use any additional revenue they can find to build massive new projects that simply increase the maintenance burden.  ODOT is literally asking Metro to approve the reallocation of funds that would otherwise be used for maintenance to pay for planning the megaproject.

ODOT is reducing money for road maintenance and repair to hire consultants for this megaproject.  ODOT’s own memo makes this clear.

This project change requires adjustment to the fiscally constrained RTP. Funds from the fiscally constrained Fix-It buckets in the RTP will be reduced to allow for the $36M ODOT funds to be advanced on this project. Memo with details was sent to Metro 9/17/21 by Chris Ford. We find the analysis is still applicable with the addition of WDOT funds since RTP focuses on Oregon revenue only.

[ODOT/Ford Memo, page 12. Emphasis added.]

Diverting money from maintenance funds to pay for a megaproject is the opposite of planning.

This is a pivotal moment for Metro.  As former Metro President David Bragdon (who guided the agency through the original Columbia River Crossing) wrote in retrospect:

Leadership at ODOT frequently told me things that were not true, bluffed about things they did not know, made all sorts of misleading claims, and routinely broke promises. They continually substituted PR and lobbying gambits in place of sound engineering, planning and financial acumen, treating absolutely everything as merely a challenge of spin rather than matters of dollars or physical reality.

That history is important, because if you’re not honest about the patterns of the past, you are doomed to repeat them. Unfortunately, I understand that’s exactly what’s going on with the rebranded CRC: the same agencies, and even some of the same personalities who failed so spectacularly less than a decade ago – wasting nearly $200 million and building absolutely nothing – have inexplicably been rewarded for their failure by being given license to try the very same task, using the very same techniques of bamboozlement.

Metro has a choice: It can repeat the mistakes of the past and bow to the wishes of an entrenched highway building bureaucracy, or it can do its job, and live up to its professed values.  It can plan.  It can insist on accurate travel projections, it can demand a definitive finance plan, it can require that freeway construction be addressed comprehensively, rather than piecemeal, it can require that the vision for capacity be integrated with congestion pricing, and it can require a full financial plan before squandering more on planning this speculative project.  And above all, it can insist that the region’s next multi-billion dollar transportation project reduce greenhouse gases, rather than increase them.  Anything less would be the opposite of planning.

Oregon, Washington advance I-5 bridge based on outdated traffic projections

The Oregon and Washington Departments of Transportation are advancing their $5 billion freeway widening plan based on outdated, 15-year-old traffic projections. No new projections have been prepared since the 2007 estimates used in the project’s Draft Environmental Impact Statement.

The two state DOTs are essentially “flying blind,” assuming that outdated traffic projections provide a reasonable basis for sizing and designing a new bridge, and for rejecting other alternatives.

The two agencies have spent two years and tens of millions of dollars, but have not done the most basic preliminary work to accurately predict future traffic levels.

The Oregon DOT has specifically violated Governor Kate Brown’s pledge that new traffic analyses would be done prior to determining the “best solution” for the I-5 bridge project.

The two agencies have no plans to publish new traffic studies until mid-to-late 2022—months after determining a final design and asking local sponsors to approve it.

The justification for spending upwards of $5 billion on a massive expansion of the I-5 freeway between Vancouver and Portland—a project misleadingly branded as a mere “bridge replacement”—is the notion that there will be a huge increase in traffic between the two cities.  That notion is based on traffic forecasts prepared by the Oregon and Washington Departments of Transportation.  As with all transportation projects, estimates of future demand are key to deciding whether projects are needed and justified, to determining how they’ll be designed and what they’re worth, and, critically, to assessing their environmental impacts.

Traffic Projections for the I-5 Bridge are based on 15-year-old data

When it comes to the “I-5 Bridge Replacement Project,” which has been underway for more than two years, there are no new traffic projections.  The latest traffic numbers the Oregon and Washington Departments of Transportation have are from the project’s Final Environmental Impact Statement, published in 2011.  They predict that without the project, traffic on the I-5 bridge will increase to 184,000 vehicles per day, producing high levels of congestion.

Columbia River Crossing Final Environmental Impact Statement, 2011, Chapter 3, page 3-30.


These numbers are the same as were presented in the project’s Draft Environmental Impact Statement, published in 2008.  In fact, the traffic analysis for the project was completed in 2007, and is based on traffic data gathered in 2005.


Columbia River Crossing Draft Environmental Impact Statement, 2008, Traffic Technical Report, page 47.

How, it is reasonable to ask, is it possible to plan for a $5 billion project without bothering to update the most fundamental data used to design, justify, and evaluate the environmental impacts of the project?

Traffic Projections are Central to Project Design and Environmental Impact

The basis for any major transportation investment is some sort of careful statistical analysis to project future travel volumes.  How many people might travel in a region or a corridor, and what are the various options for accommodating their travel?  The statistical models used to generate these data should, in theory, inform the design of particular alternatives and shape the choices.  In particular, traffic forecasts are essential to evaluating the environmental effects of alternatives:  which alternative will have lower levels of pollution?

We have many concerns about the quality of, and the biases built into, the models used by state Departments of Transportation, but without a doubt these statistical estimates are, in theory, the intellectual foundation for any claims about the need for a project.  Without traffic estimates, highway engineers are simply predicating key project decisions on their personal opinions rather than demonstrated facts.  In this case, the engineers guiding the I-5 bridge project are engaged in nothing more than faith-based project planning.

For the past two years, the Oregon and Washington Departments of Transportation have been trying to revive the corpse of the Columbia River Crossing, a multi-billion dollar boondoggle that died in 2014.  In the process, they’ve told a series of lies, beginning with the false claim that unless they move forward with the moribund project, they’d have to repay $140 million in federal money spent on planning the original project.  (That’s not true!)

In September, the staff of the misnamed “Interstate Bridge Replacement Program” debuted their final and definitive list of project alternatives.  Every one of them is centered on something labeled as a ten-lane bridge, with typical illustrations like this:

If the past is any guide, the agency will draw pictures of a ten-lane bridge, but then size it to accommodate 12 or 14 lanes of traffic—exactly what it did with the failed Columbia River Crossing.  In reality, the project is likely to look like these renderings of the Columbia River Crossing: a 12- or 14-lane, five-mile-long freeway.

In the process, the staff has ruled out a range of other alternatives, like improving transit, instituting pricing, improving local connections, and constructing a supplemental bridge rather than a replacement.  The staff published a series of memos in August 2021 claiming, based on technical work done by the original CRC process more than a decade ago, that these alternatives “failed to meet the project’s purpose and need,” the first item of which is “growing travel demand and congestion.”  Whether any of these alternatives can meet “growing travel demand” and result in lower congestion depends critically on the assumptions one makes about future levels of traffic.  Similarly, the as-yet-unresolved question of how wide the bridge needs to be hinges on these same traffic forecasts.

For two years, ODOT has disobeyed Governor Brown’s order to prepare new forecasts first

The need for updated forecasts was recognized when the project was revived in 2019.  At the time, Governor Kate Brown promised that a first order of business would be revised forecasts to shape the project.  On November 18, 2019, Brown said:

“I think what else is key is that we’re going to be doing a traffic analysis ahead of time to help us determine what’s the best solution for the I-5 Bridge Replacement Project.”

Clearly, Governor Brown envisioned that the traffic study would come first—”ahead of time”—and that the data would shape the decision.  But that’s not what has happened.  Let’s turn the microphone over to Clark County Today, which specifically asked the managers of the bridge project about the status of the traffic projections originally promised in 2019.

It is now almost two years later. Has the IBRP team conducted a new traffic analysis to determine what’s the best solution for the I-5 Bridge replacement project? Clark County Today asked for the details of any traffic analysis.

“The Interstate Bridge Replacement (IBR) program is currently collecting new traffic data and conducting preliminary traffic modeling that will be used to inform the evaluation of preliminary design options that will be considered to identify the IBR solution early next year,” said Frank Green, IBR assistant program administrator. “More in-depth traffic modeling is expected to be completed in mid to late 2022 as a critical component of the federal environmental review process.”

The IBRP team has no plans to release forecasts until after making design decisions

That timetable was confirmed at the December meeting of the project’s Executive Steering Committee.  The project’s schedule calls for developing a resolution defining a “locally preferred alternative” by April 2022, and securing endorsements of that solution by June 2022.


Meanwhile, ODOT and WSDOT have no plans to complete serious traffic modeling—which would address the impact of tolling on traffic levels—for two or three more years.  In their November presentation on tolling plans, the agencies made it clear that they are putting off serious “investment grade” forecasts (like the ones made for the CRC, which showed traffic on I-5 would never recover to pre-construction levels) until 2025.

What this means in practice is that the only traffic projections the project has are the ones prepared for its original environmental analysis.  These were published in 2008, based on 2005 base-year data.  As a practical matter, ODOT and WSDOT are planning this bridge on data that is now more than 15 years out of date.

Pushing for a decision before updating traffic forecasts is engineering malpractice, and violates NEPA

When the project managers say they need to build a bridge at least 10 lanes wide, that claim rests on these outdated projections, rather than on current, accurate information.  This isn’t so much fact-based engineering as it is faith-based speculation.  They’ve decided the bridge needs a minimum of ten travel lanes without first doing a traffic forecast.  The I-5 bridge project’s manager has made it clear for nearly a year—well in advance of any technical analyses or any new traffic information—that they’ve already decided what they’re going to do.  Clark County Today summarized a January 2021 presentation by Greg Johnson:

During discussions at Monday’s EAG meeting, Administrator Johnson made the following statement.
“One of the things that I also tell folks, if you’re here and you think we’re going to talk about a third bridge, or we’re going to talk about not doing the Interstate Bridge, you’re in the wrong meeting.  The whether we’re gonna do this has been decided. “

John Ley, “Revelations surface from the two ‘advisory’ group meetings on the Interstate Bridge,” Clark County Today, January 28, 2021.

Saying that the project must be ten lanes wide, or claiming that other alternatives don’t adequately meet the project’s stated purpose and need, is, in the absence of traffic forecasts, simply arbitrary and capricious.  The fact that project managers have repeatedly declared, before even undertaking any traffic analysis, that no other options will be considered and that the bridge will be ten lanes wide shows that they have no intention of letting the data drive their decisions, and signals that they will cook the modeling to justify this pre-made decision about the project’s size and scope.

Concealing or lying about traffic models is nothing new for ODOT.  When it released its environmental assessment for the I-5 Rose Quarter freeway widening project, it entirely omitted any data on “average daily traffic”—the most basic yardstick of travel volumes—and purposely concealed the fact that its base-year 2015 traffic volumes rested on the entirely fictional assumption that the I-5 Columbia River Crossing had already been built.  As we’ve said, this is the opposite of planning.


Here’s what’s wrong with Oregon DOT’s Rose Quarter pollution claims

10 reasons not to believe phony DOT claims that widening highways reduces pollution

We know that transportation is the largest source of greenhouse gas emissions in the US, and that our car-dependent transportation system is the reason Americans drive so much more, and consequently produce far more greenhouse gases per capita, than residents of other wealthy countries.  Scientists have shown that building more and wider roads stimulates more driving, longer trips, and more decentralized land use patterns, reinforcing car dependence.

With this entire vicious cycle well documented, it’s hard to imagine anyone arguing that a widened urban freeway would be good for the environment, but for state DOTs and their paid apologists, it’s a frequent claim.  They’ve created trumped-up projections claiming that traffic and pollution will be greater if we don’t build freeways.  These are false claims, and today we take a close look at how this plays out in one egregious, if typical, project.

For years, we’ve been following the Oregon Department of Transportation’s proposed I-5 Rose Quarter freeway widening project.  The project would widen a mile-and-a-half-long stretch of Interstate 5 in downtown Portland, at a cost that has recently ballooned to $1.2 billion.

A key part of the agency’s argument is that this freeway widening project—exactly unlike every other one that has ever been undertaken—will have essentially no impact on air pollution or greenhouse gases.  The agency makes the fanciful claim in its Environmental Assessment that not widening the freeway (the “no-build” option) will somehow produce more pollution than the eight- or ten-lane freeway its plans show it really intends to build.  In this commentary we sketch ODOT’s claims, and present a 10-point rebuttal.

A long list of false environmental claims from Oregon DOT

Recently, a Portlander interested in the project contacted us, asking us to comment on ODOT’s Environmental Assessment, which makes these claims:

  • Traffic operations would improve on I-5 in both the AM and PM time periods, . . .
  •  Conditions for pedestrians and bicyclists would improve from increased travel route options, improved ramp terminal intersections, physical separation from motorized users, and reduced complexity of intersections.
  • Overall, the Regional Travel Demand Model results did not indicate trip increases on I-5 much beyond the Project limits (i.e., no induced demand). The 5 to 14 percent trip increase on I-5 within the Project Area is expected for an auxiliary lane project intended to improve flow between entrance ramps and exit ramps and is indicative of primarily local through-traffic.
  • While consideration of greenhouse gas emissions and the effects of climate change has not been a NEPA requirement for EAs and EISs since the Council on Environmental Quality (CEQ) withdrew its previous guidance on April 5, 2017, ODOT included an analysis of climate change in the Project EA due to the high level of agency and stakeholder interest in these issues. As reported in Section 3.5 of the EA, the 2045 operational greenhouse gas emission total for the Build Alternative is projected to decrease by approximately 22 percent compared to the 2017 emission total due to federal, state, and local efforts to develop more stringent fuel economy standards and vehicle inspection and maintenance programs and the transition to cleaner low carbon fuels for motor vehicles. These trends are expected to continue over the life of the Build Alternative. The Build Alternative would contribute to this reduction due to higher speeds, less stop-and-go traffic, and less idling on I-5. Therefore, no mitigation is proposed.

Ten reasons not to believe Oregon DOT’s false claims

There is so much that is false and misleading about these claims about traffic, air pollution and greenhouse gases that it’s difficult to know where to begin.  We’ve written about all these phony claims at City Observatory.  Here are ten reasons why everyone should ignore ODOT’s environmental analysis of this project.

1. Traffic projections assume that a five-mile-long, 12-lane-wide freeway was built just north of this project in 2015.  Hidden in the Rose Quarter’s traffic forecasting is an assumption that the massive, multi-billion dollar Columbia River Crossing was built as part of the “no-build” and finished five years ago, even though the project is still in limbo in 2021.  This inflates traffic and congestion in the Rose Quarter under the “no-build,” and makes the “build” look better than it is.

2. ODOT concealed plans that show it is widening the I-5 roadway enough to accommodate 8 or 10 lanes of traffic.  Two years after ODOT published the environmental assessment, we uncovered the true plans for a 160-foot roadway.  But its traffic modeling assumes that the freeway is expanded only from four to six lanes.  Modeling an 8- or 10-lane road would show much more traffic and pollution.

Secret ODOT documents showed plans for a 160-foot roadway, enough for a 10-lane freeway.

3. ODOT’s Rose Quarter forecasts are completely inconsistent with the forecasts it prepared for the CRC and as part of its own road pricing work.  Those forecasts show much lower traffic on I-5 in the Rose Quarter in the “no build” scenario.  By inflating the base case and ignoring induced demand, the Rose Quarter forecasts cook the GHG and pollution estimates and hide the negative impacts of the project.  (For details, see the section labeled “Two sets of books” at the end of this commentary.)

4. ODOT frequently claims that “pollution will be lower in the future”—but this is entirely due to assumptions about a cleaner vehicle fleet (more electric vehicles, tougher mileage standards for remaining internal combustion cars).

“. . . operational greenhouse gas emission total for the Build Alternative is projected to decrease by approximately 22 percent compared to the 2017 emission total .  . .”

This is a classic red herring:  you get these emission reductions whether you build this project or not, so they are simply irrelevant to deciding which option to choose.  And, for what it’s worth, neither electric vehicle adoption nor higher fuel economy standards is accomplishing as much as ODOT hoped; in fact, greenhouse gas emissions from driving in Portland are up by 1,000 pounds per person.

5. In response to these criticisms, ODOT routinely claims that its air quality analysis was validated by a “peer review panel.”  The panel was a whitewash:  it wasn’t provided with any of the critiques of the traffic models and air quality analysis, held no public meetings, and explicitly chose to ignore road pricing, which the panelists admitted could greatly affect project outcomes.  Former ODOT Director Grace Crunican, who ran the review, testified that the group didn’t look at the traffic projections to see if they were reasonable or accurate; they just took them at face value.  Phony traffic numbers generate phony air quality estimates.

6. There is a strong scientific consensus on induced demand, with multiple studies in the US, Europe and Japan.  Wider roads mean more travel.  ODOT and other highway agencies simply ignore the science.

When pressed, professional staff at ODOT and PBOT admit this project will do nothing to reduce daily “recurring” congestion at the Rose Quarter—invalidating claims that it will produce less idling.

7. At City Observatory, we’ve developed a version of the induced travel calculator created by the University of California Davis to estimate greenhouse gas emissions from the Rose Quarter project.  Its verdict:  Widening the roadway will increase emissions by adding 17.4 to 34.8 million miles of vehicle travel and 7.8 to 15.5 thousand tons of greenhouse gases per year.
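For readers who want to see the mechanics, here is a minimal, illustrative sketch of how an induced-travel emissions estimate of this kind works: added lane capacity induces added vehicle miles, which are then multiplied by an average per-mile emission factor. The elasticity and emission factor below are our own illustrative assumptions for demonstration, not the actual parameters of City Observatory’s or UC Davis’s calculator.

```python
# Illustrative sketch of an induced-travel emissions estimate, in the spirit
# of the UC Davis induced travel calculator. Parameter values are assumptions
# chosen for demonstration, not the calculator's actual inputs.

def induced_vmt(base_vmt, pct_increase_lane_miles, elasticity=1.0):
    """Added annual vehicle miles traveled (VMT) from a capacity expansion.

    A long-run elasticity near 1.0 implies a 10% increase in lane-miles
    yields roughly a 10% increase in VMT on the expanded facility.
    """
    return base_vmt * (pct_increase_lane_miles / 100.0) * elasticity

def added_co2_metric_tons(added_vmt, grams_per_mile=404):
    """Convert added VMT to metric tons of CO2, using an assumed fleet
    average of roughly 400 grams of CO2 per vehicle mile."""
    return added_vmt * grams_per_mile / 1e6

# If widening induces 17.4 million extra miles of driving per year (the low
# end of the range above), the associated emissions are on the order of:
print(f"{added_co2_metric_tons(17_400_000):,.0f} metric tons CO2/year")
# → 7,030 metric tons CO2/year
```

The exact tonnage depends on the emission factor and on whether short or metric tons are used, but the structure of the calculation (more capacity, more miles, more emissions) is the point.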

8. Highway engineers love to pretend that greenhouse gases are caused primarily by cars idling in traffic, and that we can fight global warming by getting cars moving faster.  That’s a myth:  improving traffic flow generates more total miles of travel, which overwhelms any savings from less idling.  Also, cars generate more GHG per mile at speeds over 50 MPH than below (something ODOT never mentions).  Wider, faster freeways mean both more vehicle miles traveled and more greenhouse gases generated per mile traveled.  This has been validated scientifically by a paper published by two scholars at Portland State University.

9. Like most state DOTs, the Oregon DOT uses an outdated and flawed traffic modeling approach that fails to accurately incorporate the effects of induced demand.  These static, four-step models consistently over-estimate traffic levels and congestion in no-build scenarios, and under-estimate or completely ignore the added travel induced by creating more capacity.

10. Globally, the only strategy that’s convincingly been shown to lower congestion is road pricing, which the Oregon Legislature approved for this stretch of I-5 in 2017.  Oregon DOT failed to examine road pricing as an alternative in its 2019 Environmental Assessment.  ODOT’s own consultants say pricing I-5 would obviate any need to add lanes at the Rose Quarter.

Governor Brown ordered ODOT to look at road pricing as part of its environmental review of the project in late 2019; but the agency has simply ignored her instruction.

ODOT is keeping two separate sets of books for its I-5 traffic estimates.

There’s no question that the traffic estimates created to sell the Rose Quarter project were rigged to make the “No Build” look worse.  At the same time it generated the Rose Quarter forecasts, ODOT hired another firm to estimate future traffic on this same stretch of roadway in 2027.  It came up with dramatically lower levels of I-5 traffic in the “no-build” world.

In May 2018, at the same time it was preparing I-5 forecasts for the Rose Quarter project, ODOT also contracted for modeling of I-5 traffic for the legislatively adopted congestion pricing plan. The results are contained in a report from ODOT: https://www.oregon.gov/ODOT/Value%20Pricing%20PAC/VP_TM3-Final-InitialConceptEvaluation.pdf

These data include baseline estimates of traffic on Interstate 5 in the Portland metropolitan area for the year 2027, projecting future traffic conditions in the absence of congestion pricing. The study uses an I-5 cordon line north of the project area, corresponding to N. Skidmore Street, which is just two blocks from the cordon line used for the Rose Quarter projections. The following table compares the projected 2027 volumes in the congestion pricing study at this cordon line with the VISUM Rose Quarter 2015 volumes. It shows that the 2015 volumes used in the VISUM model are 21 to 37 percent higher than the volumes expected in 2027, according to the congestion pricing baseline model.

I-5 North Volumes from Two ODOT Models

Model (source)                     Time Period        Northbound  Southbound   Total  Difference
RQ VISUM Model (2015)              AM Peak 8AM-9AM         4,370       4,631   9,001         37%
RQ VISUM Model (2015)              PM Peak 5PM-6PM         4,424       4,855   9,279         21%
Congestion Pricing Study (2027)    AM Peak 8AM-9AM         3,255       3,337   6,592
Congestion Pricing Study (2027)    PM Peak 5PM-6PM         3,803       3,860   7,663

Sources: RQ VISUM Model, “Mainline North of Going, 2015 No Build”; Congestion Pricing Study, “Interstate Br.-Skidmore” Baseline Traffic Performance.
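As a quick arithmetic check (this snippet is ours, not part of ODOT’s report), the “Difference” column can be recomputed directly from the volumes in the table:

```python
# Recompute the percentage gaps between the two ODOT models from the
# directional volumes in the table: RQ VISUM (2015) vs. the congestion
# pricing study's 2027 baseline, at the same I-5 cordon line.
rq_visum_2015 = {"AM Peak": 4_370 + 4_631, "PM Peak": 4_424 + 4_855}
pricing_2027 = {"AM Peak": 3_255 + 3_337, "PM Peak": 3_803 + 3_860}

for period in ("AM Peak", "PM Peak"):
    pct = (rq_visum_2015[period] / pricing_2027[period] - 1) * 100
    print(f"{period}: 2015 VISUM volumes are {pct:.0f}% above the 2027 baseline")
# AM Peak: 37% above; PM Peak: 21% above, matching the 21-to-37 percent
# range cited in the text.
```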

This analysis suggests that the Rose Quarter traffic numbers, particularly north of the project area, are much higher than would be expected under another, arguably reasonable, forecast of traffic conditions. Given the expectation of growing traffic levels in ODOT’s Rose Quarter modeling, one would expect 2027 I-5 traffic levels to be considerably higher, not lower, than 2015 levels. The fact that two models, prepared for the same agency in the same month, produce such different pictures of traffic levels suggests that the model results are highly sensitive to the assumptions and input values used by the modelers. These key values and assumptions have generally not been provided to the public for review, making it impossible for independent third parties to understand, replicate, and analyze the summary results presented in the Environmental Assessment.

Climate efforts must be cost effective

Portland’s $60 million a year clean energy fund needs climate accountability

Any grant writer can spin a yarn that creates the illusion that a given project will have some sort of climate benefits, but if you’re actually investing real money, you should insist on a payback in the coin of the climate realm:  a measurable and significant reduction in greenhouse gas emissions.  That standard isn’t being met, mostly because even the most basic questions aren’t being asked of projects submitted for funding by Portland’s Clean Energy Fund.

Former Portland City Commissioner Steve Novick and Angus Duncan, long-time chair of the Oregon Global Warming Commission, are demanding that Portland’s $60 million per year clean energy fund have at least minimal performance standards when it comes to reducing greenhouse gases.  A bit of background:  using the city’s initiative process, activists forwarded a ballot measure to create a gross receipts tax on larger businesses, with the proceeds dedicated to clean energy investments.  The voters approved the measure by a wide margin, and the Portland Clean Energy Fund (PCEF) is now making grants to community groups for projects to promote greater energy efficiency and, at least in theory, reduce greenhouse gases.  But in practice there are real concerns that the dispensation of grants is not advancing either climate or equity considerations.  As Novick and Duncan wrote in their op-ed:

Already, the fund has issued grants for projects that have nothing to do with climate or equity, including $100,000 for a rooftop garden on a yoga studio. For the program’s long-term success, we should ensure that the fund supports proposals tied to specific outcomes and cost-effectiveness goals. Unfortunately, the fund’s proposed scoring criteria do not provide any such benchmarks. Indeed, they are not really focused on outcomes at all.

The underlying problem is that PCEF doesn’t have clear standards for defining what constitutes either equity, or cost-effective greenhouse gas reductions.  According to Novick and Duncan, the application process is a kind of affinity-demonstrating beauty contest, with applicants chosen chiefly for the meritorious origins of the sponsoring organization rather than the characteristics of the project.

For example, applicants for large-scale grants of up to $10 million can earn no more than 8 points out of 100 for reducing greenhouse gas emissions. Think about that: “Clean Energy Fund” applicants could, apparently, get 92 points out of 100 without reducing greenhouse gas emissions at all. Meanwhile, applicants can only lose 5 points if their proposal will not “realistically result in intended outcomes.” So, you could get 95 out of 100 points even if your project has no realistic chance of succeeding. Applicants get a large number of points for “characteristics of the organization” categories, which may be important, but not more than actual results. The selection committee could simply require that organizational applicants meet certain criteria for diversity, etc., rather than allowing applicants to make up for a lack of substance by accumulating “organizational” points.

The idea of setting up a public sector fund to advance the twin goals of promoting equity and reducing greenhouse gases has considerable merit, but if it’s going to make any difference in either area, it needs to insist on measurable results and clear accountability.  The current PCEF regulations do very little to achieve this fundamental need.  Portland’s voters, historically disadvantaged communities, and the environment all deserve better.

A net zero blind spot

Stanford claims its campus will be 100 percent solar powered . . . provided you ignore cars.

A flashy news release caught our eye this week.  Stanford University is reporting that its campus will be 100 percent powered by solar energy very soon.

In the echo chamber that is social media, that claim got a lot of attention and repetition, and predictably morphed into an even more sweeping accomplishment.  Climate Solutions tweeted that Stanford would be the first major university to achieve 100 percent clean energy.

To be clear, Stanford’s press release didn’t make that claim, but any time you tout “hashtag 100 percent” anything, people tend to focus on the “100” and not on the universe to which it applies.  When you read the fine print, it’s clear that the “100 percent” claim applies only to the campus buildings, not to how students, faculty, staff, and visitors actually get to and from the campus for education, research, and entertainment.

Buildings get a lot of attention; you have to use energy to heat, light, and cool them, and to run computers and other equipment.  But as with the rest of America, the far bigger source of energy use and carbon emissions is not the buildings themselves, but the travel to and from them.  In California, cars account for five times as much greenhouse gas production (28 percent) as all commercial buildings (5.5 percent).

In the case of Stanford University, the “100 percent solar” claim definitely doesn’t include the campus’s nearly 20,000 parking spaces, most of which are used by internal-combustion cars.  About 58 percent of those working on campus arrive by private car.  And that produces vastly more carbon emissions than building operations alone.

To its credit, in recent years, the university has been taking steps to build more on-campus housing, and to meet the travel demand from expansion without increasing the total number of car trips, but it’s still the case that cars and travel are the university’s leading source of greenhouse gas emissions.  So far the university’s own sustainability plan has counted mostly building-related emissions, and avoided counting what it classifies as “Scope 3” emissions associated with university travel.  They’re planning to address those emissions in the future.

While it’s technically true that the building energy may come from solar, it’s important to recognize that the buildings have no utility unless people travel to and from them on a regular basis.  The parking lots, and the cars they serve, are an integral–and in fact dominant–part of the university’s carbon footprint.  It’s all well and good to celebrate greater use of solar power, but before anyone makes “100 percent” or net zero carbon claims about any particular institution, they would do well to consider all of the institution’s emissions, not just one component.

 

Insurance and the Cost of Living: Homeowners Insurance

Yesterday, we explored the differences in car insurance premiums in the nation’s largest metropolitan areas. Today, we will take a look at homeowners insurance rates. Unlike car insurance, homeowners insurance is not required by state law. Still, it can be required by a mortgage lender, and it is an important safeguard for one’s home. Premiums vary across the US, with location being one of the strongest determinants of price. Differences in weather, proximity to disaster-prone areas, crime rates, and density all push home insurance rates up or down across metropolitan areas. We will explore these variations in insurance premiums and consider how they might affect the overall cost of living. 

Home insurance rates are important to examine right now because of the growing impact of climate change on our built environments. Differences in insurance costs among cities may become even larger in the years ahead.  The growth in wildfires and extreme weather events associated with climate change has already produced record insurance payouts. Insurance models are generally based on historical data: insurers calculate the expected payout and set prices accordingly. What happens when climate change shifts trends in unpredictable ways? Just last year, disaster payouts from reinsurers, the firms that insure the insurance companies, were the fifth highest in history. Climate change prompts insurance companies, and the re-insurers who share these risks, to reevaluate their rate-setting models. When the risk of a payout increases, so does the cost of insurance. Mother Nature plays a crucial role in setting insurance rates, and her volatility can significantly heighten your premiums.

Which cities have the least (and most) expensive home insurance rates?

The same underlying variables drive home insurance rates across the country: the greater the likelihood of damage to the home, the more expensive the rate will be. We used Insure.com data to compile the average annual home insurance rates across 52 of the largest metropolitan areas in the country. Unfortunately, Insure.com did not have any estimates for Riverside-San Bernardino-Ontario, CA. The site did not publish a detailed methodology for its estimates, but the method was applied uniformly, so we can examine the relative differences across these metro areas. Among the large metro areas, the median rate was $2,118.  (Cities have somewhat higher rates than rural areas, which is why the large city median is north of the national average of $1,631.) 

The cities with the highest home insurance rates were Miami, New Orleans, Oklahoma City, and Detroit. The lowest home insurance rates were found in San Jose, San Diego, San Francisco, Sacramento, and Los Angeles. Among large metro areas, California appears to be the cheapest state to insure your home. In fact, with Seattle, Portland, and Las Vegas all below the 25th percentile as well, the West Coast proves to be a relatively inexpensive place to insure your home. The South? Not so much. Eight Southern cities are above the 75th percentile. 

The risk of extreme weather in these regions appears to be a significant factor in these differences. Miami and New Orleans face hurricane seasons every year. In 2020, there were a record-breaking 30 named storms during the Atlantic hurricane season, with 11 making landfall in the United States. The risk of home damage from events like these increases insurance rates. As climate change continues to disrupt weather patterns, we can expect home insurance rates to rise. Other regions face their own unique risks: Oklahoma City is in a high risk zone for tornadoes, fires, and earthquakes. These natural disaster risks push up premiums. While there are wildfire and earthquake risks in the West, they haven’t produced dramatically higher insurance rates—yet. We will likely see that change over time as the impacts of climate change mount. 

Insurance and the Cost of Living

Just like we did yesterday for auto insurance, we compared the BEA’s RPP for Rent with the insurance data below. What we found: a slight negative correlation for homeowners insurance. However, when looking at the graph, you can see the California metros standing out as outliers, dragging down the correlation. When the RPP for Rent ranges from 80 to 150, there is a clear positive association between insurance premiums and price parity. This suggests that, outside California, insurance costs tend to rise along with housing costs.
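This kind of outlier check is easy to reproduce. Here is a minimal sketch in Python using made-up illustrative numbers (not the actual Insure.com or BEA figures): it computes the Pearson correlation of rent price parity against homeowners premiums, with and without two hypothetical high-rent, low-premium "California-like" metros.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: five "typical" metros where premiums rise with rents,
# plus two California-like outliers with very high rents but low premiums.
rpp_rent = [85, 95, 105, 115, 130, 190, 200]
premiums = [1500, 1800, 2100, 2600, 3200, 900, 1000]

with_outliers = pearson(rpp_rent, premiums)             # negative overall
without_outliers = pearson(rpp_rent[:5], premiums[:5])  # strongly positive
```

With the outliers included, the overall correlation flips negative even though the remaining metros show a clear positive association, which is the same pattern the graph shows.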

Let’s consider two cities: Oklahoma City and Seattle. According to Insure.com, the average annual homeowners insurance rate in Oklahoma City is $6,045, while Seattle’s average rate is $1,214. Clearly, there is a notable difference between these two premiums. When we compared the sprawl tax to cost of living, we used estimates of rent differential as a benchmark for cost of living. The cost of housing varies the most across metropolitan areas, so this was an effective measure for comparison. The typical resident in Oklahoma City paid roughly $2,525 less in annual rent/housing costs compared to the typical large metropolitan area. In Seattle, the typical resident paid approximately $2,055 more. From our Insure.com data, the annual housing insurance premium in Oklahoma City is $3,927 above the median premium across large metro areas. Seattle, on the other hand, has an annual premium that is $904 less than the median rate. These data suggest Oklahoma City’s high homeowner’s insurance rates effectively cancel out the lower rents in the metro area. The cost of housing is heightened significantly by homeowners insurance, leaving Seattle and Oklahoma City on roughly equal footing: Seattle’s lower cost of insurance made up for its high rents. A cost of living comparison that omits these significant variations in insurance costs probably isn’t reliable.
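The back-of-the-envelope offset calculation above can be sketched in a few lines of Python, using the figures quoted in this post (rent differentials versus the typical large metro, and premiums versus the $2,118 large-metro median):

```python
MEDIAN_PREMIUM = 2118  # median annual homeowners premium, large metros

def net_cost_differential(rent_diff, premium, median_premium=MEDIAN_PREMIUM):
    """Annual rent differential plus the insurance differential vs. the median."""
    return rent_diff + (premium - median_premium)

# Oklahoma City: cheap rent (-$2,525 vs. the typical metro) but a $6,045 premium.
okc = net_cost_differential(-2525, 6045)      # +1402
# Seattle: expensive rent (+$2,055) but only a $1,214 premium.
seattle = net_cost_differential(2055, 1214)   # +1151
```

Once insurance is counted, the two metros end up within a few hundred dollars of each other, which is the sense in which Oklahoma City's high premiums cancel out its cheap rents.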

What’s the relationship between home values and insurance rates?  In general, we would expect that insurance premiums would be higher in cities with expensive housing, as it would be more expensive to repair or replace a home in an expensive metro than a lower priced one.  Overall, there is a positive but modest relationship between home prices and insurance rates. Generally, the more valuable a home is, the more expensive it will be to insure it. 

High homeowners premiums are likely to raise the cost of living. However, low homeowners premiums in Western metro areas like Seattle, San Jose, and Sacramento at least partially offset higher annual housing costs. In contrast, some seemingly affordable Southern metro areas like New Orleans, Oklahoma City, and Memphis could have some or all of their affordability advantage eroded by notably high insurance premiums. These are the areas where insurance makes its biggest impact on the overall cost of living. 

Although homeowners insurance isn’t mandated by law like auto insurance, it is widespread and many homeowners regard it as essential, making it an important element to include in cost-of-living comparisons. Auto insurance likely plays a smaller role in the overall cost of living: owning and insuring a car is not a requirement in all places, so the lack of need for auto insurance (and other car expenses) can help lower the cost of living. It is difficult to fully quantify the cost of living across different areas, because these comparisons struggle to capture the amenities that cities provide for their residents. Still, insurance can play a mighty role in the cost of living. Auto and home insurance can take thousands of dollars out of consumers’ pockets each year, and the impacts of climate change will push those premiums higher. As extreme weather risks intensify, so will the demand for insurance. Insurance may not loom large in cost-of-living comparisons now, but as the world becomes more volatile and unpredictable, we could see the price of protecting your home and belongings rise to new heights.


Note: This post has been updated to provide a link to insure.com‘s website.

Insurance and the Cost of Living: Auto Insurance

Everyone loves to compare the affordability of different cities, and most of the attention gets focused on differences in housing prices and rents. Clearly, these are a major component of living costs, and they vary substantially across the nation.  But as we’ve regularly pointed out at City Observatory, transportation costs also vary widely across cities, and some places that have somewhat more costly housing also have more compact development patterns and less sprawl, and therefore enable their residents to spend less on cars and gasoline for transportation. These differences can have a significant impact on the overall cost of living across cities. We’ve computed the difference in city living costs in our Sprawl Tax report.

There’s another important component to differing living costs across the nation that we think deserves additional attention: insurance costs. Nearly all drivers and the majority of homeowners carry insurance on their cars and homes. Insurance premiums vary widely across the US, based on differences in crash rates, losses to natural disasters, and state-to-state variations in legal standards (as well as other factors). Today we’re going to start looking more closely at these variations—we’ll start with differences in car insurance costs, and tomorrow, look at home insurance.

Car insurance is a requirement for many people living in auto-dependent places. Sprawling metropolitan areas often require more miles of driving for commuting. We are curious about the role that insurance plays in the cost of living. When a car is required to commute to work or the store, insurance is a necessity to protect yourself and the car. The price of insurance depends on one’s personal characteristics as well as one’s surroundings. Location plays a major role, whether through state policy, regional differences in natural hazards, or the composition of the insurance pool around you. Insurance can become a vital and costly expense for Americans, and the magnitude of that expense can significantly influence the way we view the overall cost of living across major cities. 

Which cities have the least (and most) expensive auto insurance rates?

Using data from TheZebra.com, we compiled the average annual auto insurance premiums across the 53 largest metropolitan areas in the United States. The Zebra estimated the typical premium paid by a 30 year old single male driving a Honda Accord. In the typical large US metropolitan area, the average automobile insurance premium is about $1,800 per year.  The rankings of large metros are shown below. Premiums in these large metros are higher than the overall national average auto insurance premium; rural areas generally have lower rates. Insurance companies base their rates on the perceived risk of a claim: the greater the likelihood that a claim will be filed, the higher the rate. Cities increase that risk, as insurance companies report that claims are more frequent in urban areas, so rates rise accordingly.

The cities with the highest car insurance rates were Detroit, New Orleans, Miami, and New York. Detroit, by far the highest, had an average insurance rate of $6,280. The Motor City has annual prices approximately four times higher than the median rate for large metropolitan areas. The cities with the lowest car insurance premiums were Cincinnati, Charlotte, Virginia Beach, and Raleigh. These annual rates were all between $900 and $1,200, significantly lower than the highest tier. Generally inexpensive cities (25th percentile) and generally expensive cities (75th percentile) differ by about $670.

Other living cost estimates, like the sprawl tax, vary across metros differently than auto insurance rates do. The sprawl tax in particular has less extreme values, yet a wider range of variation across the middle of the distribution. The median sprawl tax, $1,302, is less than the median insurance rate by roughly $500. At the same time, the gap in the sprawl tax between generally inexpensive cities and generally expensive cities is nearly $1,000, a greater difference than for insurance rates. Insurance premiums in the most expensive cities far exceed the sprawl tax. These clear outliers make the distribution of auto insurance premiums stand out compared to other living cost estimates. 

Another interesting difference can be seen by examining which cities stand out on the sprawl tax versus car insurance. For instance, New Orleans and New York are two of the most expensive cities to insure your car, yet they have the lowest sprawl tax. Moreover, five of the ten most expensive cities to insure your car are below the 25th percentile for the sprawl tax. Let’s take a closer look at the relationship between miles traveled and insurance rates. Using data from the Center for Neighborhood Technology, we compiled the annual vehicle miles traveled per household across metropolitan areas and compared it to the auto premiums. We assumed that we would see a positive relationship: the more miles you drive, the greater your risk of filing a claim. Instead, we found the opposite – auto insurance rates decreased as annual VMT per household increased.

Insurance rates were lower in the places where people drove more, and higher in places where people had lower sprawl taxes. Why? Car dependence. In sprawling metropolitan areas, people need to drive themselves to get where they need to go. People are reliant on their cars for nearly all commuting travel, which in turn makes insurance a necessity. Insurance pools grow as more policyholders join, distributing the risk of payout across more people, which in turn lowers rates. Sprawl, then, may actually depress insurance rates through its effect on the demand for insurance. 

We believe that the structure of insurance policy is a strong contributor to the variation in rates across metro areas. State laws regulate the market for automotive insurance policies, which can be no-fault, tort liability, or a choice between the two. In a no-fault state, in the event of an accident, each driver uses their own insurance to cover their own damages, regardless of who is at fault. As a result, no-fault states require personal injury protection (PIP) to cover medical costs, economic benefits, and death benefits. This additional protection on top of ordinary liability coverage can raise prices. Michigan is a no-fault state that also requires property protection insurance (PPI), and Detroit feels the consequences of that. In fact, 6 of the top 10 cities for annual auto insurance rates (Detroit, Miami, New York, Tampa, Grand Rapids, and Philadelphia) are in no-fault states. PIP requirements are likely a key driver of the higher prices. Starting in July 2020, the state of Michigan changed its auto insurance law to allow the insured to choose their level of PIP coverage. This may lead to reductions in overall costs in the nation’s most expensive city for car insurance.

A city’s demographics are also important to consider when thinking about insurance rates. We have explored discrimination in automobile insurance before at City Observatory. We found that there were higher auto insurance rates in Black neighborhoods, even those with safe drivers. Black drivers were found to be less likely to speed, yet they pay substantially more because they live in predominantly Black neighborhoods. A quick look at our data shows that several of the cities with the largest Black populations—Detroit, New Orleans, Philadelphia, and Baltimore—face some of the highest premiums of all the 53 largest metros. Race and population demographics could help explain some of this variation in average insurance premiums across the country.

Auto Insurance and the Cost of Living

As we explained in our Sprawl Tax series, the Bureau of Economic Analysis Regional Price Parities (RPP) are the most comprehensive measure we have of inter-metropolitan differences in consumer prices. Variations in rents are the largest component of overall cost-of-living variations between cities. But how does the variation in auto insurance premiums relate to the variation in housing costs? Does the cost of insuring your car significantly change the cost of living? We compared the BEA’s RPP for Rent with the insurance data below. BEA’s RPP for rent does not include the cost of insurance; insurance is included within the RPP for services, but it carries little weight in that calculation (only about 6 percent).  

Comparing insurance rates to BEA RPPs for rent, we find a slightly positive relationship between auto insurance and the cost of living, suggesting that auto insurance rates tend to rise along with housing costs. However, when thinking about insurance and the cost of living, it is imperative to consider how much insurance people actually need to buy. Whether or not someone insures a car depends on the makeup of the city. 

For example, in New York City, auto insurance is quite expensive, but owning a car is not a necessity for living within the city. The accessibility of consistent public transportation and the walkability of the city remove the need for a car, and thus for auto insurance. An interesting trend in our data is that as annual vehicle miles traveled per household increases, the insurance rate decreases. So, while insurance rates might be high in cities like New York, a car is not a vital part of living there, and New York’s relatively high car insurance rates aren’t a cost burden for the large number of households that don’t own cars. This echoes our takeaways from comparing insurance to the sprawl tax: high insurance rates may partly reflect a lower demand to drive, and thus a lower demand to insure an automobile. Car dependence makes auto insurance a necessary additional cost of living. By contrast, limited sprawl and accessible alternative modes of transportation create an environment where insurance is not required; even if rates are high in such places, that does not imply a higher overall cost of living. 

The real impact on the cost of living arises where automobile insurance is a required part of daily life. The raw rankings alone do not paint the full picture of the cost of living; we must consider auto dependence and the sprawl tax across these metropolitan areas to fold the cost of insurance into transportation costs. It seems simple: people who don’t drive don’t insure cars. For those who must drive, auto insurance can pose an expensive annual burden. For those who don’t, the omission of auto insurance from their expenses is a valuable amenity. It is important to consider not only how much insurance costs, but where those costs are unavoidable.

All in all, the cost of insuring a car can create a serious dent in your paycheck depending upon where you live. If you move across Pennsylvania from Pittsburgh to Philadelphia, your rates could increase by over $1,000. The variation between these cities is notable. Racism and the structure of insurance policy may be major contributors, but other factors, like density and land use policy, might help explain this variation as well. Car dependence results in car insurance dependence. When you have to drive, insurance can place a burden on your budget. This is where the impact on cost of living appears. People can save a whole lot of money annually when they do not need to insure an automobile.

 

BIB: The bad infrastructure bill

Four lamentations about a bad infrastructure bill

From the standpoint of the climate crisis, the infrastructure bill that passed the Senate is, at a minimum, a tremendous blown opportunity.  Transportation, especially private cars, is the leading source of greenhouse gas emissions in the US.  We have an auto-dependent, climate-destroying transportation system because we’ve massively subsidized driving and sprawl, and penalized or prohibited dense walkable urban development.  The Senate bill just repeats the epic blunder of throwing more money at road building, and even worse this time, drivers are excused from paying the costs of the bill—a triumph for asphalt socialism.

You’ve probably seen “BIB,” Michelin’s famous cartoon mascot for its tires (and tourist guides).  Coincidentally—and perhaps appropriately—BIB is the nickname of the Senate-passed bi-partisan infrastructure bill, and the animated stack of tires is a fitting mascot for the legislation.  It’s all about driving, and jovially oblivious to the deep systemic problems in our approach to transportation finance.

Michelin’s BIB is the true mascot of the Bi-Partisan Infrastructure Bill; Two thumbs up for more subsidies to auto dependence, climate destruction and asphalt socialism.

Others, including Transportation for America, NACTO, Streetsblog, the Eno Foundation and Politico have written about the details behind the bill.  You should definitely read their analyses.  But allow us to pull out just four themes—lamentations, if you will—that explain why this triumph of bipartisanship is a disaster for safety, climate, and fiscal responsibility.

1. $200 billion for more deadly, climate killing highways.  The centerpiece of the bill is more money for roads and bridges, which in practice means wider roadways and more capacity.  The Senate bill writes a $200 billion check for state highway builders, which is likely to fund wider roads and bridges that will generate more traffic, more pollution, more climate-destroying greenhouse gases, and more sprawl.  The National Association of City Transportation Officials observes:

. . . the infrastructure bill passed today by the Senate keeps our nation on an unsafe and unsustainable path. It continues to prioritize building the infrastructure that most contributes to the U.S.’s worst-in-class safety record and extraordinarily high climate emissions: new highways. With transportation as the largest source of U.S. climate emissions, and 80% of those coming from driving, the Senate’s bill goes in the wrong direction, giving a whopping $200 billion in virtually unrestricted funding to this unsustainable mode.

Ironically, the Senate passed the bill the same week the Intergovernmental Panel on Climate Change released its latest and most dire warning about the damage done by these greenhouse gas emissions.  The Senate bill not only does nothing to address that damage, it promises to squander vastly more resources making the problem worse.

2. No accountability for climate, safety or good repair.  It’s a considerable exaggeration to say we have a federal “transportation policy.”  In effect, while it mouths a national interest in a safe, well-maintained and sustainable transportation system, the Senate bill just continues a system in which the federal government writes huge checks to state highway builders and . . . looks the other way.  There are no requirements with any teeth that hold states accountable for the most basic of outcomes, and the Senate bill continues this system, as Beth Osborne of Transportation for America explains.

. . . the Senate also supercharged the highway program with a historic amount while failing to provide any new accountability for making progress on repair, safety, equity, climate, or jobs access outcomes.

3. Inequitable asphalt socialism, and the cynical joke of “user pays”.  The great myth of road finance in the US is that we have a “user pays” system.  Truth is that’s never been the case, as road users have forever been subsidized, and shifted the social, environmental and health costs of the road system off onto others.  The Senate had no stomach for asking road users to pay a higher gas tax to support roads, so instead, they’ve turned the  highway “trust fund” into the trust-fund baby of the BIB, getting a huge infusion of general fund money, which will simply be added onto the national debt, with the costs ultimately being repaid, with interest, from income taxes.

An analysis by the Eno Foundation shows that the Senate bill transfers $118 billion to the highway trust fund from the general fund, bringing the total amount of the bailout of highway funds to more than $272 billion since 2008.

We know that despite Congress dumping more than $200 billion of general fund revenue into the highway portion of the fund, car apologists will continue to make the false argument that we have a “user pays” system, and that spending money on any non-auto form of transportation is somehow stealing from drivers.  This bill completely disconnects use of the transportation system from the obligation to pay for its costs:  what we call asphalt socialism.

4. We could have done better.  House Transportation and Infrastructure Chair Peter DeFazio managed to engineer passage of a decent stab at transportation reform in the House, only to see it “gutted and stuffed” by the Senate.

The bill that passed the House earlier this year contained provisions that at least gently nudged expenditure priorities in favor of “fix-it first” policies that would put repairs ahead of expansion, would require at least an effort to consider how highway construction leads to greenhouse gas emissions, and would have held states accountable for actually improving safety (rather than just talking about it).  All that language was purged from the Senate BIB.

Politico‘s Tanya Snyder offers a succinct insider explanation of the politics of the bill from an anonymous former congressional staffer:

“It’s not that it was bipartisan; it was a least common denominator approach,” said the former aide, who worked on the transportation bills that Congress enacted in 2012 and 2015, both of which were hailed as triumphs of bipartisanship. Larger ambitions, he said, “fail to get addressed because of the insistence on staying in the very, very narrow area of bipartisan agreement and we recreate the status quo.”

So what the BIB really does is recycle and perpetuate a badly broken transportation finance system that promises to make our climate, community, health and transportation problems worse, not better.

 

To solve climate, we need electric cars—and a lot less driving

Electric vehicles will help, but we need to do much more to reduce driving

Editor’s Note:  City Observatory is pleased to offer this guest commentary by Matthew Lewis.  Matthew is Director of Communications for California YIMBY, a pro-housing organization working to make infill housing legal and affordable in all California cities. For 20 years, he has worked on climate and energy policy and technology development at the local, regional, national, and international levels. 

The deadly heat waves, epic floods, and worsening droughts around the world are forcing a reckoning on climate change: We messed up, badly, and the earth’s rapidly destabilizing climate system is the consequence. 

There are many “mistakes” that have brought us to this moment of truth — powering our economies on coal and methane gas, cutting down rainforest to grow beef, soy, and palm oil — but one of the biggest and most intractable is car culture, NIMBYism, and their attendant challenges in energy use, land use, social equity, and human health and safety. 

Cars are not only the leading source of climate pollution in the United States, they’re also the leading cause of premature death for young Americans and a disproportionate cause of negative health outcomes: air pollution and sedentary lifestyles are major contributors to American morbidity and mortality, not to mention the 2 million Americans who are permanently injured every year by car crashes.

But there’s a problem: We’ve spent 50 years tearing down our cities and remaking them for sprawling, single-family house development, entirely reliant on the automobile. The climate crisis is urgent, and we don’t have time to completely undo this mistake and re-build our cities from scratch.

So, the question becomes: How much time do we have? Can we simply swap out electric cars for gasoline vehicles, and solve the climate crisis? The answer: Unequivocally, no. There are simply too many cars, too many of them run on gasoline, and most new cars sold today still run on gasoline. 

To make matters worse, our current urban land use policies are still controlled, in most cities, by NIMBYs opposed to more housing; who defend parking as a divine right; who oppose safe street interventions that save lives and make communities more walkable; and who block efficient transit interventions, like dedicated bus lanes. 

When it comes to climate change, NIMBYism is a huge factor in exacerbating pollution from cars, and it has also brought us to the point where we have no choice in the matter: by forcing workers to live far from their jobs, and by allowing the car industry to continue selling gasoline vehicles, we have foreclosed the option of achieving a climate-friendly car culture.

As of this writing, there are 280 million cars and trucks in the American fleet, and 278 million of them run on gasoline. In an average year, Americans buy around 17 million new cars. So, in a scenario where 100% of new car sales were electric in 2021, it would be 2037 before all U.S. cars and trucks are electric (assuming no growth in the size of the fleet). 
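The fleet-turnover arithmetic here is straightforward. A quick sketch, using the post's round numbers and its assumption of no fleet growth:

```python
FLEET_SIZE = 280_000_000     # vehicles in the US fleet
ANNUAL_SALES = 17_000_000    # new vehicles sold in an average year
START_YEAR = 2021

# Even if every new car sold from 2021 onward were electric, replacing the
# whole fleet takes FLEET_SIZE / ANNUAL_SALES years of sales.
years_to_replace = FLEET_SIZE / ANNUAL_SALES    # about 16.5 years
all_electric_year = START_YEAR + int(years_to_replace)   # 2037
```

That is where the 2037 date comes from: roughly sixteen and a half years of sales at the current pace, starting from 2021.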

But we’re nowhere near 100 percent EV sales in 2021. Current estimates suggest the earliest the last gasoline car will be sold is sometime in the 2040s. In fact, the car industry is still focused on selling primarily gasoline cars, and primarily gas-guzzlers like pickup trucks and SUVs. And the industry has made clear that it intends to continue selling these climate-destroying beasts for most of the rest of this decade.  

What that means: It will be a long, long time before the U.S. vehicle fleet will be all electric. Exactly how long depends on a number of variables, but the various scenarios are not hard to imagine — and in fact, experts have run the numbers and are converging on broad agreement:

If electric cars are going to be a part of the climate solution, Americans will have to drive much less. How much less is somewhat of an open question; but the California experience is illustrative. In 2018, the California Air Resources Board did the math on fleet turnover. What they found:

California cannot meet its climate goals without curbing growth in single-occupancy vehicle activity. 

Even if the share of new car sales that are ZEVs [zero-emissions vehicles] grows nearly 10-fold from today, California would still need to reduce VMT [vehicle miles travelled] per capita 25 percent to achieve the necessary reductions for 2030.

Furthermore, strategies to curb VMT growth help address other problems that focusing exclusively on future vehicle and fuels technologies do not. For example, spending less time behind the steering wheel and more time walking or cycling home, with the family, or out with friends can improve public health by reducing chronic disease burdens and preventing early death through transport-related physical activity. 

California’s existing plans for electric vehicles aimed to have 1.5 million ZEVs on the road by 2025, and 4.2 million by 2030. As of 2021, there are 425,000 BEVs registered in the state, meaning the electrified fleet has to grow tenfold in the next nine years, with sales that exceed the best year ever, every year.
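The implied growth rate is worth spelling out. Here is a compound-growth sketch using the figures above (425,000 electric vehicles in 2021, a 4.2 million target nine years later); actual year-to-year sales would of course vary:

```python
# Implied annual growth rate for California's electric fleet to hit its
# 2030 target, using the figures above. This is a compound-growth sketch,
# not a forecast of any particular sales path.
current = 425_000      # BEVs registered in California, 2021
target = 4_200_000     # ZEV target for 2030
years = 9

growth_multiple = target / current                    # ~9.9x overall
annual_growth = growth_multiple ** (1 / years) - 1    # compound annual rate

print(f"Fleet must grow {growth_multiple:.1f}x overall")
print(f"That is roughly {annual_growth:.0%} growth every year for nine years")
```

Sustaining roughly 29 percent compound growth for nine straight years is what "exceed the best year ever, every year" means in practice.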

But even at that breakneck pace, it won’t be enough to undo the damage done by the one-two punch of the car industry’s focus on selling gas guzzlers, and California cities’ commitment to defending NIMBY housing and land use policies. 

In sum: With all the gasoline vehicles still driving around for the next 15 to 20 years, EVs won’t be able to close the gap in pollution reduction fast enough. We’re out of time.

Others have done their own calculations, with similar results; one of the more recent among these, by researchers with Carnegie Mellon University, did not mince words about the pickle we’re in:

Transportation deep decarbonization not only depends on electricity decarbonization, but also has a total travel budget, representing a maximum total vehicle travel threshold that still enables meeting a midcentury climate target. This makes encouraging ride sharing, reducing total vehicle travel, and increasing fuel economy in both human-driven and future automated vehicles increasingly important to deep decarbonization.

In other words: In order for electric vehicles to solve climate, we need people to drive less. That combination — of less driving, and vehicle electrification — is the only pathway to climate stability. 

Ironically, this problem of the need for fewer cars is entirely of the car industry’s creation. For decades, the industry has fought efforts to make cars cleaner, to reduce air and climate pollution, and to transition their product away from deadly, inefficient, polluting machines to clean, efficient mobility devices. 

If we had started the transition to clean, electric vehicles 20 years ago, it would be slightly more accurate to say we can continue our current, sprawling housing and transportation patterns and rely on fleet electrification. But the car industry has spent those 20 years pivoting to SUVs and pickup trucks as their primary product. 

What this means: Urban land use reform, affordable housing, reliable transit, and safe streets are now top-priority, must-have interventions for climate change. We have to end the era of car-powered suburban sprawl, and make it legal — and less expensive — to build housing in our urban cores, or in the central business districts of suburban areas, near jobs, transit, and services. 

Denser areas not only have dramatically lower carbon emissions from transportation than suburban and rural developments; but they also use substantially less energy per home — as much as 50% less. 

The writing is on the wall. The car industry missed its climate window for electric sprawl, and the NIMBYs forced the issue with their selfish opposition to infill housing. At this stage, if electric vehicles are to play a major role in solving the climate crisis — which they must — they have to be paired with dramatic land use reform that shortens or eliminates a substantial portion of all vehicle trips, and replaces them with transit, walking, biking, shared vehicles, and other forms of mobility. 

Only by combining a rapid deployment of electric vehicles with an equally rapid elimination of the need for most Americans to own and drive a personal vehicle in the first place can we have a shot at climate stability. 

Further reading

Burn, baby, burn: ODOT’s climate strategy

The Oregon Department of Transportation is in complete denial about climate change

Oregon DOT has drafted a so-called “Climate Action Plan” that is merely perfunctory and performative busywork.

The devastation of climate change is now blindingly manifest.  Last month, temperatures in Oregon’s capital, Salem, hit 117 degrees.  The state is locked in drought, and already facing more wildfires, on top of last year’s devastating fire season.  Cities across the nation are swathed in the smoke from Western wildfires—largest among them Oregon’s Bootleg fire—and are daily witnessing orange-tinged sunsets.

Earth Observatory, July 2021.

All this is plainly due to climate change, and in Oregon, transportation is the leading source of greenhouse gases.

If there were ever a time to change direction on climate, this would be it.  But plainly, the Oregon Department of Transportation is locked on cruise control, moving ahead with plans inspired by the dreams of the 1950s rather than the increasingly grim reality of the 2050s.

ODOT’s climate denialism

While the Oregon Department of Transportation mouths the words of climate awareness, the substance of its plans makes it clear that it is engaged in cynical climate denialism.  Its State Transportation Strategy (STS), which claims to address state greenhouse gas reduction goals, never acknowledges that Oregon is failing to reduce greenhouse gases as mandated by law because of increased driving.

In Oregon, transportation, and specifically driving, is the largest single source of greenhouse gas emissions, contributing 40 percent of statewide greenhouse gases.  And while we’ve made some progress cleaning up electricity generation and reducing industrial emissions, transportation greenhouse gases are increasing, up 1,000 pounds per person annually in the Portland metropolitan area over just the last five years.

To date, Oregon’s and ODOT’s plans to reduce transportation greenhouse gases have amounted to less than nothing.  According to DEQ statistics, driving-related greenhouse gases were 25 percent above 1990 levels in 2018.  Given the state’s objective of reducing greenhouse gas emissions 10 percent below 1990 levels by 2020 (and 80 percent below 1990 levels by 2050), we’re headed in exactly the wrong direction.
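To put the size of that gap in numbers, here is a quick sketch indexing 1990 emissions to 1.0 and plugging in the DEQ figures and statutory targets cited above:

```python
# How far off-track Oregon's driving emissions were relative to its goals,
# indexing 1990 emission levels to 1.0 and using the figures above.
actual_2018 = 1.25      # 25 percent above 1990 levels in 2018
target_2020 = 0.90      # goal: 10 percent below 1990 levels by 2020
target_2050 = 0.20      # goal: 80 percent below 1990 levels by 2050

gap_vs_2020 = actual_2018 / target_2020 - 1       # ~39% above the 2020 target
cut_needed_2050 = 1 - target_2050 / actual_2018   # ~84% cut from 2018 levels

print(f"2018 emissions were {gap_vs_2020:.0%} above the 2020 target")
print(f"Meeting the 2050 goal requires a {cut_needed_2050:.0%} cut from 2018 levels")
```

In other words, driving emissions in 2018 overshot the 2020 target by nearly 40 percent, and the 2050 goal now requires cutting them by more than four-fifths.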

In the face of this epic failure to make progress, and in the wake of the most extreme weather ever recorded, ODOT is proposing to actually make the problem worse by spending billions widening Portland area freeways, and going deeply into debt to do so, as we discussed earlier.

As the Oregon Environmental Council and 20 other local environmental organizations wrote to the Oregon Transportation Commission:

Transportation-related emissions make up nearly 40% of Oregon’s total greenhouse gas emissions and continue to steadily increase. The investments that ODOT has made to address climate, while valuable, are profoundly insufficient given the scale and urgency with which we must reduce emissions to avert the worst of the grim and chaotic future we are already experiencing.

ODOT’s “climate action plan”:  Cynical, meaningless and wrong

In theory, the Oregon Department of Transportation is going to combat climate change through a five-year climate action plan, which is supposed to represent the agency’s response to a call from Governor Brown to take climate more seriously.  It’s a sketchy and vague grab-bag of piecemeal and largely performative actions, packaged along with some deceptive claims about transportation’s contribution to climate change.  And its policies make it clear that the agency is principally concerned with continuing to build more roads, and get ever larger budgets.

A fig leaf for business as usual

Appropriately, the plan’s signature graphic element is a single leaf (a fig leaf, as it were) superimposed over an outline map of Oregon. Never has an infographic been more appropriate: this document is a fig leaf covering what would be, upon any closer inspection, an obscene document.

 

ODOT’s climate plan has no measurable goals and no accountability

ODOT is staffed by engineers, so you might think they’d have some actual quantitative measures of the problem they face, and the impact of the steps they’re proposing.  Instead, this is a remarkable plan which contains no statement of the problem in terms of quantities of greenhouse gases emitted, no goal for their reduction (to 25 percent of 1990 levels by 2050, as mandated by law), and no evidence that individually or collectively the measures described in the plan will have any quantitative effect on greenhouse gas emissions whatsoever.  It’s merely a perfunctory checklist of actions, which once taken, will absolve ODOT of any further responsibility to worry about climate issues.  There’s simply no accountability for any reduction in greenhouse gases.

The lack of accountability is apparent in the framing of the document; there’s no acknowledgement that transportation greenhouse gas emissions have gotten worse since the development of ODOT’s first climate fig-leaf, its “Sustainable Transportation Strategy,” eight years ago.

Reducing GHG’s isn’t a priority:  Getting more money for ODOT is

Climate may be in the title, but the substance of the plan is really about ODOT’s 1950s era highway-building values.  It’s clear from the get-go that this strategy actually isn’t interested in reducing greenhouse gases.  The document states clearly it has three main priorities (pages 7-8):  promoting equity, building the transportation system, and assuring ODOT has enough money.  Reducing greenhouse gases didn’t even make the list.

It’s worth noting that with equity, as with climate, there are no measurable objectives.  It’s just lip service from an agency that has repeatedly pushed city-destroying highways through minority neighborhoods, and which plans to atone for this damage by making the highways even wider.

Pricing is a source of revenue, not a means to manage demand or reduce climate pollution

Potentially, pricing could have a major impact on greenhouse gas emissions.  Fuel consumption and greenhouse gas emissions actually did decrease through 2014, when gas prices were high. But ODOT has no plans to use pricing to achieve climate objectives. The plan mentions pricing, but only to pay for more roads; there’s no commitment to use pricing to reduce greenhouse gases, vehicle miles of travel or traffic congestion.  The only stated policy concern is generating enough revenue to fund ODOT construction projects:

“Pricing:  Price the transportation system appropriately to recover the full costs to maintain and operate the system.”

The plan contains no provisions to reflect the price of environmental damage back to system users: ODOT wants to get paid to build more roads, but resists having itself or highway users pay for the damage their emissions cause.  As we’ve noted, ODOT backed off from pricing based in part on its consultations with avowed climate change deniers it appointed to its “stakeholder” group.

Falsely claims making traffic move faster will reduce emissions

The strategy repeats a false claim that Oregon DOT can reduce emissions by making traffic move faster.  One of the plan’s action items is “System efficiency” which it defines as

“. . . improve the efficiency of the existing system to reduce congestion and vehicle emissions.”

What this really means, in practice, is that ODOT spends more money on elaborate computers, variable advisory speed signs and signboards that show estimated travel times on some highways.  None of these so-called “intelligent transportation systems” have been demonstrated to measurably reduce either congestion or greenhouse gases.  Moreover, to the extent they do lower congestion, they induce additional travel demand, and the scientific literature has shown that increased travel more than wipes out any emission reductions from improved traffic flow.  This is precisely the false set of claims that ODOT conceded it made to the Legislature in 2015 when it lied about estimates of emission reductions from these “system efficiency projects.”

Its “climate lens” is really an eye-patch, turning a blind eye to induced demand

The climate plan says a “climate lens” will be applied to the State Transportation Improvement Program (TIP) projects, but is silent on the application of any such standard to megaprojects like  the I-5 Rose Quarter project, the revived Columbia River Crossing or any of the billions in other proposed Portland area freeway widening projects.  The induced demand calculator developed by the University of California Davis from the best available science shows these kinds of freeway widening projects will produce hundreds of thousands of tons of additional greenhouse gas emissions.

The so-called “climate lens” is simply a glib PR talking point that has no impact or meaning. If ODOT were at all serious about climate, a real climate test would put all additions to state highway capacity on hold.

The climate plan reiterates a discredited claim that greenhouse gases can be reduced by widening roads and making traffic move faster, something that’s been shown scientifically to induce more driving and pollution.

Technocratic climate denialism

Future generations, enduring the brunt of increasingly intolerable summers and extreme weather, seeing Oregon’s forests and natural beauty decimated by climate change will look back to the decisions ODOT is making now and ask how it could simply ignore this problem, ignore the demonstrated science about its causes, and then commit literally billions of dollars to make it worse, dollars that future generations will be forced to repay.

This may seem like a simple, routine technical matter.  It’s not.  It’s an irrevocable commitment to burn our state, to cower in ignorance in the face of an existential challenge, and an effort to cling to an outdated ideology that created this problem.

 

Miami’s E-Scooters: Revisiting the Double Standard

In Miami, e-scooters pay four to 50 times as much to use the public roads as cars

If we want to encourage greener, safer travel, we should align the prices we charge with our values

Florida is home to some of the most unsafe cities for pedestrians and cyclists. Miami is currently attempting to change its image as an auto-dominated city and create a more inclusive transportation network along its downtown streets. The city hopes to increase the safety of bikers, pedestrians, and e-scooter riders. The Miami-Dade County Department of Transportation and Public Works plans to add bollards and concrete barriers along three miles of city streets to create protected lanes for bikes and scooters. Part of the plan to pay for this new infrastructure? A daily fee on e-scooters. The tax code reads that:

“operators shall remit to the city a motorized scooter fee in an amount of $1.00 per motorized scooter per day. The motorized scooter fee shall be calculated monthly based on the number of scooters authorized by the city of the current period…. During the duration of the pilot program, this motorized scooter fee shall be designated for sidewalk/sidewalk area, and/or street improvements within pilot program area.”

At City Observatory, we examined Portland’s scooter pilot program in 2019. The introduction of e-scooters to the city had positive results. Scooters were most used during peak travel hours, consequently reducing congestion. They provided a greener solution for people making short trips throughout the city’s downtown. The program was successful and popular, but it left us wondering about how differently the Portland Bureau of Transportation treats these micro-mobility solutions as opposed to large, gas-guzzling automobiles. In Portland, we estimated that scooter riders were paying ten times as much in fees per mile traveled as car users pay in gas taxes. By contrast, if payments were based on the amount of space taken up on the streets or the pollution generated, drivers would have been paying significantly more than scooter users. 

Miami’s new pilot program is another example of this stark double standard. Once again, we focus on how much the city charges different vehicles to use its roads. Miami’s tax code charges each scooter $1 per day. How does that compare to what it charges for a car? 

Gas taxes are a key source of local and state revenue for road infrastructure. The total tax imposed in Miami-Dade County on gasoline sums to 36.6 cents per gallon. If we assume the average car gets about 25 miles per gallon, the average vehicle in Miami pays a little more than 1 cent per mile traveled. Miami residents drive 19.2 miles daily according to the Federal Highway Administration. After some quick math, our estimate for the daily price car users pay in Miami comes to roughly 25 cents. This means e-scooters pay four times as much per day as cars do.
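The "quick math" above can be laid out explicitly. Note that the 25 mpg fuel economy is an assumption, as stated; the unrounded result comes to about 28 cents a day rather than 25, but the roughly four-to-one ratio holds either way:

```python
# Estimated daily road-use charge paid by a typical Miami driver via the
# gas tax, using the figures above. The 25 mpg average is an assumption.
gas_tax_per_gallon = 0.366    # Miami-Dade total gas tax, in dollars
miles_per_gallon = 25         # assumed average fuel economy
daily_miles = 19.2            # FHWA estimate of daily driving

tax_per_mile = gas_tax_per_gallon / miles_per_gallon   # ~1.5 cents/mile
daily_tax = tax_per_mile * daily_miles                 # ~28 cents/day

scooter_daily_fee = 1.00      # Miami's per-scooter daily fee
print(f"Car: {tax_per_mile * 100:.1f} cents/mile, {daily_tax * 100:.0f} cents/day")
print(f"Scooter fee is {scooter_daily_fee / daily_tax:.1f}x the car's daily gas tax")
```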

 

Another way to look at this is to consider the amount the city charges per mile. A case study analyzing the e-scooter program in Indianapolis provides us with a template of what micro-mobility travel could look like in a major city like Miami. The Indianapolis e-scooter program averaged 4,830 trips per day and a median number of scooters in service per day of 1,654. This gives us an estimate of 2.92 daily trips for a unique scooter. The median trip length in this program was 0.7 miles. If we assume that Miami would see similar daily usage, scooters paying the city’s daily dollar fee pay roughly 34.2 cents per trip and 48.9 cents per mile. As we estimated above, the average vehicle in Miami pays roughly 1 cent per mile traveled, nearly 50 times less than our e-scooter estimate. Scooters in Miami will travel significantly shorter distances than cars in the city; even so, they will likely pay vastly higher rates for using the roads. 
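The per-trip and per-mile figures follow directly from the Indianapolis numbers. A sketch, with the caveat that Indianapolis usage is only a proxy for Miami, and the 1 cent per mile car figure is the rounded estimate used above:

```python
# Per-trip and per-mile cost of Miami's $1/day scooter fee, using the
# Indianapolis usage figures above as a proxy for Miami ridership.
daily_fee = 1.00
trips_per_scooter = 4830 / 1654   # ~2.92 trips per scooter per day
median_trip_miles = 0.7

fee_per_trip = daily_fee / trips_per_scooter                      # ~34.2 cents
fee_per_mile = daily_fee / (trips_per_scooter * median_trip_miles)  # ~48.9 cents

car_tax_per_mile = 0.01   # the rounded ~1 cent/mile car estimate above
print(f"Scooter: {fee_per_trip * 100:.1f} cents/trip, {fee_per_mile * 100:.1f} cents/mile")
print(f"Scooters pay ~{fee_per_mile / car_tax_per_mile:.0f}x what cars pay per mile")
```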

The City of Miami is not making a poor decision in adding infrastructure to protect bikes, scooters, and pedestrians. The experiment in Portland showcased that there is a positive impact of increasing micro-mobility accessibility. There will be rewards from restructuring a car-dominated downtown to create safe, viable options for other modes of transportation. While the mission is in good faith, the partial funding by scooter fees raises the question: why are cars paying so much less each day?

Cars impose the most negative externalities onto the roadway. They are heavier. They take up more space. They create unsafe environments for other users of the road. Yet the 25-pound scooter, small enough to sit on the sidewalk, pays four times as much to use the road each day. Research performed in London finds that e-scooter rentals could replace 5 million car trips, reducing both traffic congestion and CO2 emissions. These micromobility options provide a path to a transportation system that is safer, greener, and more efficient. If this is the future we want, it is imperative that we appropriately charge the most damaging and dangerous vehicles on our roads. Our prices ought to reflect our goals and our values. The disparities between what we charge those who use cars and scooters are a double standard that transportation departments must consider. Shouldn’t cars be paying more to improve the roads and make the city a safer place?

The Bum’s Rush

The $800 million project transitions from “nothing has been decided” to “nothing can be changed”

There’s a kind of calculated phase-shift in the way transportation departments talk about major projects.  For a long, long time, they’ll respond to any challenges or questions by claiming that “nothing has been decided,” that a project is still being designed, that it’s in its infancy, and that objections will be dealt with . . . at some point in the indefinite future.

But then a magic moment occurs, with no notice or observable event, and they’ll suddenly proclaim that it’s too late to raise any questions or consider any changes.

That’s exactly what’s happened with the Oregon Department of Transportation’s $800 million I-5 Rose Quarter project in the past few weeks.

You’ve got concerns?  Not to worry, nothing’s been decided

As recently as last fall, ODOT’s Director of Urban Mobility, Brendan Finn, was telling OPB’s Think Out Loud host Dave Miller that the project was only 15 percent designed, and that there was lots of opportunity for the community to shape the project:

Well, the project is still pretty much in its infancy, it’s only being at 15 percent design.  I don’t clearly remember the exact verbiage as far as that.  The House Bill that was passed that created the Rose Quarter project, HB 2017, did have certain parameters in it that were expected from the Legislature, and that was one of them.  That said, there is almost an amazing opportunity here to connect neighborhoods and to provide not only multi-modal options, but  community connections.  And for us, making this move right now is signaling to the community . . . especially . . . those who have left the process, that we are willing to do things differently, we are ready to change, we are ready to be deliberative about our commitment to our shared values around restorative justice.

And very publicly at that time, ODOT convened an “Historic Albina Advisory Board” (after blowing up two other efforts at community engagement) and spent several million dollars hiring a team of consultants to conduct an independent assessment of the freeway covers, undertaking both a technical analysis and seeking public opinion.  This work developed a series of alternatives that vary considerably from the proposal being designed by ODOT: devoting more space to housing, better reconnecting the urban street grid, and moving freeway on- and off-ramps away from the center of the project.

According to ZGF: Oregon DOT’s plan for the Rose Quarter produced irregular parcels that would be “challenging to develop” and create a complex, unintuitive street system.

This process is proceeding according to the timeline ODOT announced when it appointed this new board last year.  In May, consultants presented their analysis and recommendations to the Historic Albina Advisory Board and the project’s steering committee.  Their alternatives included two that rated much higher, both technically and in community support, and that would significantly redesign the project.

Sorry, time’s up: Too late to make any changes

Now, ODOT says, it’s simply too late to think about doing anything different than what the agency first planned.

The community has been pushing for buildable covers for years, and now ODOT says, that any consideration of covers will create unacceptable further delays.  Last week, in response to support by Oregon’s two Senators and local Congressman for buildable covers, ODOT said it was basically too late to think about doing that, according to the Portland Mercury:

A spokesperson for ODOT told the Mercury that while getting the cost of the caps fully funded by federal dollars would be “a dream,” there are still other obstacles to consider when building more substantial freeway caps. In order to build caps capable of supporting five-story buildings, the caps would need larger support pillars on either side of the freeway to ensure that there is enough structural strength to support the buildings.

“That could mean further acquisitions of land in the area, potentially displacing businesses,” said April deLeon, an ODOT spokesperson.

According to deLeon, changing the design of the caps now could also add time delays. The Federal Highway Administration would need to approve the new cap design, but would be under no timeline to do so. That time delay could make the project more expensive due to the cost of inflation.

It’s also worth noting that ODOT’s own consultants have said that buildable covers would be much more economical, if only ODOT would narrow the overly wide project to just the two additional lanes it says it needs, and build shoulders comparable to other urban freeway projects.  According to ODOT, it will build stronger covers only if it gets to condemn other people’s land, not if it has to give up any of the monstrously oversized roadway it intends to build (and then to re-stripe into a ten-lane freeway).

ODOT’s reflexive claim that it’s too late, and too costly, to even consider buildable caps shows that its claims of interest in “restorative justice” are just a sham.  What they really want to do is build a wider freeway, and they’ll engage in whatever performative theatre they think is needed to convince people they care.

Kudos to OPB’s Dave Miller for asking hard questions last fall.  But what the media generally fails to do is follow the thread and insist on accountability.  At what point did the project go from “infancy” to unchangeable?  Who made that decision?  Deus ex machina is a great literary device, but it’s no way to run a government.

How highways finally crushed Black Tulsa

Tulsa’s Greenwood neighborhood survived the 1921 race massacre, only to be ultimately destroyed by a more unrelenting foe: Interstate highways

Black Tulsans quickly rebuilt Greenwood in the 1920s, and it flourished for decades, but was ultimately done in by freeway construction and urban renewal

Even now, Tulsa has money for more road widening, but apparently nothing for reparations.

The past week has marked the centennial of the Tulsa Race Massacre, when hundreds of Black residents of Tulsa’s Greenwood neighborhood were brutally killed and the neighborhood was leveled.  Recent news stories have made more Americans aware of this tragic chapter of our history that unfolded in May 1921:  Greenwood’s residents were shot and beaten, their homes and businesses burned and their neighborhood bombed.

What’s less known is that despite their best efforts, the violent racists didn’t kill Greenwood.  In fact, the neighborhood’s Black residents returned and rebuilt, in less than five years.  The rapid rebuilding, in spite of the obstacles put in its way by the City of Tulsa and continued racial discrimination, drew praise for the neighborhood’s resilience.  In 1926, W.E.B. Du Bois wrote “Black Tulsa is a happy city,” saying:

Five little years ago, fire and blood and robbery leveled it to the ground. Scars are there, but the city is impudent and noisy. It believes in itself. Thank God for the grit of Black Tulsa.

Rebuilt and thriving Greenwood in North Tulsa, 1930s.

Greenwood’s heyday stretched into the 1950s, and even its moniker, “The Black Wall Street,” dates from this period, not from before the massacre.

What finally killed Greenwood wasn’t an angry racist mob:  it was the federally funded Interstate highway system.  Coupled with urban renewal, highways built through North Tulsa’s Greenwood neighborhood in the late 1960s did what the Klan and white racists couldn’t do:  demolish and depopulate the place.  That’s the key conclusion of a newly published book by Carlos Moreno, which chronicles the neighborhood’s destruction, re-birth and ultimate demise at the hands of the highway builders.  NBC News interviewed Moreno, and reported:

In his new book set to be released next week, “The Victory of Greenwood,” Moreno explores how the neighborhood had a second renaissance led by Black Tulsans after the massacre, rebuilding even bigger than before. It was not the bloodshed that eventually destroyed most of Greenwood, however; rather, it was this, he said, pointing to the spaghetti of interchanges to the south and the expressway that stretches north.

Carlos Moreno & the freeway that finally crushed Greenwood (NBC News)

In an essay at Next City, Moreno explains:

What often gets erased from Greenwood’s history is its 45 years of prosperity after the massacre and the events that led to Greenwood’s second destruction: The Federal-Aid Highway Acts of 1965 and 1968. As early as 1957, Tulsa’s Comprehensive Plan included creating a ring road (locally dubbed the Inner-Dispersal Loop, or IDL); a tangle of four highways encircling the downtown area. The north (I-244) and east (U.S. 75) sections of the IDL were designed to replace the dense, diverse, mixed-use, mixed-income, pedestrian, and transit-oriented Greenwood and Kendall-Whittier neighborhoods.

As in so many other US cities, the construction of freeways was used to demolish, divide and isolate communities of color.  President Biden acknowledged the federal government’s role in Greenwood’s decline in his proclamation of a Day of Remembrance:

And in later decades, Federal investment, including Federal highway construction, tore down and cut off parts of the community. The attack on Black families and Black wealth in Greenwood persisted across generations.

At a community conversation sponsored by Tulsa Urbanists, Moreno summarized his research on the role of the 1921 race massacre and later highway building and urban renewal efforts in destroying Greenwood:

Greenwood looks the way it does today, not because of the massacre.  It came back. And there’s a video footage of that. But North Tulsa/Greenwood look the way it does today because of the federal highway project, and because of urban renewal . . .

And Tulsa continues to widen highways even as it refuses to discuss reparations for the destruction of Greenwood.  Again, Moreno:

Somehow it’s okay for Tulsa to pay $36 million to repair one mile of road between 81st and 91st on Yale in South Tulsa, like that’s okay. We have no problems doing that. We just passed a road widening bill and every single citizen of Tulsa is paying a part of that $36 million. But somehow reparations for Greenwood is a non starter . . .

A hat tip to Next City for publishing Carlos Moreno’s synopsis of his book, and to Graham Lee Brewer of NBC News for his reporting on how the highway’s wounds still trouble Greenwood to this day.

It’s hard to imagine anything more hateful and horrific than the bloody attack on Greenwood in May, 1921.  In the past century, that kind of violent overt racism has given way to a more subtle, more pernicious and more devastating  kind of systemic or institutional racism, in the form of highway construction.

Single-Family Zoning and Exclusion in L.A. County: Part 1

Single-family zoning, a policy that bans apartments, is widespread in Los Angeles County. The median city bans apartments on 80% of its land for housing.

Cities with more widespread single-family zoning have higher median incomes, more expensive housing, and higher rates of homeownership.

Single-family zoning blocks renter households and low- and moderate-income households from accessing affordable housing in affluent, high-opportunity cities.

Editor’s Note:  City Observatory is pleased to publish this guest commentary by Anthony Dedousis of Abundant Housing LA.

For over a century, single-family zoning has defined the landscape of American cities. Single-family zoning, which prohibits the construction of apartments, including modest townhouses and duplexes, essentially mandates single-family detached houses as the only legal housing option. While single-family zoning predominates in suburbs, it is surprisingly common even in large cities, where apartments are often banned on over 75% of the land zoned for housing.

When a city limits housing types, it limits who can live in that city. Single-family homes are more expensive than apartments; limiting the housing stock to standalone homes thus raises the cost of living. Single-family zoning fits into a constellation of land use restrictions, including minimum lot sizes and on-site parking requirements, which raise housing costs and create barriers based on income and race. 

Here in California, efforts to reform exclusionary zoning and promote denser housing have met with fierce opposition from wealthy homeowners and anti-gentrification activists. Even modest reform proposals like Senate Bill 9, which would legalize duplexes, have attracted hostility from similar corners. As policy and research director at Abundant Housing LA, a “Yes In My Back Yard” organization that advocates for more housing, I sought to better understand the interplay between zoning, income, and race in the Los Angeles area. The outstanding examination of single-family zoning in the Bay Area by the UC Berkeley Othering and Belonging Institute inspired my methodology and interest in this topic.

By analyzing zoning data from the Southern California Association of Governments and demographic data from the American Community Survey, I determined how widespread single-family zoning is in each of L.A. County’s 88 cities. I found that cities with more pervasive single-family zoning tend to have higher median household incomes, higher median home values, higher homeownership rates, and are often more racially segregated.

This two-part series will dive deep into the connection between these variables: Part 1 explores income, housing costs, and homeownership, while Part 2 will focus on race and segregation.

Single-Family Zoning, Income, and Housing Costs

Apartment bans are widespread in L.A. County; the median city zones over 80% of the land for housing as single-family. Even in the City of Los Angeles, America’s second-largest city, apartments are banned on 80% of the residentially-zoned land. 

Cities with widespread apartment bans are often high-income enclaves, like Calabasas in the western San Fernando Valley, Palos Verdes Estates in the South Bay, and La Canada Flintridge in the San Gabriel Valley. However, single-family zoning is common even in some lower-income cities, like Compton, South Gate, and Lancaster.

Some pairs of neighboring cities hint at a link between single-family zoning and cities’ socioeconomic makeup. In Lawndale, a lower-income city in the South Bay with a Latino majority, just 8% of the land for housing is zoned single-family. Its neighbor, Torrance, bans apartments on 80% of the land zoned for housing. Torrance’s median household income is $34,000 higher than Lawndale’s, and just 17% of the city’s population is Latino. Other city pairs, like Claremont/Pomona and Cerritos/Hawaiian Gardens, have similar dynamics.

To explore this link, I grouped cities into five buckets, based on their prevalence of single-family zoning:

  • 0-50% of residential land zoned single-family (13 cities)
  • 50-70% of residential land zoned single-family (11 cities)
  • 70-80% of residential land zoned single-family (16 cities)
  • 80-90% of residential land zoned single-family (23 cities)
  • 90+% of residential land zoned single-family (23 cities)
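As a minimal sketch, this bucketing can be expressed as a simple threshold function. (The article’s labels overlap at the boundaries—80% appears in both the 70-80% and 80-90% buckets—so this version, an assumption of mine, assigns boundary values to the higher bucket.)

```python
def zoning_bucket(pct_single_family):
    """Assign a city to one of the five zoning buckets, given the share
    (0-100) of its residential land zoned single-family.
    Boundary values go to the higher bucket by assumption."""
    if pct_single_family < 50:
        return "0-50%"
    if pct_single_family < 70:
        return "50-70%"
    if pct_single_family < 80:
        return "70-80%"
    if pct_single_family < 90:
        return "80-90%"
    return "90+%"


# For example, Lawndale (8% single-family) and Torrance (80%) from the
# comparison above land in the lowest and second-highest buckets:
print(zoning_bucket(8))   # "0-50%"
print(zoning_bucket(80))  # "80-90%"
```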

For those who want more information about individual cities, here is a spreadsheet showing which zoning bucket each city falls into, as well as other key statistics.

There’s a clear link between more single-family zoning and higher household incomes. In cities with the most single-family zoning, median incomes are more than twice as high as in cities with the least single-family zoning.

Cities with more single-family zoning also tend to have more expensive homes.

The median home value in the most restrictive cities is nearly $700,000, 63% higher than in the least restrictive cities.

Cities with more single-family zoning also have much higher rates of homeownership, especially in very restrictive cities. This stands to reason: apartment-dwellers tend to be renters, so a widespread ban on apartments acts as a backdoor ban on renter households.

Next, I looked at the relationship between single-family zoning and income for individual cities. Here, it’s important to remember that many suburbs were incorporated explicitly to maintain income and racial barriers; for example, suburban Lakewood incorporated in the mid-1950s to avoid annexation by larger, more urban Long Beach. It’s also worth noting that the City of Los Angeles is by far the largest city in L.A. County by population and area, and that zoning, income, and housing costs vary hugely between the city’s neighborhoods.

With these caveats in mind, we can see that nearly every city with a high median household income also has a high prevalence of single-family zoning. There are almost no cities with high incomes and low single-family zoning prevalence, though 20 cities have both high single-family zoning (above 75%) and below-average median household incomes.

Similar patterns emerge when we compare single-family zoning prevalence to median home values and to homeownership rates.


Taken together, we can see that cities with wealthier populations, more expensive homes, and higher rates of homeownership are likelier to have widespread apartment bans. Single-family zoning thus acts as a barrier preventing renter households and low- and moderate-income households from accessing affordable housing in affluent, high-opportunity cities, a point that the Berkeley team made convincingly in its Bay Area analysis.

Analyzing housing policy, income, wealth, and exclusion in cities also requires us to analyze race and segregation. In Part 2, I will dig into the relationship between cities’ zoning, racial composition, and prevalence of segregation.


Single-Family Zoning and Exclusion in L.A. County: Part 2

Single-family zoning, a policy that bans apartments, is widespread in Los Angeles County. The median city bans apartments on 80% of its land for housing.

Cities with more widespread single-family zoning have higher white and Asian population shares, and lower Black and Latino population shares.

Cities with more widespread single-family zoning are more segregated relative to Los Angeles County as a whole.

Single-family zoning acts as a significant barrier to Black and Latino Americans accessing affordable housing in affluent, high-opportunity cities.

Editor’s Note:  City Observatory is pleased to publish this guest commentary by Anthony Dedousis of Abundant Housing LA.

In Part 1 of this series, I examined single-family zoning in the cities of Los Angeles County, and found that cities with more restrictive zoning tend to have higher median incomes, housing costs, and homeownership rates. In Part 2, I dive into the link between zoning and cities’ racial composition, and found that cities with more pervasive single-family zoning tend to have lower Black and Latino population shares, and are more racially segregated relative to Los Angeles County as a whole.

Sadly, this was not a surprising finding: in the United States, income and race are closely linked, and racial discrimination helps to explain much of the wealth gap between white and nonwhite Americans. Also, in many cities, single-family zoning was often introduced as a way to maintain race-based barriers to housing opportunities without violating civil rights laws.

In 1916, Berkeley, California became the first American city to implement single-family zoning; one motivating factor was a desire to exclude Black-owned businesses and families from a white neighborhood. After a 1917 Supreme Court case overturned local laws forbidding Black Americans from purchasing homes in white neighborhoods, many cities used exclusionary zoning as a way around the ruling.

Even after redlining and restrictive covenants were banned in the 1960s, single-family zoning offered an ostensibly race-neutral way to maintain patterns of segregation into the present. Studies have linked more restrictive zoning to lower Black and Latino population shares in Boston-area neighborhoods and in New York’s suburbs. A Berkeley analysis of Bay Area zoning found that cities with more pervasive single-family zoning were likelier to have a higher white population share, higher median incomes, more expensive homes, better access to top schools, and superior access to economic opportunity.

Present-day L.A. County is both incredibly diverse and also highly segregated. While the county overall is 48% Latino, 26% white, 14% Asian, and 8% Black, the racial composition of individual cities varies dramatically. Cities with more widespread single-family zoning tend to be more white, more Asian, less Latino, and less Black. Cities with the least single-family zoning tend to be majority-Latino.

To explore further, I calculated a “Segregation Index” score, which compares each city’s racial composition to the county’s overall demographics. (This is similar to the Divergence Index, the UC Berkeley team’s preferred metric of segregation, scaled so that 100 represents the most segregated city in the county.)
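Assuming the Segregation Index works like the Berkeley Divergence Index—a relative-entropy comparison of each city’s racial shares against the county’s—a minimal sketch of the calculation might look like the following. (The function names and the residual “other” category are my own; the county shares are the figures cited later in this piece.)

```python
import math

# County-wide composition per the article: 48% Latino, 26% white,
# 14% Asian, 8% Black; the remainder is grouped here as "other".
COUNTY = {"latino": 0.48, "white": 0.26, "asian": 0.14,
          "black": 0.08, "other": 0.04}

def divergence(city_shares, county_shares=COUNTY):
    """KL-style divergence of a city's composition from the county's.
    Zero when the city mirrors the county; larger means more segregated."""
    return sum(p * math.log(p / county_shares[g])
               for g, p in city_shares.items() if p > 0)

def segregation_index(cities):
    """Rescale raw divergence so the most segregated city scores 100."""
    raw = {name: divergence(shares) for name, shares in cities.items()}
    top = max(raw.values())
    return {name: 100 * d / top for name, d in raw.items()}
```

A city whose racial makeup exactly matches the county scores zero; a hypothetical 80%-white enclave would anchor the top of the scale at 100.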

In Los Angeles County, cities with higher Segregation Index scores tend to have larger white and Asian populations and smaller Latino and Black communities, relative to the county average. High-income cities in the South Bay and western San Fernando Valley, as well as Beverly Hills, Glendale, and Burbank, have high Segregation Index scores and high prevalence of single-family zoning.

Segregation index by city (100 = most segregated)

Again, grouping cities into five buckets based on the prevalence of single-family zoning shows that cities with more widespread single-family zoning tend to have higher Segregation Index scores. Cities with the most restrictive zoning have Segregation Index scores nearly ten times higher than those with the least restrictive zoning.

Additionally, while not every city with heavy single-family zoning is highly segregated, the most segregated cities generally have widespread single-family zoning.

As with income, this shows how single-family zoning can create or maintain patterns of ethnic exclusion. Single-family homes are more expensive to buy or rent than apartments, and given America’s significant racial income and wealth gap, regulations that mandate more expensive types of housing in a city effectively make that city less affordable to Black and Latino Americans. Barriers to apartment production therefore block many Black and Latino Americans from moving to higher-income areas that offer upward economic mobility, and reinforce the concentration of lower-income families in low-opportunity neighborhoods.

Single-family zoning has a heavy social and economic cost: it makes it harder for families with low or moderate incomes, who are disproportionately Black and Latino, to move to prosperous cities with good schools and jobs. And by raising the cost of housing, it excludes these households from homeownership opportunities. This is profoundly unfair and unnecessary; simply legalizing apartments in these affluent cities would do much to create more affordable housing and homeownership opportunities for people of all backgrounds.

Building a society where opportunity and prosperity are widely enjoyed, regardless of one’s income, skin color, or place of birth, requires us to end single-family zoning. Ending single-family zoning does not mean banning single-family homes. It would simply make single-family houses one of several types of housing that are permissible. Single-family houses would continue to exist alongside duplexes, bungalow courts, and larger apartment buildings, as they already do in neighborhoods throughout California.

Fortunately, a growing number of cities are embracing zoning reform. Minneapolis and Portland were first to legalize small apartment buildings citywide. In 2019, California legalized accessory dwelling units and has seen strong ADU production as a result. Sacramento recently voted to allow up to four homes on any residential parcel, and Berkeley is poised to do the same. Even Santa Monica, no stranger to NIMBY politics, may allow modestly higher housing density in single-family zoned areas. 

Legalizing apartments will make these cities more affordable and welcoming to people of all walks of life. It’s time for every city across America to do the same.

State DOTs can and should build housing to mitigate highway impacts

If ODOT is serious about “restorative justice,” it should mitigate highway damage by building housing

Around the country, states are subsidizing affordable housing to mitigate the damage done by highway projects

Mitigation is part of NEPA requirements and complying with federal Environmental Justice policy

The construction of urban highways has devastating effects on nearby neighborhoods.  Not only has highway building directly demolished housing to provide space for roadways; the surge of traffic typically undermines the desirability of nearby homes and neighborhoods, depressing home values, eroding neighborhood economic health, and driving population out-migration.  That story has been told numerous times across the US; we’ve detailed how the Oregon Department of Transportation’s decisions to build three huge highway projects in the 1950s, 1960s and 1970s decimated Portland’s Albina neighborhood.  This predominantly Black neighborhood lost two-thirds of its population over a little more than two decades. At the time, no one gave much thought to the loss of hundreds or thousands of housing units, or to the effect on these neighborhoods.  But increasingly, state highway agencies are looking to mitigate the negative effects of current and past highway construction by subsidizing housing in affected neighborhoods.  Here are three examples from around the country.

The Oregon Department of Transportation claims that it is interested in “restorative justice” for the Albina community, which has identified housing as one of the keys to building wealth and restoring the neighborhood.  ODOT’s project illustrations show how hundreds of housing units might be built near the project—but these are just vaporware, as ODOT hasn’t committed to spending a dime of its own money to make that happen, or to replace the homes it demolished over the decades.  Based on the past and current experience of other states, there’s no reason ODOT can’t treat investments in housing as mitigation, just as it routinely spends a portion of its highway budget on restoring wildlife habitat, creating new wetlands, or even replacing jail cells.  A real restorative justice commitment would make up for the damage done, as these examples show.

Lexington, Kentucky:  A community land trust funded with highway funds

For decades, Kentucky’s highway department had been planning a freeway expansion through Davis Bottom, an historically African-American neighborhood.  The threat of freeway construction helped trigger the decline of the neighborhood.  When the road was finally built a little more than a decade ago, the state highway agency committed to restoring the damage done to the area by investing in housing.  As part of the Newtown Pike Extension project, the Kentucky Transportation Cabinet acquired 25 acres of land and provided funding to establish a community land trust for the construction of up to 100 homes.

In 2008, the Federal Highway Administration gave the project an award, saying:

The Davistown project is the first CLT ever created with FHWA Highway Trust Funds. Eighty percent of the project, including the acquisition of CLT land and the redevelopment of the neighborhood, will be funded with these FHWA funds.

The FHWA Environmental Justice guide highlights the Lexington CLT as a “best practice.”

Houston, Texas:  $27 million to build affordable housing to mitigate interstate freeway widening

Houston’s I-45 “North Houston Highway Improvement Project” would, like the I-5 Rose Quarter project, widen a freeway through an urban neighborhood.  The Texas Department of Transportation, as part of the project’s environmental impact review process, has dedicated $27 million to build or improve affordable housing in neighborhoods affected by the freeway.


Reno, Nevada:  State DOT providing land and money to cities and counties for affordable housing

Nevada DOT has committed to using highway funds to pay for housing to mitigate the effects of freeway expansion in Reno at the junction of I-80 and US 395.

NDOT will provide funds or land already owned by NDOT to others (Cities of Reno or Sparks, Washoe County) to build affordable replacement housing for non-Reno Housing Authority displacements. Those displaced by this project who wish to remain in the area will be given priority access to the replacement housing. After those needs have been addressed, the affordable housing will then be made available to those who qualify for affordable housing but were not displaced by the project. Residents will be considered eligible for this replacement affordable housing if they meet Section 8 eligibility requirements or Reno Housing Authority’s Admission and Continued Occupancy Policy (Reno Housing Authority 2018).

Federal Regulations encourage or require mitigating impacts.

The key environmental law governing federal highway projects is the National Environmental Policy Act.  It requires that agencies identify the adverse environmental impacts of their decisions, and then avoid, minimize or mitigate those impacts.  In particular, NEPA mitigation includes “restoring the affected environment” and “compensating for the impact by . . . providing substitute resources or environments.”  Using highway funds to replace housing demolished by a freeway is one key way to mitigate the negative effects of a highway project on an urban community.

40 CFR § 1508.20 Mitigation.

Mitigation includes:

(a) Avoiding the impact altogether by not taking a certain action or parts of an action.

(b) Minimizing impacts by limiting the degree or magnitude of the action and its implementation.

(c) Rectifying the impact by repairing, rehabilitating, or restoring the affected environment.

(d) Reducing or eliminating the impact over time by preservation and maintenance operations during the life of the action.

(e) Compensating for the impact by replacing or providing substitute resources or environments.

A federal executive order on Environmental Justice directs agencies to pay particular attention to the impacts, including the cumulative impacts, of agency decisions on low- and moderate-income people and people of color.  The Federal Highway Administration’s Environmental Justice Policy specifically identifies impacts on neighborhoods as “adverse effects” of federal highway projects, and calls for both mitigating these impacts and considering alternatives that minimize adverse impacts on communities.

  1. Adverse Effects. The totality of significant individual or cumulative human health or environmental effects, including interrelated social and economic effects, which may include, but are not limited to: bodily impairment, infirmity, illness or death; air, noise, and water pollution and soil contamination; destruction or disruption of human-made or natural resources; destruction or diminution of aesthetic values; destruction or disruption of community cohesion or a community’s economic vitality; destruction or disruption of the availability of public and private facilities and services; vibration; adverse employment effects; displacement of persons, businesses, farms, or nonprofit organizations; increased traffic congestion, isolation, exclusion or separation of minority or low-income individuals within a given community or from the broader community; and the denial of, reduction in, or significant delay in the receipt of, benefits of FHWA programs, policies, or activities.

FHWA Policy:  It is FHWA’s stated policy to mitigate these disproportionate effects by providing “offsetting benefits” to communities and neighborhoods, and also to consider alternatives that avoid or mitigate adverse impacts.

What is FHWA’s policy concerning Environmental Justice?  The FHWA will administer its governing statutes so as to identify and avoid discrimination and disproportionately high and adverse effects on minority populations and low-income populations by:

(2) proposing measures to avoid, minimize, and/or mitigate disproportionately high and adverse environmental or public health effects and interrelated social and economic effects, and providing offsetting benefits and opportunities to enhance communities, neighborhoods, and individuals affected by FHWA programs, policies, and activities, where permitted by law and consistent with EO 12898;

(3) considering alternatives to proposed programs, policies, and activities where such alternatives would result in avoiding and/or minimizing disproportionately high and adverse human health or environmental impacts, where permitted by law and consistent with EO 12898

Taken together, the NEPA requirements for mitigation and the FHWA’s policy on environmental justice require FHWA projects—like the I-5 Rose Quarter freeway widening—to address the cumulative totality of the project’s effects on the neighborhood, including the disruption of community cohesion, the displacement of people and businesses, and increased traffic congestion.  The current project adds, as we have shown, to a long history of federally supported highway projects in the Albina neighborhood that have had devastating cumulative effects, particularly the destruction of hundreds of housing units, which are essential to the economic well-being of the neighborhood and its residents, who historically have been lower income and people of color.  It is fully consistent with federal environmental policy and environmental justice requirements for ODOT to devote funds to rebuilding housing as a way to mitigate the damage it has done to the Albina neighborhood.


The real “I-5” project: $5 billion, 5 miles, $5 tolls

The intentionally misleading re-brand of the failed Columbia River Crossing conceals the key fact that it is a 12-lane wide, 5 mile long freeway that just happens to cross a river, not a “bridge replacement.”

It’s vastly oversized and overpriced, with current cost estimates ranging as high as nearly $5 billion (before cost overruns), which will necessitate round-trip tolls of at least $5 for everyone using the bridge.

This part of the “I-5 bridge replacement” isn’t even the bridge, it’s the widened approach on Hayden Island. (Bike Portland)

Almost a decade ago, plans for a gigantic freeway widening between Portland and Vancouver collapsed in the face of budget concerns and deep community disagreements about the project.  For the past year, the Oregon and Washington transportation departments have been trying to breathe life into the zombie project—with the help of $40 million in consultants. The project’s PR-led marketing effort has systematically concealed the fundamental facts of the project, while promoting meaningless, unquantified and unenforceable platitudes about promoting equity and responding to climate change.

Like the original Columbia River Crossing (CRC) project, it’s an intellectually bankrupt sales pitch, not an honest conversation about alternatives.

It’s been apparent for months now that ODOT and WSDOT are trying to pressure the two states into recycling the project’s current record of decision—now more than a decade old.  That “ROD,” as it’s called, specifies the massive freeway expansion illustrated above.  While the agency is hinting at the possibility of “design” tweaks, it’s apparent that the plan is simply to recycle the failed CRC proposal.

They’ve rebranded it the “I-5 Bridge Replacement,” but that’s an intentionally misleading title.  Sounds innocuous, right?  Who can be against merely “replacing” a bridge?

The only part of that branding that’s right is the number 5.

But they’ve left out the real meaning of the “5” in the title.  There are really three “5’s” that define this project.  According to the project’s own documents: it’s five miles long, it’ll cost $5 billion, and they’ll charge you $5 for a round trip.

It’s not a bridge “replacement”—it’s a 12-lane wide freeway, stretching five miles from Lombard to Mill Plain Boulevard, that just happens to cross a river.

It’s 12 lanes over the Columbia River, and even wider on Hayden Island, as the above illustration shows.  Congressman Peter DeFazio has called the plan “gold-plated.” (Manning, Jeff. “Columbia River Crossing could be a casualty of the federal budget crunch”, The Oregonian, August 14, 2011).

It’s going to cost upwards of $5 billion—and likely more, because these projects routinely have massive cost overruns.

According to their tolling financial estimates, which are part of the current finance plan, they’ll charge a minimum toll of $2.60 each way to cross the bridge, which works out to more than $5 per round trip. Peak tolls would be higher, and heavy trucks would pay four times as much as cars ($20 per round trip, minimum).
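The round-trip figures follow directly from the finance plan’s minimum toll; a quick sketch of the arithmetic (the variable names are mine):

```python
# Minimum one-way car toll from the project's finance plan, with heavy
# trucks tolled at four times the car rate.
CAR_TOLL_ONE_WAY = 2.60   # dollars
TRUCK_MULTIPLIER = 4

car_round_trip = 2 * CAR_TOLL_ONE_WAY            # $5.20 — "more than $5"
truck_round_trip = car_round_trip * TRUCK_MULTIPLIER  # $20.80 — "at least $20"
```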

It’s time for ODOT and WSDOT to be honest about what they’re really proposing, the 5-5-5 project: 5 miles of freeway, $5 billion, $5 tolls per round trip.

Getting real about restorative justice in Albina

Drawings don’t constitute restorative justice

ODOT shows fancy drawings about what might be built, but isn’t talking about actually paying to build anything

Just building the housing shown in its diagrams would require $160 million to $260 million

Even that would replace only a fraction of the housing destroyed by ODOT highway building in Albina

The Oregon Department of Transportation is going to great lengths to cloak its $800 million I-5 Rose Quarter freeway widening project in the language of restorative justice.  Starting in the 1950s, ODOT built not just one or two, but three different highways through the historically Black Albina neighborhood; now it is back with plans to widen the largest of these, while pretending to care about restoring the neighborhood.  To that end, it has appointed an “Historic Albina Advisory Board”—after disbanding another community advisory group which had asked too many uncomfortable questions.

Housing is essential to restorative justice in Albina

The real key to restorative justice in Albina is more housing.  The aerial photo here shows the Albina neighborhood as it existed in 1948; the red-lined areas are properties taken or demolished for the construction of Interstate Avenue in 1951, the I-5 freeway in 1961, and the Fremont Bridge and aborted Prescott Freeway in the early 1970s.  Albina was torn apart by these ODOT highway projects and its housing stock decimated. Given that ODOT’s highway building triggered the destruction of thousands of homes in Albina (the neighborhood’s population declined by more than 60 percent, from 14,000 to less than 4,000), it’s hardly surprising that getting more housing built is a key priority.


The ODOT consultants report that one of the key strategies for building community wealth is to create affordable housing.

ODOT’s making it look like there will be housing as part of its plans

ODOT staff and consultants have presented the HAAB with surveys and focus group information on what restorative justice might look like.  An “Independent Cover Assessment” consulting group has even prepared drawings showing alternative development plans for the area near I-5.  The drawings feature examples from other cities of Black cultural and community facilities, and prominently include diagrams showing large new multifamily housing to be built on top of or near the freeway.  Here are two such diagrams (the yellow-colored buildings are residential apartments).  Concept 1 has four large residential buildings; Concept 5 has five.


But who’s going to pay for that housing?  It’s going to cost $160-260 million, and ODOT is offering . . . nothing.

It’s all well and good to talk about housing, but how, exactly, would it get built?  These colorful illustrations are just misleading puffery and magical thinking unless there’s a realistic plan for paying for the project.  The availability of the land is the easy part; the hard part is getting money for construction.  To get an idea of how much it would cost, we can look just a few blocks away to the newly completed Louisa Flowers building, which has 204 studio, one-bedroom and two-bedroom apartments.  Its development and construction cost about $71 million, and Home Forward, the city of Portland’s housing agency, paid $3 million for the site.  The project cost about $380 per square foot, with the overall cost per housing unit working out to about $350,000.

The Louisa Flowers affordable housing building in Portland (Portland Tribune)

The Independent Cover Assessment shows that the residential buildings (shown in yellow) in its two concepts would make up about half to 60 percent of the 900,000 to 1.1 million square feet of buildings to be built atop or adjacent to the freeway.  Those figures imply about 430,000 square feet, or about 470 apartments, in Concept 1, and 680,000 square feet, or roughly 740 apartments, in Concept 5.

If you could build those apartments for the same cost as Louisa Flowers (you couldn’t, of course; costs have gone up), that means the yellow colored buildings shown in these renderings would cost between $160 million and $260 million to develop and construct.
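The arithmetic behind that range can be sketched as follows, using the Louisa Flowers figures as the cost benchmark. (The ~915 square feet per unit is an implied average I back out from the figures above, not a number from the report.)

```python
# Cost benchmark from the Louisa Flowers project: ~$71M for 204 units,
# roughly $380 per square foot and ~$350,000 per apartment.
COST_PER_SQFT = 380   # dollars, 2019-era cost; real costs would be higher
AVG_UNIT_SQFT = 915   # implied average: ~430,000 sq ft / ~470 units

def housing_cost(residential_sqft):
    """Development + construction cost for a given residential floor area."""
    return residential_sqft * COST_PER_SQFT

def unit_count(residential_sqft):
    """Approximate apartment count at the implied average unit size."""
    return round(residential_sqft / AVG_UNIT_SQFT)

# Concept 1: ~430,000 residential sq ft; Concept 5: ~680,000 sq ft.
low = housing_cost(430_000)    # ≈ $163 million
high = housing_cost(680_000)   # ≈ $258 million
```

Those two endpoints are where the $160-260 million range comes from.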

In short, unless you’ve got between $160 and $260 million (just for the housing part, mind you), those pictures are just a fantasy.

If ODOT actually had a budget for the construction of that housing, in order to, you know, promote restorative justice, then it would be perfectly valid to include this as part of the project discussion.  But ODOT hasn’t committed a dime to actually paying to build this housing.

And you could say, in theory, (and it would have to be theoretical, because ODOT has made no such commitment), that ODOT would be contributing the land for these buildings.  But as noted in the case of Louisa Flowers, the cost of the land is less than 5 percent of the cost of constructing the project.  If housing is key to restorative justice in Albina, and if ODOT is committed to restorative justice, it seems like it ought to come up with the other 95 percent as well, rather than expecting unnamed others to do the heavy lifting for it.

Housing is essential to restorative justice in Albina.  Simply drawing pictures of housing isn’t justice; it’s cynical vaporware, an attempt to create the illusion that ODOT cares, when its only real interest is in building a vastly wider freeway.  The state highway department readily spent money to demolish housing in the 1950s, 1960s and 1970s, but apparently isn’t willing to spend any of its money to replace that housing today. And the irony is that if you really want restorative justice and more housing, you don’t need to build a wider freeway.  In fact, a wider road would make the area less desirable for housing. ODOT’s plan to widen the freeway—and further inundate this neighborhood with more car traffic—doesn’t so much heal the repeated wounds it has inflicted on Albina as make them even worse.


The NIMBYs made $6 trillion last year

In 2021, US residential values increased by $6.9 trillion, almost entirely due to price appreciation

Those gains went disproportionately to older, white, higher income households

Capital gains on housing in 2021 were ten times larger than the total income of the bottom 20 percent of the population.

Little of this income will be taxed due to the exemption on capital gains for owner-occupied homes

Gains to homeowners dwarf the profits made by developers, foreign investors, or Wall Street home buyers.

Rising home prices are a transfer of wealth to older generations from younger ones.

So much of our housing debate is a search for suitable villains on which to blame a lack of affordability.  Our problems must be due to rapacious developers, greedy landlords, absentee speculators buying up new housing and holding it off the market, and private companies buying up and renting out single-family homes.  These are the cartoon characters who get blamed for driving up prices.  But they aren’t the ones to blame, and they’re not the ones making a killing in the housing market.

Housing affordability melodrama: Where’s Snidely? Sirsalem1, CC BY-SA 4.0 via Wikimedia Commons

The real estate speculators reaping literally trillions of dollars of gains from our capitalist housing system are millions of homeowners, who, whether they acknowledge it or not, are the beneficiaries of “Not in my back yard” policies that have driven up the price of housing.  And this surge of homeowner wealth is a rotten development from the standpoint of addressing yawning disparities by race, income and generation, as the beneficiaries of these gains are statistically higher income, whiter and older than the overall US population.

Last year, according to calculations from Zillow, the value of existing residential real estate in the US grew by $6.9 trillion.  (New residential building—the construction and upgrading of homes and apartments—contributed about $800 billion more.)  The gain in home values in 2021 was nearly triple the roughly $2 trillion increase in residential values Zillow reported for 2020.  In the post-pandemic era, we’re getting a bit inured to counting “trillions.”  To put the amount of housing capital gains in perspective, the $6.9 trillion one-year increase in home values is more than ten times the total pre-tax income of the bottom twenty percent of US households (about $600 billion in 2018, according to the Congressional Budget Office).
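The scale comparison here is simple arithmetic; a quick sketch using the figures quoted above (all of which are rough approximations):

```python
# Scale check using the figures quoted above: Zillow's estimated 2021
# appreciation of existing homes and the CBO's estimate of total pre-tax
# income for the bottom income quintile.  Both are rough approximations.
existing_home_gain = 6.9e12       # 2021 gain in existing home values (Zillow)
bottom_quintile_income = 0.6e12   # bottom 20% total pre-tax income (CBO, 2018)

ratio = existing_home_gain / bottom_quintile_income
print(f"2021 housing capital gains were about {ratio:.1f} times "
      "the bottom quintile's total pre-tax income")
```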

Here’s another picture of how much housing wealth has been created.  The Federal Reserve tracks “homeowner’s equity”—the net amount of wealth that homeowners have after subtracting outstanding mortgage debt from home values.  After the collapse of the housing bubble, owner’s equity stood at about $10 trillion, and since then has ballooned to $26 trillion.

 

To get an idea of exactly who reaped those gains, we took a look at data compiled by the Federal Reserve Board on the demographics of homeownership. The Fed’s triennial Survey of Consumer Finances provides estimates by age, race and income of homeownership rates and the average value of housing for the nation’s households.  Using these data, we computed the number of households by race and age of the household head (which the Fed survey tactfully calls “the reference person”) and by the income of the household.  We’ve combined the value of owner-occupied residential property with other residential property owned by households (i.e. second homes, investment houses or apartments).  The Fed’s estimates (based on its household survey) are somewhat different from Zillow’s (derived from its database of homes), but both put the total value of US residential real estate in the $30-$40 trillion range.  To a first approximation, these data on the age, race and income of homeowners are our best guide to who reaped the $6 trillion in residential capital gains this year.  That assumption masks some variability in housing price appreciation across markets and classes of homes, but this should be a good rough indicator of the demographics of the nation’s housing wealth winners.
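The allocation method described above can be sketched in a few lines.  The shares are the ones cited later in this piece from the Fed’s Survey of Consumer Finances, and treating appreciation as uniform across groups is the simplifying assumption stated in the text:

```python
# A minimal sketch of the allocation described above: apportion the
# aggregate 2021 gain across groups by each group's share of residential
# wealth in the Fed's Survey of Consumer Finances.  Shares are those cited
# in this piece; uniform appreciation across groups is the stated
# simplifying assumption.
total_gain = 6.9e12  # Zillow's estimate of 2021 existing-home appreciation

wealth_shares = {
    "households headed by someone 55+": 0.56,
    "non-Hispanic white households": 0.80,
    "top income quintile": 0.59,
}

implied_gains = {group: share * total_gain for group, share in wealth_shares.items()}
for group, gain in implied_gains.items():
    print(f"{group}: ~${gain / 1e12:.1f} trillion")
```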

The gerontopoly of housing wealth

As we’ve noted before at City Observatory, older Americans hold most US housing wealth, and have been chalking up a disproportionate share of gains as housing has appreciated.  The latest data from the Federal Reserve show that households headed by a person aged 55 and older own 56 percent of all residential housing wealth in the US. It’s a fair guess that these older homeowners reaped most of the gain in home values in the past year.

As Ed Glaeser has pointed out, rising real housing costs are a straightforward transfer of wealth from younger generations (who must buy the now more expensive homes) to older generations (who own the housing, and will reap gains when it is sold).

The long shadow of race

For a long list of reasons—including discrimination in housing and labor markets, redlining, and segregation—households of color have been systematically denied the opportunities to accumulate housing wealth.  That pattern is still very much in evidence in the latest Fed data:  Non-Hispanic white households own almost 80 percent of all the housing wealth in the US, implying they also reaped 80 percent of the residential capital gains, or roughly $5.5 trillion in 2021.

Rising home prices effectively increase the wealth of white households relative to households of color.

High income households own most housing 

While homeownership is touted as a means of wealth accumulation, it has mostly worked out for high income households. While the ownership of real estate is not as skewed to high income households as is the ownership of financial assets like stocks or bonds, it is still the case that the highest income 20 percent of the population owns 59 percent of all the residential housing value in the US.

These data suggest that roughly $4 trillion of the 2021 gain in home values went to the top 20 percent of the population, meaning that their residential capital gains alone were nearly seven times the total pre-tax income of the bottom 20 percent of the population.

Housing appreciation is untaxed, which benefits older, white and wealthier households

The skewed ownership of housing wealth means that the gains in wealth are highly concentrated in households that are older, whiter, and higher income than other Americans.  But unlike wage income, income from housing appreciation is mostly untaxed. As a result, the capital gains exclusion for housing is regressive and inequitable.  The capital gains exclusion for owner-occupied real estate is much more valuable to high income households because they are more likely to own homes, own more expensive homes, and generally face higher tax rates than low income households.

In reality, the $6.9 trillion in capital gains that US residential property owners reaped in 2021 will be lightly taxed, to the extent they are taxed at all.  Federal law exempts from capital gains taxation the first $500,000 in gains on the sale of owner-occupied property (for married couples filing jointly).  That is to say, you would need more than $500,000 of appreciation to have any capital gains liability.  As a practical matter, few households pay capital gains taxes on residential real estate appreciation.  The tax-favored status of income from residential real-estate speculation is a quintessential feature of our system that attempts to promote wealth-building through home ownership.  While well intended, it systematically rewards older, whiter and wealthier households, and effectively denies opportunities to build wealth to the third of the population that rents.  In many ways, it is the worst of all worlds, making housing more expensive for those least able to afford it, and providing most of the gains to those who are already most advantaged.
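Mechanically, the exclusion works like this (hypothetical prices; the function name is ours, and this ignores basis adjustments and eligibility rules):

```python
# Illustrative sketch of the federal capital gains exclusion on the sale
# of an owner-occupied home, for a married couple filing jointly.
# Hypothetical numbers; ignores basis adjustments and eligibility rules.
EXCLUSION_JOINT = 500_000  # exclusion for married couples filing jointly

def taxable_gain(sale_price, cost_basis, exclusion=EXCLUSION_JOINT):
    """Gain subject to capital gains tax after the owner-occupied exclusion."""
    gain = max(sale_price - cost_basis, 0)
    return max(gain - exclusion, 0)

# A home bought for $300k and sold for $750k: a $450k gain, fully excluded.
print(taxable_gain(750_000, 300_000))    # 0
# Only appreciation beyond $500k is taxable.
print(taxable_gain(1_000_000, 300_000))  # 200000
```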

There’s one final irony here:  policies to broaden access to homeownership now, by providing subsidies or other support for lower income, younger, and minority homebuyers, don’t rectify these gaps; they likely make them worse.  Steps to amplify demand in a surging market tend to drive prices up further, which further enriches incumbent homeowners at the expense of first-time buyers.  If you could somehow enable people to buy houses at 1990 or 2010 prices, they could be assured of wealth gains; buying now offers no such expectation of long term gains. Promoting homeownership primarily helps those who are selling homes, not those who are buying them.

The search for villains

Rather than talk about the capital gains that flow to older, wealthier, whiter households, much of the housing debate is a melodrama, looking to cast suitably evil villains on which to blame the crisis.  It’s fashionable to finger Wall Street investors (who for the past decade or so have been buying up single family homes and renting them in many US markets), foreign buyers of luxury condominiums in New York, Miami, Seattle and other hot cities (who let the units sit vacant while speculating on higher values), and greedy developers, who make excessive profits by building new homes.  None of these supposed villains accounts for more than a trivial part of the problem; at most, they’re picking up crumbs, compared to the trillion-dollar gains logged by incumbent homeowners.

A recent article in the New York Times suggests Wall Street-backed investors now own as much as $60 billion in single family real estate.  That sounds ominous, but it’s less than 1 percent of the $35 trillion or so of residential investment in the US.  If all these investors earned a 10 percent capital gain in 2021, they would have collectively gotten about $6 billion, or less than a tenth of one percent of the $6.9 trillion gain in home values. It’s also fashionable to blame the construction of luxury condos in a few superstar cities—held vacant by rich, often foreign speculators.  The trouble is that such units are a tiny slice of the housing market, and there’s no evidence they affect overall housing costs.

And then there are the developers.  Supposedly they make a killing from building new housing. When housing prices are appreciating, especially as fast as they have in the past year, the profits that developers earn from building new housing are dwarfed by the capital gains reaped by existing homeowners.  Our friend Josh Lehner, an economist with the Oregon Office of Economic Analysis, has an insightful study estimating the profits earned by homeowners and developers in Oregon over the past decade.  Lehner estimates that, on average, developers reap a margin of about 14 percent on new housing construction.  By comparing that total (14 percent of the value of new housing built in any year) with the appreciation of the existing housing stock in that same year, Lehner is able to show how developers’ profits stack up against the capital gains enjoyed by incumbent homeowners.  It isn’t even close:

Applied to Zillow’s estimates of national level new construction, Lehner’s 14 percent of building value estimate suggests that developers netted on the order of $110 billion nationally in 2021, an amount equal to less than 2 percent of the gains that accrued to owners of existing homes.  It’s not the greedy developer that’s benefiting from rising home prices, it’s the NIMBYs next door who reap the gains.  As our colleague Daniel Kay Hertz has pointed out, we tend to conveniently forget that essentially all of the existing housing stock came into being, not by immaculate conception, but by the profit-motivated efforts of earlier generations of developers. If anything, because new development increases housing supply, it blunts housing price appreciation, so more development tends to increase affordability.

Wall Street investors, speculating oligarchs, and greedy developers all make signature villains in the housing affordability melodrama, but they really conceal the identity of those who are actually reaping the gains of rising housing prices.  It also hides the principal policy that’s driven the appreciation of residential real estate:  the dominance of a range of “Not in my back yard” policies, including excessive single family zoning, apartment bans, high development fees, parking requirements and a host of other policies that have made it harder to build housing, especially in the places people most want to live.

The experience of the past year illustrates the profoundly broken nature of our current strategy of “wealth building through home ownership.”  The benefits of home price appreciation accrue disproportionately to those who already have wealth, and if anything, they tend to worsen the existing disparities of wealth among households.  As existing housing appreciates, it increases the wealth of the incumbent homeowners, who are disproportionately white, older and wealthier, and drives up housing costs for those who don’t now own homes.  And our tax system amplifies these inequities by allowing nearly all of this income to go untaxed.  The myth of homeownership as a universal wealth building strategy is the real villain here.

A version of this commentary was originally published in 2021, and has been updated to reflect the latest data on home price appreciation estimated by Zillow.

 

 

Who got trillions? We found the real speculators profiting from higher housing costs

In 2020, US residential values increased by $2.2 trillion

Those gains went disproportionately to older, white, higher income households

Capital gains on housing in 2020 were more than three times larger than the total income of the bottom 20 percent of the population.

Little of this income will be taxed due to the exemption on capital gains for owner-occupied homes

Gains to homeowners dwarf the profits made by developers, foreign investors, or Wall Street home buyers.

Rising home prices are a transfer of wealth to older generations from younger ones.

So much of our housing debate is a search for suitable villains on which to blame a lack of affordability.  Our problems must be due to rapacious developers, greedy landlords, absentee speculators buying new housing and holding it off the market, and private companies buying up and renting out single family homes.  These are the cartoon characters who get blamed for driving up prices.  But they aren’t the ones to blame, and they’re not the ones who are making a killing in the housing market.

Housing affordability melodrama: Where’s Snidely? Sirsalem1, CC BY-SA 4.0 via Wikimedia Commons

At City Observatory, we’re proud to announce we’ve found the real estate speculators reaping the literally trillions of dollars of gains from our capitalist housing system:  millions of homeowners, who are statistically higher income, whiter and older than the overall US population.

Last year, according to calculations from Zillow, the value of existing residential real estate in the US grew by $2.2 trillion.  (New construction added a paltry $275 billion in new homes and apartments to that total).  Given current price trends, Zillow expects another $2 trillion or so increase in residential values in 2021.

In the post-pandemic era, we’re getting a bit inured to counting  “trillions.”  To put the amount of housing capital gains in perspective, the $2.2 trillion dollar one-year increase in home values is more than three times the total pre-tax income of the bottom twenty percent of US households (less than $600 billion in 2017, according to the Congressional Budget Office).

To get an idea of exactly who reaped those gains, we took a look at data compiled by the Federal Reserve Board on the demographics of homeownership. The Fed’s triennial Survey of Consumer Finances provides estimates by age, race and income of homeownership rates and the average value of housing for the nation’s households.  Using these data, we computed the number of households by race and age of the household head (which the Fed survey tactfully calls “the reference person”) and by the income of the household.  We’ve combined the value of owner-occupied residential property with other residential property owned by households (i.e. second homes, investment houses or apartments).  The Fed’s estimates (based on its household survey) are somewhat different from Zillow’s (derived from its database of homes), but both put the total value of US residential real estate in the $30-$40 trillion range.  To a first approximation, these data on the age, race and income of homeowners are our best guide to who reaped the $2.2 trillion in residential capital gains this year.  That assumption masks some variability in housing price appreciation across markets and classes of homes, but this should be a good rough indicator of the demographics of the nation’s housing wealth winners.

The gerontopoly of housing wealth

As we’ve noted before at City Observatory, older Americans hold most US housing wealth, and have been chalking up a disproportionate share of gains as housing has appreciated.  The latest data from the Federal Reserve show that households headed by a person aged 55 and older own 56 percent of all residential housing wealth in the US. It’s a fair guess that these older homeowners reaped most of the gain in home values in the past year.

As Ed Glaeser has pointed out, rising real housing costs are a straightforward transfer of wealth from younger generations (who must buy the now more expensive homes) to older generations (who own the housing, and will reap gains when it is sold).

The long shadow of race

For a long list of reasons—including discrimination in housing and labor markets, redlining, and segregation—households of color have been systematically denied the opportunities to accumulate housing wealth.  That pattern is still very much in evidence in the latest Fed data:  Non-Hispanic white households own almost 80 percent of all the housing wealth in the US, implying they also reaped 80 percent of the residential capital gains, or about $1.8 trillion.

Rising home prices effectively increase the wealth of white households relative to households of color.

High income households own most housing 

While homeownership is touted as a means of wealth accumulation, it has mostly worked out for high income households. While the ownership of real estate is not as skewed to high income households as is the ownership of financial assets like stocks or bonds, it is still the case that the highest income 20 percent of the population owns 59 percent of all the residential housing value in the US.

These data suggest that about $1.2 trillion of the gain in home values went to the top 20 percent of the population, meaning that their residential capital gains exceeded by a factor of about two the total pre-tax income of the bottom 20 percent of the population.

Housing appreciation is untaxed, which benefits older, white and wealthier households

The skewed ownership of housing wealth means that the gains in wealth are highly concentrated in households that are older, whiter, and higher income than other Americans.  But unlike wage income, income from housing appreciation is mostly untaxed. As a result, the capital gains exclusion for housing is regressive and inequitable.  The capital gains exclusion for owner-occupied real estate is much more valuable to high income households because they are more likely to own homes, own more expensive homes, and generally face higher tax rates than low income households.

In reality, the $2.2 trillion in capital gains that US residential property owners reaped in 2020 will be lightly taxed, to the extent they are taxed at all.  Federal law exempts from capital gains taxation the first $500,000 in gains on the sale of owner-occupied property (for married couples filing jointly).  That is to say, you would need more than $500,000 of appreciation to have any capital gains liability.  As a practical matter, few households pay capital gains taxes on residential real estate appreciation.  The tax-favored status of income from residential real-estate speculation is a quintessential feature of our system that attempts to promote wealth-building through home ownership.  While well intended, it systematically rewards older, whiter and wealthier households, and effectively denies opportunities to build wealth to the third of the population that rents.  In many ways, it is the worst of all worlds, making housing more expensive for those least able to afford it, and providing most of the gains to those who are already most advantaged.

There’s one final irony here:  policies to broaden access to homeownership now, by providing subsidies or other support for lower income, younger, and minority homebuyers, don’t rectify these gaps; they likely make them worse.  Steps to amplify demand in a surging market tend to drive prices up further, which further enriches incumbent homeowners at the expense of first-time buyers.  If you could somehow enable people to buy houses at 1990 or 2010 prices, they could be assured of wealth gains; buying now offers no such expectation of long term gains. Promoting homeownership primarily helps those who are selling homes, not those who are buying them.

The search for villains

Rather than talk about the capital gains that flow to older, wealthier, whiter households, much of the housing debate is a melodrama, looking to cast suitably evil villains on which to blame the crisis.  It’s fashionable to finger Wall Street investors (who for the past decade or so have been buying up single family homes and renting them in many US markets), foreign buyers of luxury condominiums in New York, Miami, Seattle and other hot cities (who let the units sit vacant while speculating on higher values), and greedy developers, who make excessive profits by building new homes.  None of these supposed villains accounts for more than a trivial part of the problem; at most, they’re picking up crumbs, compared to the trillion-dollar gains logged by incumbent homeowners.

A recent article in the New York Times suggests Wall Street-backed investors now own as much as $60 billion in single family real estate.  That sounds ominous, but it’s less than 1 percent of the $35 trillion or so of residential investment in the US.  If all these investors earned a 10 percent capital gain in 2020, they would have collectively gotten about $6 billion, or a couple of tenths of one percent of the $2.2 trillion gain in home values. It’s also fashionable to blame the construction of luxury condos in a few superstar cities—held vacant by rich, often foreign speculators.  The trouble is that such units are a tiny slice of the housing market, and there’s no evidence they affect overall housing costs.

And then there are the developers.  Supposedly they make a killing from building new housing. When housing prices are appreciating, especially as fast as they have in the past year, the profits that developers earn from building new housing are dwarfed by the capital gains reaped by existing homeowners.  Our friend Josh Lehner, an economist with the Oregon Office of Economic Analysis, has an insightful study estimating the profits earned by homeowners and developers in Oregon over the past decade.  Lehner estimates that, on average, developers reap a margin of about 14 percent on new housing construction.  By comparing that total (14 percent of the value of new housing built in any year) with the appreciation of the existing housing stock in that same year, Lehner is able to show how developers’ profits stack up against the capital gains enjoyed by incumbent homeowners.  It isn’t even close:

Applied to Zillow’s estimates of national level new construction, Lehner’s 14 percent of building value estimate suggests that developers netted less than $40 billion nationally, an amount equal to about 2 percent of the gains that accrued to owners of existing homes.  It’s not the greedy developer that’s benefiting from rising home prices, it’s the NIMBYs next door who reap the gains.  As our colleague Daniel Kay Hertz has pointed out, we tend to conveniently forget that essentially all of the existing housing stock came into being, not by immaculate conception, but by the profit-motivated efforts of earlier generations of developers. If anything, because new development increases housing supply, it blunts housing price appreciation, so more development tends to increase affordability.
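The national extrapolation above is a couple of lines of arithmetic; here is a sketch using the figures quoted in this post (Zillow’s roughly $275 billion of new construction in 2020 and the $2.2 trillion gain to existing homes), not Lehner’s own Oregon calculation:

```python
# A rough national version of Lehner's comparison, using the figures
# quoted above: Zillow's ~$275 billion of new construction in 2020 and
# the $2.2 trillion appreciation of existing homes.  A sketch of the
# arithmetic, not Lehner's own Oregon calculation.
DEVELOPER_MARGIN = 0.14  # Lehner's estimated average margin on new construction

new_construction_value = 275e9
existing_home_gain = 2.2e12

developer_profit = DEVELOPER_MARGIN * new_construction_value
share_of_gains = developer_profit / existing_home_gain
print(f"Developer profit: ~${developer_profit / 1e9:.1f} billion")
print(f"Share of incumbent homeowners' gains: {share_of_gains:.2%}")
```

The result, just under $40 billion and about 2 percent of homeowner gains, matches the comparison in the text.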

Wall Street investors, speculating oligarchs, and greedy developers all make signature villains in the housing affordability melodrama, but they really conceal the identity of those who are actually reaping the gains of rising housing prices.

The experience of the past year illustrates the profoundly broken nature of our current strategy of “wealth building through home ownership.”  The benefits of home price appreciation accrue disproportionately to those who already have wealth, and if anything, they tend to worsen the existing disparities of wealth among households.  As existing housing appreciates, it increases the wealth of the incumbent homeowners, who are disproportionately white, older and wealthier, and drives up housing costs for those who don’t now own homes.  And our tax system amplifies these inequities by allowing nearly all of this income to go untaxed.  The myth of homeownership as a universal wealth building strategy is the real villain here.

 

ODOT’s peer review panel admits it didn’t validate Rose Quarter travel forecasts

ODOT has claimed a “peer review panel” vindicated its air pollution analysis

Now the panel says they didn’t look into the accuracy of ODOT’s travel forecast

Travel forecasts are critical, because they determine air and noise pollution impacts

In short:  the peers have done nothing to disprove the critiques of ODOT’s flawed traffic modeling

A key claim opponents have made about the I-5 Rose Quarter freeway-widening project is that its traffic projections are wrong: they overstate baseline traffic by pretending the CRC was built in 2015, exaggerate “no-build” traffic levels by allowing link volumes to exceed capacity, and underestimate “build” volumes by failing to account for induced demand and by modeling a 6-lane roadway rather than the 8- or 10-lane roadway ODOT is actually constructing.

ODOT’s defense is that its environmental modeling was endorsed by a so-called independent peer review panel.  As we pointed out at City Observatory last June, when the panel’s report was released, this was largely a whitewash: the critical problem was that the panel failed to look at the flawed traffic projections on which the air and noise estimates are based.  We wrote:

In theory, the PRP undertook an environmental review, looking at air pollution, greenhouse gases and noise pollution. But because all these impacts depend on the volume of traffic and whether the project increases or decreases traffic, they are all subsidiary to the accuracy of the traffic modeling. And the panel apparently did absolutely nothing to validate the accuracy of these traffic projections.

The air and noise impacts of the project come from vehicles using the freeway; both air pollution and noise pollution increase with the increasing number of cars and trucks on the roadway.  If ODOT got the traffic numbers wrong, then the pollution estimates are wrong as well.

The Peer Review Panel admits it didn’t evaluate the validity of traffic forecasts

Earlier this month, the leader of the peer review panel publicly acknowledged that the group she led did not take any critical look at the travel forecasts.  On April 5, ODOT consultant and panel facilitator Grace Crunican presented the so-called “peer review” panel results to the Historic Albina Advisory Board.   Board member John Washington asked about the traffic projections. Crunican said that the peer review committee didn’t judge whether they were accurate or appropriate, and only looked at whether the model’s outputs were correctly used to compute air and noise impacts.
Here’s the transcript:

John Washington  (Historic Albina Advisory Board)
We had a public announcement about some people suing us or something, right. And how closely related is what they’re talking about to what you’re talking about?

Grace Crunican  (ODOT Peer Review Consultant)
It is related. What they [No More Freeways] are saying is that the traffic data that was provided, that ODOT provided as a basis of the analysis that they did, is not accurate. We were not charged with looking at that traffic analysis original data.  We were looking at the implications of the traffic data that was there. We did ask some questions about it and we got mostly the information, some of the information, Megan gave us today.  She can say it again, but in my lay terms, she used the models from the metro area, she used Metro’s models and she used City of Portland’s model. And they did their analysis and then what we looked at is, given how much traffic was going through, and we look at the air quality analysis that was done. And that’s where our work was, and so the underlying data is what somebody is trying to challenge, the No [More] freeway people, I think are trying to challenge, and we looked at how that data was applied, and found that it was applied appropriately.

Historic Albina Advisory Board, April 5, 2021,  Meeting Video at 1:03:45.  (Emphasis added)

Crunican: “we were not charged with looking at that traffic analysis.” (Youtube)

In effect, the peer review panel simply assumed that ODOT’s traffic projections were right.  It made no independent effort to examine those projections, nor did it consider any of the technical objections that No More Freeways and other commenters offered to the model.  For reference, City Observatory has documented the critique extensively:  It includes inflated baseline traffic due to the counterfactual assumption that the Columbia River Crossing was built in 2015; the over-assignment of traffic to congested road links in the No-Build scenario; and the failure to model the effects of induced demand in the build scenario, as well as the fact that ODOT modeled a six-lane freeway, rather than the 10-lane roadway that the project actually proposes to construct.
This is important because ODOT relied on the peer review panel to discredit the critique of its flawed traffic modeling. When it released the report, in June 2020, it claimed that the peer review  “supported ODOT’s findings for air, greenhouse gas and noise impacts” for the freeway widening.

 In January 2020, the Oregon Transportation Commission directed ODOT to conduct an environmental peer review associated with the project’s Environmental Assessment after hearing stakeholder concern over the potential impacts from the project related to air quality, noise and greenhouse gas emissions.  The Peer Review Report supported ODOT’s findings for air, greenhouse gas, and noise impacts for the project.

Most recently, ODOT used the peer review panel as a kind of talisman to ward off criticism after No More Freeways filed suit against the Federal Highway Administration challenging the project.  Here is ODOT spokesperson Tia Williams in Willamette Week:

“This project underwent a robust environmental assessment that showed future air quality would improve in part due to the congestion relief provided by this project. Those findings were reviewed and confirmed by a panel of national air quality and transportation experts. We are confident in the findings,” a statement from ODOT provided by spokesperson Tia Williams said.

At long last, the peer review panel has shown a modicum of independence:  It has made it clear that claims that its review “supports” ODOT’s work and “confirms” its findings are simply false.  The panel members were instructed to look only at a small, and as it turns out, derivative question, and simply ignored whether the freeway widening increases traffic.  This is hardly a reasonable basis for a claim that this project has “no significant environmental impact.”

The freight fable: Moving trucks is no longer the key to economic prosperity

It is difficult to get a man to understand something when his salary depends upon his not understanding it.  Upton Sinclair

It’s even harder to get a trucking industry lobbyist or a highway department booster to understand something when their salaries depend on not understanding it.

Oregon’s economy has de-coupled from freight movement; our economic success stems from doing things other than simply moving more and more freight.

State officials and the trucking lobbyists they’ve hand-picked as “public” representatives are selling myths in an effort to justify wasting billions to expand highways.

Here’s a simple fact:  Truck movements across the Columbia River in Portland are down 19 percent in the past fifteen years.  This fact comes from data tabulated by the Oregon Department of Transportation, which has automatic vehicle counters on the roadways leading up to the I-5 and I-205 bridges that connect Oregon and Washington.  Here’s the data, which is taken directly from the traffic counting website operated by ODOT.  It shows heavy freight truck movements.

For highway boosters, this simple fact is an inconvenient truth.  Here’s why:  they’re trying to justify a nearly $5 billion freeway widening project on Interstate 5 as somehow essential to accommodating a flood of trucks, which, if delayed even slightly, will somehow mean the demise of one of the nation’s most robust metropolitan economies.  Don’t get us wrong: traffic congestion is a routine feature of successful metropolitan economies, but there’s actually no evidence that adding a freeway lane (or three) has any measurable effect on a metro area’s economic prosperity.  But ODOT and freight industry boosters are keen to argue that freight volumes are increasing in lockstep with the economy, and that if they’re hindered in any way, our economic ruin awaits us.

The trouble is, as this simple chart shows, that’s not true.   Despite declining freight movement, the Oregon economy has boomed.

The Portland and Oregon economies rebounded sharply after the 2007-2009 recession, and they did so without increasing the number of heavy trucks moving across the Columbia River on the I-5 and I-205 bridges. The truckers and highway types are likely to want to blame the recession, but what’s really striking is that through 2019, I-5 and I-205 truck traffic never recovered to pre-recession levels after a decade of robust economic growth.  Not only that, but truck volumes actually declined from 2013 through 2016, as the economy was growing rapidly.  The key takeaway here is that Oregon’s economy grows just fine, thank you, even with no more trucks crossing the Columbia River.

But this inconvenient truth was met with dismay and denial by the Washington Trucking Association’s lobbyist, Sherri Call, nominally a “public representative” on the Community Advisory Group for what the Oregon and Washington transportation departments call the “I-5 Bridge Replacement Project,” but which is really meant to be a rubber stamp for a five-mile long, 12-lane wide freeway that just happens to cross a river.  We submitted the ODOT data shown in the chart above for the record at the Community Advisory Group’s March 24, 2021 meeting.  Call was apoplectic at the idea that anyone could suggest that freight volumes were going down.  Describing a discussion in one of the meeting’s breakout groups, she said:

We talked a little bit about the public comment process and I was glad to hear [Project Manager] Greg [Johnson]’s commentary on that, you know, if I share that it kind of got under my skin a little bit, a caller that called in and mentioned the reduction in freight volumes over the years and caused me to go on and do some offline research and that’s actually not the case that has increased and not only that the general traffic has increased as well. And, you know, Greg [Johnson], very eloquently I think put it there basically the people that are calling in are not held to the same standard as, as you folks in the bridge office who are accountable not just to the public but to people internally and people on both sides of the state so that, that is, you know, good for us to be mindful of.

Notably, Call didn’t cite any actual data to prove her point.  But she did confide that she shared her concerns with the project’s manager Greg Johnson, who claimed, according to Call, that “people calling in aren’t held to the same standard” as the project’s promoters in the transportation department.

For the record, it’s important to note that like Call, Johnson didn’t offer any data showing an increase in freight volumes on I-5 across the Columbia River.  Simply asserting an article of faith—and wrapping it in a little sidelong character assassination—was apparently sufficient.  As Upton Sinclair said, it’s difficult to get a person to understand a fact when their salary depends on them not understanding it.

But again, here’s the simple truth:  the volume of freight trucks moving on I-5 and I-205 across the Columbia River is, and remains, lower today than 13 years ago.  And not by a little, by a lot—almost a fifth.  The ODOT data show that there are half a million fewer trucks using the two bridges today than in 2006.
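The arithmetic behind those figures is straightforward. Here is a quick sketch, using hypothetical round-number counts consistent with the decline described above (the actual counts come from ODOT’s traffic counting site and are not reproduced here):

```python
# Illustrative only: combined annual heavy-truck crossings of the I-5 and
# I-205 Columbia River bridges. These are hypothetical round numbers chosen
# to match the roughly 19 percent / half-million-truck decline described
# in the text, not ODOT's actual tabulations.
trucks_2006 = 2_600_000
trucks_2021 = 2_100_000

decline = trucks_2006 - trucks_2021            # absolute drop in crossings
pct_decline = decline / trucks_2006 * 100      # drop as a share of the 2006 base

print(f"Decline: {decline:,} trucks ({pct_decline:.0f}%)")
# → Decline: 500,000 trucks (19%)
```

The point of the sketch is simply that a half-million-truck drop from a mid-2000s baseline in this range works out to nearly a fifth of the original volume.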

And miraculously, the Oregon economy has not collapsed.  In fact, since 2006, both the Oregon and Portland metro economies have outperformed the US economy, whether measured by employment or gross domestic product.  We’ve managed to grow our economy with less truck movement than we had more than a decade ago.  What that signals is that economic success isn’t simply a product of moving more stuff.  In fact, the most successful economies are the ones which generate new ideas and new services, not simply move more stuff. In the 21st century, success is about doing more with what we have, or even less, and that’s where Oregon has excelled. Our old, resource-based economy could grow only by cutting and shipping more trees or grain; but today, Oregon’s economic growth is driven by a range of knowledge-based industries that expand their output, income, and jobs, without moving ever more trucks.

In the end, though, this argument boils down to simple facts.  If Sherri Call and Greg Johnson are right that more and more trucks must move across the Columbia for our economy to succeed, and that widening I-5 at a cost of billions will somehow stimulate more industrial activity, let them present the data, any data, to prove it.  So far they haven’t.  All we have are snide claims that they’re somehow held to a higher standard of proof, something they’ve manifestly failed to demonstrate.


Driving stakes, selling bonds: ODOT’s freeway boondoggle plan

The Oregon Department of Transportation is launching a series of boondoggle freeways, with no idea of their ultimate cost, and issuing bonds that will obligate the public to pay for expensive and un-needed highways.

Future generations will have to pay off the bonds AND suffer the climate consequences

The classic Robert Moses scam:  Drive stakes, sell bonds

Debt is a powerful drug.  Issuing bonds to pay for roads passes on to future generations the cost of the choices we make today.  Once issued, repaying bonds takes precedence over any other use of state and federal highway funds. Launching a multi-billion dollar highway expansion plan in the Portland metro area jeopardizes the state’s ability to fund every other transportation priority, statewide, for the next two decades.

The Oregon Department of Transportation has embarked on an unprecedented, multi-billion dollar highway expansion spree in Oregon.  And it’s doing so without financing in hand, planning instead to issue bonds that will irrevocably commit the state to these projects, no matter how expensive or un-needed they may be.

It has already started one project, the half-billion dollar I-205 Abernethy Bridge, without permanent funding in place.  The project’s bloated cost and minimal benefits have recently earned it national honors as a highway boondoggle:

Streetsblog, USA: Oregon’s I-205 highway boondoggle (2022)

And this is just one of several billion dollar plus projects for which ODOT is pursuing a build now, pay later approach. ODOT is moving forward with plans for the Interstate Bridge project, which it now admits could cost as much as $7.5 billion. It’s also trying to launch the $1.45 billion I-5 Rose Quarter project with only a tiny fraction of the needed funding. That’s nearly $10 billion in road construction. Significantly, the cost of every one of these projects has increased sharply in the past couple of years.

How can it do that?

First, ODOT has figured that once it starts these projects, no one will stop them.  And it’s financing these projects by issuing debt.  Its first step will be issuing up to $600 million in short-term bonds, secured by future state highway fund revenues.  This short-term borrowing is the government equivalent of a payday loan.  They’re promising bond buyers first call on moneys in the state highway fund, hoping that, by the time they have to pay back the bonds, they’ll have a system of tolls in place that will provide the needed revenue.  Then they’ll issue permanent bonds, backed by the promise of future toll revenue, and use those bonds to pay off the short-term “payday” loan.  But that presumes a number of things:  critically, that tolls will produce enough revenue to pay back those bonds.

That’s exactly what’s happening with the I-205 Abernethy Bridge and the I-5 Rose Quarter freeway widening.

First, the Abernethy Bridge. ODOT moved ahead with this project even though it didn’t have an approved funding plan in place.  It also went ahead even as construction bids came in twice as high as the program’s cost estimate ($500 million, up from $250 million in 2018).  It is paying for the Abernethy Bridge in the short term by taking money that the Legislature initially earmarked for the I-5 Rose Quarter project.  ODOT also plans to issue $600 million in short-term bonds.  And ultimately, it hopes to repay these sources of borrowing with money it gets from selling more bonds to be paid back from future tolls.  But tolling hasn’t been approved through the federal environmental review process (and may not be); regardless, the state will have to pay off these bonds.

Second, there’s the I-5 Rose Quarter project.  In 2017, the Oregon Legislature approved the project, based on ODOT’s estimate that it would cost $450 million.  The Legislature earmarked that amount in gas taxes for the Rose Quarter, but in 2021 allowed ODOT to also use this same money for the Abernethy Bridge.  In the meantime, however, the cost of the Rose Quarter project doubled and then tripled:  it now stands at as much as $1.45 billion.  But, as noted, ODOT has diverted most of the original $450 million provided by the Legislature to the Abernethy Bridge project, so the Rose Quarter project is now perhaps a billion dollars or more short of the money needed to pay for its construction.

Even so, it’s apparent that ODOT plans to move the project forward, even though it doesn’t have identified funds to pay for all of it.  It’s planning an “Early Work” package of a few selected construction projects.  It is the classic “driving stakes” strategy to get the project started, and then come back to the Legislature to ask for money to finish the job—no matter how much it ends up costing.

Ultimately, ODOT is counting on toll-backed bonds to pay for both projects.  Oregon Transportation Commission Chairman Bob Van Brocklin testified in March 2022 that all these projects hinge on toll financing.  But as yet, ODOT hasn’t undertaken the detailed financial analyses that will be required to sell toll-backed bonds.  Both private markets and the federal government require bond issuers to commission independent “investment grade analyses” that develop realistic estimates of actual toll revenue.  Without an investment grade analysis, it’s effectively impossible to know how much money in bonds the state would be able to sell to finance either of these projects.

Oregon DOT has no experience actually collecting tolls, and consequently, no real experience projecting how much toll revenue these facilities might provide.  But the financial consequences of this approach are very clear.  If, and more likely when, toll revenues fall short of what’s needed to pay for these projects, the state will be legally obligated to dip into other transportation funds (state highway funds, and under the terms of HB 3055, passed by the Legislature two years ago, federal transportation grants) to pay off bondholders before the state spends money on anything else.
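The mechanics of that obligation can be sketched in a few lines. This is an illustrative model of a debt-service-first waterfall, with all dollar figures invented for the example; it is not based on any actual ODOT bond covenant:

```python
def allocate_pledged_funds(toll_revenue, other_pledged_funds, debt_service):
    """Sketch of a debt-service-first waterfall: bondholders are paid
    before anything else, so a toll shortfall is backfilled from other
    pledged transportation funds. All figures are hypothetical."""
    shortfall = max(0, debt_service - toll_revenue)   # gap tolls don't cover
    backfill = min(shortfall, other_pledged_funds)    # drawn from other funds
    remaining = other_pledged_funds - backfill        # left for all other priorities
    return backfill, remaining

# If tolls cover debt service, other funds are untouched...
print(allocate_pledged_funds(toll_revenue=120, other_pledged_funds=500, debt_service=100))
# → (0, 500)

# ...but if toll revenue falls short, other priorities absorb the gap.
print(allocate_pledged_funds(toll_revenue=60, other_pledged_funds=500, debt_service=100))
# → (40, 460)
```

The second case is the scenario the text describes: every dollar tolls fail to produce is a dollar unavailable for transit, maintenance, walking, and cycling projects.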

If this seems like a risky and foolhardy strategy, it is because that’s exactly what it is.  Unless, of course, you are a state highway agency that only cares about building more and more roads.  For the manic road-builder, this is an ideal strategy:  it allows you to build as much as you want today, and whether the tolls are sufficient or not, future legislatures and future taxpayers will be required to make up any shortfalls.  And, as we’ve noted, in the face of the climate crisis, this approach to road finance is deeply perverse:  Once the state builds more roadways, if its efforts to reduce driving (and cut carbon emissions) are successful, the toll revenue shortfall will have to be made up by cutting other transportation spending.  The state will even be obligated to use federal funds (which can be used flexibly for transit, walking, and cycling projects that would reduce greenhouse gas emissions) to pay off the bond-holders who financed the under-used highway capacity.

Wholly Moses

This strategy of driving stakes (getting projects started, even before their full costs are known) and selling bonds (issuing debt that legally obligates the state to finish the projects) is a classic road-building scheme.   Oregon DOT’s plan to get started on several of these projects, and to finance them by short-term borrowing and bonds, backed with a legal pledge of both future toll revenues and other state and federal transportation funds, mimics the classic scam developed by America’s original highway builder/power broker, Robert Moses, in the 1930s.

Moses guided public investment in New York for decades, and the city and state today still bear the deep imprint of his choices, chief among them the decision to remake much of the region to facilitate the movement of automobiles. Part of his legacy is the toll bridges and the network of highways that slashed through urban neighborhoods—in his words—like a meat-ax.  But there’s another more subtle, but equally enduring, element of the Moses legacy:  a pattern of practice followed to this day, in one form or another, by highway departments around the country.

Moses locked up all the revenue from publicly financed bridges and tunnels, and at a time when public transit was starved for investment, plowed it all back into a steady stream of new road capacity that demolished neighborhoods, furthered sprawl and increased car dependence.  The Oregon Department of Transportation seems determined to take a page out of the Moses playbook.

To see how these two patented Moses gimmicks work, we turn the microphone over to his biographer, Robert Caro.

The Power Broker | Robert Caro

Driving Stakes

The key techniques are two-fold:  First, just getting projects started.  Several of these projects (the I-5 Bridge replacement, and the I-5 Boone Bridge), haven’t even completed their planning, so their full costs are unknown.  The second technique is to issue bonds to pay for the project, secured by toll revenues.

Early on, Moses learned the value of starting construction of a highway—driving stakes in the ground—even if he didn’t have all the financing in place, and regardless of whether he knew (or honestly revealed) the actual total cost of the project.  Just getting something started made it almost impossible for legislators or other officials to deny him whatever resources he needed to finish the project.  Caro writes:

Once you did something physically, it was very hard for even a judge to undo it.  If judges, who had to submit themselves to the decision of the electorate only infrequently, were thus hogtied by the physical beginning of a project, how much more so would be public officials who had to stand for re-election year by year? . . . once you physically began a project, there would always be some way found of obtaining the money to complete it. “Once you sink that first stake,” he would often say, “they’ll never make you pull it up.”

And this tactic turned minimizing or hiding the true cost of a project into an indispensable means of getting things moving:

Misleading and underestimating, in fact, might be the only way to get a project started.  . . . Once they had authorized that small initial expenditure and you had spent it, they would not be able to avoid giving you the rest when you asked for it. . . . Once a Legislature gave you money to start  a project, it would be virtually forced to give you the money to finish it.  The stakes you drove should be thin-pointed—wedge-shaped, in fact on the end.  Once you got the end of the wedge for a project into the public treasury, it would be easy to hammer in the rest.  (Caro at 218-219)

Selling Bonds

One of Moses’ key insights was that municipal revenue bonds worked like an alternative, overriding form of government authority, in his case, overriding future legislative control or second thoughts. The contract between bond issuers (like a state agency) and bond buyers can’t be impaired by future legislative changes. A promise to dedicate certain revenues to an agency in a bond indenture can tie them up for years or decades, or as Moses showed, forever.  Bonds, once issued, become a virtually unbreakable contract.  Once ODOT sells bonds backed by the pledges of toll revenue (and other federal and state transportation revenues) the state is permanently, and preemptively committed to giving it all the toll revenue (and in the case of HB 3065, forfeiting other revenue to make up any shortfall).

Caro explains the original structure Moses crafted when he drafted amendments to New York statutes governing bonds issued by his Triborough Bridge Authority.

Legislation can be amended or repealed.  If legislators were in some future year to come to feel that they had been deceived into granting Robert Moses wider powers than they had intended—the right to keep tolls on a bridge even after the bridge was paid for, for example—they would simply revoke those powers. But a contract cannot be amended or repealed by anyone except the parties to it.  Its obligations could not be impaired by anyone—not even the governing legislature of a sovereign state.  Section Nine, Paragraphs 2 and 4, Clauses a through i, gave Robert Moses the right to embody in Triborough’s bonds all the powers he had been given in the legislation creating Triborough. Therefore, from the moment the bonds were sold (thereby putting into effect the contract they represented), the powers he had been given in the legislation could be revoked only by the mutual consent of both Moses and the bondholders.  They could not be revoked by the Authority or by the City whose mere instrumentality he was supposed to be.
(Caro, at 629-630)

The combination of these two strategies—driving stakes and selling bonds—is enough to lock the state into an expensive and environmentally destructive freeway building spree.  And once the bill passes, and the bonds are sold, future legislatures will find themselves as powerless to rein in ODOT as New York was to stop the Moses meat ax from hacking through New York City.


Wholly Moses: Pave now, pay later

Oregon legislation goes whole hog on highways

HB 3065 would launch a whole new round of freeway boondoggles, and plunge the state into debt to pay for them

The classic Robert Moses scam:  Drive stakes, sell bonds

The Oregon Legislature is considering a bill, HB 3065, which while it sounds technical and innocuous, is really designed to launch a whole series of new freeway expansion mega-projects in the Portland area.  By authorizing the Oregon Department of Transportation to get started on several of these projects, and to finance them by short-term borrowing and bonds, backed with a legal pledge of both future toll revenues and other state and federal transportation funds, the bill mimics a classic scam developed by America’s original highway builder/power broker, Robert Moses, in the 1930s.

If there is a villain in American urbanism, it is Robert Moses, who for decades guided public investment in New York; to this day, the city and state still bear the deep imprint of his choices, chief among them the decision to remake much of the region to facilitate the movement of automobiles. Part of his legacy is the toll bridges and the network of highways that slashed through urban neighborhoods—in his words—like a meat-ax.  But there’s another more subtle, but equally enduring, element of the Moses legacy:  a pattern of practice followed to this day, in one form or another, by highway departments around the country.

The latest manifestation of Moses’ malignant legacy is in the form of a bill in the Oregon Legislature.  House Bill 3065 proposes to give the Oregon Department of Transportation power to start a series of expensive Portland area highway projects, and to issue debt to pay for their unspecified and unlimited costs, in the process pledging much of the state’s future transportation revenue, including any monies raised from tolls, to pay for these projects.  Along the way, the bill undoes most of the positive attributes of road pricing authorized by the 2017 Oregon Legislature, which specifically mandated value pricing (aka “congestion pricing”) on Portland area freeways.  As we’ve frequently pointed out at City Observatory, and as borne out by the experience of cities around the world and in the US, even a modest price charged for peak hour freeway use would likely resolve the region’s congestion problems.  (That conclusion was also echoed by ODOT’s own consultants three years ago.)

What we have in HB 3065, however, is something that is a kind of perpetual motion road-building machine. It authorizes ODOT to issue bonds for a series of megaprojects, and to obligate current and future state revenue, including future toll revenues to pay back those bonds.  It’s a kind of “pave now, pay later” strategy that Moses would embrace, and bears two key hallmarks of Moses’ own work:  driving stakes to force funding commitments to ill-defined, open-ended projects, and using bond covenants to block any legislative oversight or repudiation of his future actions.

Moses locked up all the revenue from publicly financed bridges and tunnels, and at a time when public transit was starved for investment, plowed it all back into a steady stream of new road capacity that demolished neighborhoods, furthered sprawl and increased car dependence.  With this bill, the Oregon Department of Transportation seems determined to take a page out of the Moses playbook.

To see how these two patented Moses gimmicks work, we turn the microphone over to his biographer, Robert Caro.

The Power Broker | Robert Caro

Driving Stakes

The key techniques are two-fold:  First, just getting projects started.  Several of these projects (the I-5 Bridge replacement, and the I-5 Boone Bridge), haven’t even completed their planning, so their full costs are unknown.  The second technique is to issue bonds to pay for the project, secured by toll revenues.

Early on, Moses learned the value of starting construction of a highway—driving stakes in the ground—even if he didn’t have all the financing in place, and regardless of whether he knew (or honestly revealed) the actual total cost of the project.  Just getting something started made it almost impossible for legislators or other officials to deny him whatever resources he needed to finish the project.  Caro writes:

Once you did something physically, it was very hard for even a judge to undo it.  If judges, who had to submit themselves to the decision of the electorate only infrequently, were thus hogtied by the physical beginning of a project, how much more so would be public officials who had to stand for re-election year by year? . . . once you physically began a project, there would always be some way found of obtaining the money to complete it. “Once you sink that first stake,” he would often say, “they’ll never make you pull it up.”

And this tactic turned minimizing or hiding the true cost of a project into an indispensable means of getting things moving:

Misleading and underestimating, in fact, might be the only way to get a project started.  . . . Once they had authorized that small initial expenditure and you had spent it, they would not be able to avoid giving you the rest when you asked for it. . . . Once a Legislature gave you money to start  a project, it would be virtually forced to give you the money to finish it.  The stakes you drove should be thin-pointed—wedge-shaped, in fact on the end.  Once you got the end of the wedge for a project into the public treasury, it would be easy to hammer in the rest.  (Caro at 218-219)

Selling Bonds

One of Moses’ key insights was that municipal revenue bonds worked like an alternative, overriding form of government authority, in his case, overriding future legislative control or second thoughts. The contract between bond issuers (like a state agency) and bond buyers can’t be impaired by future legislative changes. A promise to dedicate certain revenues to an agency in a bond indenture can tie them up for years or decades, or as Moses showed, forever.  Bonds, once issued, become a virtually unbreakable contract.  Once ODOT sells bonds backed by the pledges of toll revenue (and other federal and state transportation revenues) the state is permanently, and preemptively committed to giving it all the toll revenue (and in the case of HB 3065, forfeiting other revenue to make up any shortfall).

Caro explains the original structure Moses crafted when he drafted amendments to New York statutes governing bonds issued by his Triborough Bridge Authority.

Legislation can be amended or repealed.  If legislators were in some future year to come to feel that they had been deceived into granting Robert Moses wider powers than they had intended—the right to keep tolls on a bridge even after the bridge was paid for, for example—they would simply revoke those powers. But a contract cannot be amended or repealed by anyone except the parties to it.  Its obligations could not be impaired by anyone—not even the governing legislature of a sovereign state.  Section Nine, Paragraphs 2 and 4, Clauses a through i, gave Robert Moses the right to embody in Triborough’s bonds all the powers he had been given in the legislation creating Triborough. Therefore, from the moment the bonds were sold (thereby putting into effect the contract they represented), the powers he had been given in the legislation could be revoked only by the mutual consent of both Moses and the bondholders.  They could not be revoked by the Authority or by the City whose mere instrumentality he was supposed to be.
(Caro, at 629-630)

The combination of these two strategies—driving stakes and selling bonds—is enough to lock the state into an expensive and environmentally destructive freeway building spree.  And once the bill passes, and the bonds are sold, future legislatures will find themselves as powerless to rein in ODOT as New York was to stop the Moses meat ax from hacking through New York City.


An open letter to the Oregon Transportation Commission

For years, the Oregon Department of Transportation has concealed its plans to build a ten lane freeway through Portland’s Rose Quarter

We’re calling on the state to do a full environmental impact statement that assesses the impact of the project they actually intend to build.

An open letter to the Oregon Transportation Commission.

Regular readers of City Observatory will know that we’ve long been casting a close and critical eye on plans to spend $800 million to widen a mile-and-a-half-long stretch of interstate freeway in Portland, Oregon.  As we’ve explained, we think this particular freeway fight encapsulates many of the fundamental urban issues of our time:  how we grapple with climate change, how we re-imagine our cities as more just, inclusive and accessible communities, and how we right the damage done by the urban freeway building of the past.

In its advocacy for this project, the Oregon Department of Transportation has sought to convey the idea that it isn’t really widening the freeway at all.  At worst, its PR campaign claims, they’re adding two “auxiliary” lanes.

But for years, the agency has carefully hidden the true scale of the project.  It’s never publicly released detailed plans showing the roadway’s actual width, despite repeated challenges and questions from the public.

Now new documents show the agency has long been planning a 160-foot wide roadway, more than enough for an eight or ten-lane freeway with full urban shoulders.  It’s apparent now, in retrospect, that agency staff have long known this to be the case, and have willfully concealed this information from the public through a combination of misleading illustrations and outright lies in response to direct questions about the size of the proposed freeway.

This matters because portraying this project as the addition of just two lanes dramatically understates its impact in adding traffic, increasing noise, air pollution and greenhouse gases, and impairing the health and livability of nearby neighborhoods. These are exactly the kinds of impacts that the National Environmental Policy Act (NEPA) requires be revealed before undertaking a major federal project, and rather than honestly disclosing them, this agency has intentionally and aggressively hidden them.

In an open letter to the Oregon Transportation Commission, City Observatory’s Joe Cortright calls for the agency to honestly disclose its plans, and to undertake a full and fair environmental impact statement that shows the traffic, environmental and social effects of the actual 10-lane freeway it is proposing to build.

Cortright_to_OTC_RoseQuarterWidth_17March

Is the pandemic driving rents down? Or up?

Since Covid started, rents are down in some cities, but up in most

“Superstar” cities have experienced the most notable declines; the demographics of renters in these cities are different than elsewhere.

Rent declines are also much more common in larger cities, with higher levels of rents.

City Observatory is pleased to publish this guest post from Alan Mallach.  Alan is a senior fellow with the Center for Community Progress, known for his work on legacy cities, neighborhood change and affordable housing. His most recent book is The Divided City: Poverty and Prosperity in Urban America, and he is currently co-authoring a book on neighborhood change.

Alan Mallach

A lot of attention has been given to the decline in rents in a handful of high-profile superstar cities like San Francisco or Washington DC. But, as I’ve had occasion to observe in the past, those cities are only a handful among the hundreds of cities and housing markets across the United States. The real question is whether the trends observed in the superstar cities reflect broader national trends, or whether – once again – they are the outliers in a larger, more complicated picture. 

To get a sense of the trends, I looked at rental data gathered by the website Apartment List (www.apartmentlist.com) by city, pulling out those of the 100 largest cities for which data was provided, and supplementing it with data from smaller cities as well as metro-level data. I looked at the trend for all rental units between January 2020 and January 2021, and, for comparison purposes, the preceding year. While far from a complete picture of the rental markets in the United States or even in these cities, it’s a useful starting point for an overall perspective on what’s going on. This short piece will try to highlight some initial findings and offer some suggestions about the mechanisms behind the trends. 
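The underlying computation is just a year-over-year percent change in median rent for each city. A minimal sketch, using invented rent figures rather than the actual Apartment List data:

```python
# Hypothetical January median rents by city; the real analysis uses
# Apartment List city-level data (cities and dollar figures here are
# illustrative only).
rents = {
    "San Francisco": (3100, 2450),   # (Jan 2020, Jan 2021)
    "Boise":         (1100, 1210),
    "Chicago":       (1750, 1610),
}

# Year-over-year percent change in median rent, per city.
changes = {
    city: round((jan21 / jan20 - 1) * 100, 1)
    for city, (jan20, jan21) in rents.items()
}

# List cities from largest decline to largest increase.
for city, pct in sorted(changes.items(), key=lambda kv: kv[1]):
    print(f"{city:<14} {pct:+.1f}%")
```

Repeating the same calculation for the preceding January-to-January period gives the comparison baseline the analysis describes.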

The short answer is yes, they are outliers. More cities are still seeing increases in median rents than decreases in the face of the pandemic, by a ratio of roughly 3 to 2. 

Figure 1: The 100 largest cities by rent change from January 2020 to January 2021

Still, the fact that over a third of all cities saw declining rents, and in many cases significant declines, is notable. The year before, rents declined in only 6 out of 95 cities, and in no case by as much as 2 percent. 

Two clear patterns jumped out: 

  • The bigger the city, the more likely rents were to decline. Rents, on average, declined by 6 percent from January to January in the nation’s 10 largest cities. The only one where rents went up was Phoenix, while rents stayed flat in San Diego.
  • The higher rents were before the pandemic, the more likely rents were to decline. Rents on average went up 2 percent in the 10 lowest rent cities but went down by a whopping 13 percent in the 10 highest rent cities. 

Looking at the ‘top 10’ and ‘bottom 10’ in terms of rent increases or declines, and how their January 2020 rent ranked out of 100 cities, shows an interesting pattern. 

Of the cities with the greatest declines, most are recognizable as superstar cities, with Jersey City and Arlington being appendages – from a housing market standpoint – of New York and Washington respectively. Chicago and Minneapolis are less so, but both have seen extensive construction of upscale rental housing over the past 10 or so years. All but the last two are in the top rent quintile. By contrast, the cities with the greatest rent increases are medium-sized and smaller Sunbelt cities well outside superstar cities’ orbits. While these cities tend to skew toward moderately low rents, they are far from the lowest rent cities.  

The following chart (Figure 2B) shows the relationship between the change in rent over the past 12 months and the average level of rents in January 2021.  (The size of circles corresponds to the relative population of each city; hover over a circle to see the identity of each city and its rent level and rent change).

Figure 2B:  Change in rents and rent levels, 100 largest US cities

What can explain this pattern? There may be a few factors at work. A major one is the difference in the character of the renter population. The cities with the greatest declines are cities where large numbers of renters are young and affluent, a market to which those cities’ rental developers and landlords have been increasingly catering in recent years. Many of these renters appear to be moving, in part out of these cities, but also in part to homeownership in the same cities. Notably, the 10 cities with the greatest rent declines saw a simultaneous average 6 percent increase from December 2019 to December 2020 in Zillow’s Home Value Index. The declines also likely reflect a drop in in-migration of young, affluent renters, as suggested by recent research from the Federal Reserve Bank of Cleveland: the same reasons that have prompted out-migration have made in-migration, at least for the time being, less appealing. 

Strong anecdotal evidence suggests that the declines are largely concentrated in the upscale or Class A rental market, as a recent report from WAMU in Washington DC noted,

“The drop [in rent levels] is driven primarily by price reductions in “Class A” apartments — newly-built units that have more luxurious amenities: Think buildings like The Apollo on H St. NE, or The Hepburn in Kalorama. As of October, the average rent for apartments like these dropped from $2,669 to $2,387 per month.” 

It’s not surprising.  Driven by Millennial migration, the upscale rental sector has been riding a wave for the past decade. Reflecting typical copycat developer behavior, upscale rentals have arguably been overbuilt in all of the cities seeing the sharpest declines in rent levels. Thousands of units have been built on spec, and with previous downtown workers moving out and fewer new workers coming in, vacancies increased and demand plummeted. Supply may eventually adjust to reflect lower demand, but that may take years. 

In most cities, however, renters are mostly lower to middle income, which brings in another factor. 

Upper income, highly educated workers are far more likely to have shifted to working from home during the pandemic than lower-income, less educated workers. As Figure 3 (from the Census Bureau’s Housing Pulse Survey) shows, two-thirds of workers in households with incomes over $100,000 (and nearly three-quarters of those with incomes over $200,000)  have moved to full or part time telework, compared to little more than 15% of those earning under $35,000. The education gap is similar. 

Source: Census Bureau, Housing Pulse Survey

With mortgage interest rates at all-time lows, affluent teleworkers can easily segue to homeownership. For a couple paying $2500 or $3000 in rent, a mortgage on a $600,000 house in a large, expensive city is only a moderate reach, and a $300,000 house in a smaller, attractive but more moderately priced city is a bargain.  
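The rent-versus-mortgage comparison can be made concrete with the standard fixed-rate amortization formula. The price, interest rate, and down payment below are illustrative assumptions, not figures from the text:

```python
# Standard fully amortizing mortgage payment: P * r / (1 - (1+r)^-n).
# All inputs are illustrative assumptions for the rent-vs-own comparison.

def monthly_payment(principal, annual_rate, years=30):
    """Monthly payment on a fixed-rate, fully amortizing loan."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical: $600,000 house, 20% down, 3% rate -> $480,000 loan.
payment = monthly_payment(600_000 * 0.8, 0.03)   # roughly $2,000/month
```

Under these assumptions the monthly payment comes in near $2,000, below the $2,500–$3,000 rents cited above, which is why the segue to homeownership is only a moderate reach for an affluent teleworking couple.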

Low-wage workers tend to be concentrated in service, health care, distribution, and other sectors where working from home is not an option; they often lack the means to become homeowners and are less likely to move. Thus, cities like Cleveland or Des Moines, where renters are predominantly lower-income households, are seeing little change in rental demand. What they are likely to see, although its impact will not be visible until well into 2021 or later, is growth in rental arrears, as thousands of lower-income tenants who have lost their jobs find it impossible to make rent payments. Unless forestalled by adequately funded rental assistance, that could end up creating far more dire problems for far more people than rent declines in upscale San Francisco apartment buildings. The threat of evictions, however, is unlikely to lead to declines in rent levels, at least in the short run, as landlords try to compensate for lost rental income. 

Few tears need be shed for the owners and developers of upscale apartment buildings in superstar cities. A correction was timely if not overdue. A more important question is whether the high-wage employment that drove the wave will grow back, or whether telework will become increasingly the norm. If the latter, not only are rents unlikely to revive, but the knock-on effects to the retail and service sectors supported by high-wage employment could be disastrous, with bars, restaurants, dry cleaners and other firms going out of business and thousands of lower-wage workers left unemployed. 

If the markets in these cities can be considered losers, those of the secondary cities of the Sunbelt shown in Figure 2 may be considered, at least so far, the winners. Not only have they seen sharp rent increases, but they saw even greater sales price inflation, with house values going up an average of 13% in the past year, well above the national average. Boise, Idaho topped the charts with a whopping 21% annual increase, while, according to Albuquerque Business First, “the [Albuquerque] market shows no signs of slowing down, with homes going for record high prices and newly-listed property going under contract in less than 30 days.”  Whether what’s good for the housing market in these cities is good for the people who live there, of course, is another matter. 

Taking Tubman: ODOT’s plan to build a freeway on school grounds

ODOT’s proposed I-5 Rose Quarter project would turn a school yard into a freeway

The widened I-5 freeway will make already unhealthy air even worse

Pollution from high volume roads has been shown to lower student achievement

ODOT also proposes to build sound walls in Tubman’s school yard

Portland’s Harriet Tubman Middle School is one of the city’s most diverse.  Located in the heart of what historically was the center of the state’s African-American population, more than 60 percent of those enrolled are students of color. The Harriet Tubman School has long been the focal point in the struggle against racist policies.  Community activism helped save the school in the 1980s. There’s another historical legacy nearby as well:  the Interstate 5 Freeway, which, as we’ve chronicled, was rammed through the neighborhood in the early 1960s, and which triggered a two-thirds decline in Albina’s population in just two decades.

We’re looking at Tubman Middle School today because the Oregon Department of Transportation is proposing to double-down on the environmental insult done to the area by widening the I-5 freeway, moving it even closer to the school.  As Oregon Public Broadcasting has reported, ODOT wants to move the freeway onto school property, and also build two 1,000-foot long, 22-foot tall noise walls between the school and the expanded freeway.   On Friday April 9, 150 community members rallied at the school to oppose the freeway widening project.  Two of the speakers were 9th grade students Adah Crandall and Malina Yuen, who had attended Tubman.

Harriet Tubman Middle School rally against the Rose Quarter freeway widening project, April 9, 2021.

Moving the freeway closer to the school

When it was built in the early 1960s, the freeway sliced through a portion of the then Eliot School’s grounds, removing a portion of both the school yard and the adjacent Lillis-Albina Park.  Plans for a wider I-5 Rose Quarter Freeway showed it would move the roadway closer to the school.  The group No More Freeways commissioned the following video showing how the freeway would cut away the hillside between the current roadway and the school.  No More Freeways filed suit in federal court to challenge the project’s environmental impact statement on April 2.

ODOT’s $800 million plan to cut away the hillside and move the I-5 freeway closer to Tubman Middle School. (Click for video).

Taking School Property

It hasn’t always been clear that the new freeway would intrude onto school property.  Newly obtained documents, released as part of a public records request, show that ODOT intends to take property now owned by the school district for its expanded freeway.  The following diagram shows the ODOT plans overlaid on a City of Portland map of property lines.  The turquoise lines on the chart correspond to property lines. The red cross-hatched area on the diagram is school property that would be taken for the freeway.

It’s a bit difficult to visualize the proposed taking of school property, so we’ve created a separate map which shows the take as a solid red area.  It’s clear that the new, wider freeway will be built by expanding on Tubman School property.  In a very real sense, we’re taking this land away from neighborhood school kids, and turning its use over to people driving through the area on the Interstate highway.

Estimated area of Portland Public School Property to be taken for freeway expansion (red).

A wider freeway will worsen already unhealthy air at Tubman Middle School

The major environmental issue posed by the proximity of the I-5 freeway to Harriet Tubman Middle School is the air pollution from automobile traffic on the roadway.  In its discussion of the issue, ODOT has been deceptive and inaccurate, both in estimating levels of pollution, and describing the standards of review required by the National Environmental Policy Act (NEPA).

ODOT has dramatically understated the air pollution associated with the freeway widening for several reasons.  First, as we’ve pointed out, it has modeled traffic only for a six-lane freeway, when in fact the 160 foot roadway it is building could accommodate 10 lanes, and vastly more traffic.  Second, its traffic modeling made factually false assumptions about both baseline and no-build levels of traffic on I-5 (inflating them by assuming that the Columbia River Crossing had been built in 2015 and allowing the model to have volumes in excess of capacity in the no-build scenario).  Third, it understated the amount of traffic growth (and pollution) in the build scenario, because its modeling made no allowance for induced demand from added capacity.

In addition, ODOT (and its peer review panel) engaged in a bit of regulatory sleight-of-hand in characterizing the project’s air quality impacts.  First, ODOT has repeatedly claimed that air quality in the area will be better in the future under the build scenario than it is today.  That may be true, but if it is, it is only because of factors entirely external to the project, specifically a changing vehicle mix (i.e., assumptions that fossil-fueled vehicles become cleaner due to fuel efficiency requirements, and that a larger fraction of the fleet is electrified).  Whether this is true or not, it is actually irrelevant for purposes of the analysis required under NEPA.  NEPA requires a comparison of environmental effects with and without the project (i.e., “build” vs. “no-build”).  Since we will get vehicle efficiency and electrification regardless of whether this project is built, the fact that pollution levels are less than today (or not) is irrelevant.  The real, and unanswered (or incorrectly answered), question is whether air pollution will be higher under the “build” scenario than under the “no-build” scenario.  As we enumerated above, because ODOT systematically over-estimated “no-build” traffic, and systematically under-estimated “build” traffic, it got that analysis wrong.  Application of a scientifically calibrated induced travel calculator to the project shows that widening I-5 will generate millions more miles of car travel and thousands of tons more greenhouse gases and pollution each year.
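Induced travel calculators of the kind referenced here typically apply an elasticity of vehicle-miles traveled with respect to lane-miles. A minimal sketch of that arithmetic, where the elasticity value, lane-mileage, and traffic counts are purely illustrative assumptions, not figures from the project record:

```python
# Elasticity-based induced travel estimate, in the spirit of the
# calculators referenced above. All inputs are illustrative assumptions.

def induced_vmt(base_vmt, base_lane_miles, added_lane_miles, elasticity=1.0):
    """Annual induced vehicle-miles traveled from added lane-miles.

    Uses the standard elasticity shortcut: a 1% increase in lane-miles
    yields roughly (elasticity * 1)% more VMT on the expanded facility.
    """
    pct_increase = added_lane_miles / base_lane_miles
    return base_vmt * pct_increase * elasticity

# Hypothetical example: add 2 lane-miles to an 8-lane-mile segment
# carrying 120,000 vehicles/day over roughly 1 mile.
base_annual_vmt = 120_000 * 1.0 * 365   # vehicles/day * miles * days/year
extra = induced_vmt(base_annual_vmt, base_lane_miles=8, added_lane_miles=2)
```

Even with modest assumptions, a 25 percent increase in lane-miles on a heavily trafficked segment translates into millions of additional vehicle-miles per year, which is why the choice of build-scenario lane count matters so much to the analysis.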

The second bit of sleight of hand is the claim by ODOT and its peer review panel that the project is either in compliance with the Clean Air Act or that no analysis of this project is required by the Clean Air Act (CAA).  That’s interesting information, but for the purposes of complying with NEPA, it is irrelevant.  NEPA requires a disclosure of a project’s environmental impact, not simply an assurance that the project hasn’t violated some other federal law.  The peer review panel maintains that because Portland is not a “non-attainment” area under the CAA, ODOT is not required to do a “hot spot” analysis of the project.  The panel and ODOT also maintain that because no regulatory standards exist for greenhouse gas emissions, the project need not consider them.  Again, this may be true under the Clean Air Act, but by omitting this information, ODOT is failing to meet its obligations under NEPA to reveal the environmental effects of the measure.  In addition, NEPA requires that a project demonstrate conformity with applicable state and local laws; Oregon statutes call for an 80 percent reduction in greenhouse gas emissions statewide by 2050, and nothing in the project’s environmental assessment indicates how this project would comply.  Again, estimates using the induced travel calculator indicate the project will generate 8,000 to 15,000 additional tons of greenhouse gases annually.

Third, and perhaps most important, none of this analysis indicates whether either current or future levels of air pollution at Tubman School will be healthy or tolerable for Tubman students, staff and the public.  Even at current traffic levels, Tubman School’s air quality is problematic.  An independent assessment advised the school district to restrict outdoor activities by students.  The district spent more than $12 million of its own funds to add air filtration to make air inside the building safe for students to breathe.  Portland School Board member Julia Brim-Edwards raised exactly this issue at the March 22, 2021 meeting of the Executive Steering Committee.  In reply, she was told by members of the peer review panel that they didn’t look at that issue and that they were only tasked with judging compliance.

It is entirely possible for a freeway-widening project not to violate the Clean Air Act but also to worsen air pollution in a way that is deleterious to the health and well-being of people near the freeway.  We have a growing body of scientific evidence that higher levels of traffic near schools tend to impair student performance.  A recent study published by the National Bureau of Economic Research found that after controlling for a variety of other factors, proximity to high volume roadways (like I-5) had a statistically significant negative effect on student performance on standardized assessment tests.  According to ODOT, this stretch of road carries about 120,000 vehicles per day, putting it in the highest category on this chart, and the expansion project would allow even more cars in this area.

Jennifer Heissel, Claudia Persico, and David Simon, “Does Pollution Drive Achievement? The Effect of Traffic Pollution on Academic Performance,” NBER Working Paper No. 25489.

A closer freeway means more noise pollution and towering sound walls

But ODOT isn’t just going to take school property for the freeway.  It is also proposing to build two 1,000-foot long, 22-foot tall noise walls between the newly widened roadway and the school.   In the following ODOT diagram—taken from the project’s 2019 environmental assessment—the two walls are shown as “2a” and “2b” and are shaded purple.

 

The exact location of these noise walls doesn’t appear to be final.  The plans are a bit vague as to whether they’ll be built on ODOT property or on the school grounds. In theory, the walls are supposed to attenuate the noise from the closer, wider freeway at the school. In fact, ODOT’s own peer review panel raised concerns about whether the walls would be effective, and recommended that the walls be moved closer to Tubman School.  This would almost certainly mean that the walls would be built on school property—so in addition to having some of its land taken for the roadway, the panel’s recommendation is that a further portion of school property be taken for the noise wall.

From the March 22, 2021, ESC meeting, slide summarizing the Peer Review:

The two 22-foot walls will be roughly as tall as the Tubman building itself (and nearly twice as tall as the Berlin Wall).  While they might reduce sound somewhat, they’ll also turn the school’s grounds into a cramped area dominated by a 1,000-foot long concrete wall, arguably more like a prison yard than a school yard.

Making a mockery of equity and restorative justice

As we’ve made clear at City Observatory, decades of highway building by the Oregon Department of Transportation decimated the Albina neighborhood, leading directly to a decline of two-thirds in the neighborhood’s population (and to the Eliot School, the original occupant of today’s Tubman building, being merged with the nearby Boise School because of falling enrollment).  The Oregon Department of Transportation’s I-5 Rose Quarter freeway widening project benefits car travelers, particularly suburban commuters, while imposing significant financial, health and learning costs on Tubman students.  The average peak-hour single-occupancy car commuter from Clark County, Washington has a median household income of $82,000, while half of Tubman students are on free and reduced-price lunches.  In addition, three-quarters of those commuters are non-Hispanic whites, while two-thirds of the Tubman student body is children of color.  The $800 million project will subsidize these car commuters, while Portland Public Schools has had to spend $12 million (money that could otherwise be spent improving education) on making the air in the school building fit to breathe.  These stark disparities in who bears the costs and who gets the benefits of this freeway widening make a mockery of its claim to be promoting “restorative justice.”

 

 

Revealed: ODOT’s Secret Plans for a 10-Lane Rose Quarter Freeway

For years, ODOT has been planning to build a 10 lane freeway at the Rose Quarter, not the 6 lanes it has advertised.

Three previously undisclosed files show ODOT is planning for a 160 foot wide roadway at Broadway-Weidler, more than enough for a 10 lane freeway with full urban shoulders.

ODOT has failed to analyze the traffic, environmental and health impacts from an expansion to ten lanes; not disclosing these reasonably foreseeable impacts is a violation of the National Environmental Policy Act (NEPA).

For years, the Oregon Department of Transportation has represented its plans to widen I-5 through the Rose Quarter in Portland as a minor addition of a pair of “auxiliary” lanes to the existing four lanes that carry I-5 through this area.  The agency has repeatedly declined to answer direct questions about the actual physical width of the roadway it is engineering; instead, it relies on an incomplete and misleading illustration published in its Environmental Assessment.

Now, No More Freeways, a Portland citizen advocacy group, has obtained documents showing that ODOT is actually planning a 160-foot roadway, one more than adequate to accommodate a full ten-lane freeway. A story from Willamette Week, “Questions about the footprint of the I-5 Rose Quarter project intensify,” reveals that the Oregon Department of Transportation has long concealed data on the actual width of the freeway it is planning to build through Portland’s Rose Quarter.

ODOT has previously and repeatedly refused to answer basic questions about the width of the freeway.  These documents, which include detailed plans developed by ODOT and its contractors, show the agency has long known exactly how wide a freeway it is planning, and has designed overpass structures to provide a full 160 feet of buildable space for the Interstate 5 roadway at Broadway and Weidler. There’s no doubt, however, that the agency will continue to claim that it’s building a six-lane freeway, but no one should believe them: there’s no reason to engineer a massive 160-foot wide roadway for only six lanes, as their own engineering documents, disclosed below, show.  Moreover, this kind of deception is an established pattern for ODOT; in 2010 they assured Portland Mayor Sam Adams that they were shrinking the proposed Columbia River Crossing from 12 lanes to 10; instead, they kept the proposed bridge as wide as before and simply deleted all the references to its actual physical width from final environmental documents.

Misleading and incomplete information in the Environmental Assessment

The only information provided about the width of the Rose Quarter freeway was included in a single illustration contained in the project’s 2019 Environmental Assessment.  This illustration, which omits the overall width of the project, shows only the dimensions of travel lanes and shoulders.  Together these add up to 126 feet.  That width, as we’ve pointed out at City Observatory, would easily hold an eight-lane freeway.

ODOT’s Misleading EA Illustration:  Six lanes in 126 feet

The following illustrations are taken from the Rose Quarter Environmental Assessment’s Right of Way Report.  We have added the black annotation with arrows indicating the width of the roadway; ODOT’s original illustrations do not provide this information, but we have calculated it from aerial imagery (existing) and by summing the lane and shoulder widths indicated on ODOT illustration (proposed).

What really fits in ODOT’s 160 foot roadway: 10 lanes of freeway 

We’ve drawn a new version of ODOT’s illustration that shows the actual size of the roadway they are proposing to build (on approximately the same scale as ODOT’s illustrations above).  We’ve also shown how many lanes of traffic this roadway will accommodate.  Our diagram also shows the dimensions of travel lanes, inside and outside shoulders, and an allowance for the freeway median and its structures.  This illustration makes it clear that ODOT is building a roadway that will easily accommodate a full ten-lane freeway.
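As a rough arithmetic check on how ten lanes fit within 160 feet, here is one plausible allocation of the cross section. The specific shoulder and median widths are our own illustrative assumptions, not dimensions taken from ODOT’s plans:

```python
# One plausible allocation of a 160-foot roadway (illustrative assumptions,
# not ODOT's dimensions): ten 12-foot travel lanes, full 10-foot outside
# and 6-foot inside shoulders on each side, and an 8-foot median structure.
lanes = 10 * 12             # 120 ft of travel lanes
outside_shoulders = 2 * 10  # 20 ft
inside_shoulders = 2 * 6    # 12 ft
median = 8                  # 8 ft for the median and its structures
total = lanes + outside_shoulders + inside_shoulders + median
```

The point is not that ODOT would strike exactly this configuration, but that ten standard 12-foot lanes plus full urban shoulders fit comfortably within a 160-foot envelope.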

ODOT’s actual freeway cross section at Broadway-Weidler

The undisclosed evidence of the 160 foot roadway.

Throughout the environmental review process and afterwards, the Oregon Department of Transportation has repeatedly declined to disclose the actual width, in feet, of the roadway it is planning to build through the Rose Quarter.

We’ve independently verified ODOT’s decision to engineer a 160-foot-wide roadway based on three different documents.

File 1:  HDR Cover Analysis Memorandum, 2016

In April 2016, ODOT consultants HDR prepared this memorandum to provide design parameters for the development of a proposal to construct covers over the freeway.  Figure 2 of the memo shows the cross section of the freeway as proposed to be built under the Broadway and Weidler overpasses.  The measurements on the diagram show two 80-foot spans on either side of a median support structure, for a total width of 160 feet.  The detailed roadway section shows six 12-foot travel lanes, four 12-foot shoulders, and two unlabeled, vacant 17-foot sections on either side of the outside shoulders.  The measurements on this diagram help explain the 126-foot section shown in ODOT’s illustrations above:  the illustration excludes the two 17-foot sections; adding back these sections brings the right of way to its full 160-foot width (126+34=160).  This detailed plan shows that the actual travel lanes (36 feet; three 12-foot lanes on each side of the freeway) utilize less than half of the 80 feet of roadway under the overpass, with more space devoted to shoulders (24 feet in two shoulders) and an unlabeled 17-foot section (41 feet total).
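The memo’s dimensions are easy to check with simple addition:

```python
# Cross-section arithmetic from the HDR memo and the EA illustration
# described above.
ea_illustrated_width = 126   # travel lanes + shoulders shown in the EA
unlabeled_sections = 2 * 17  # the two 17-foot sections the EA omits
full_width = ea_illustrated_width + unlabeled_sections  # right-of-way width

spans = 2 * 80               # two 80-foot spans flanking the median support
```

Both routes to the total agree: the 126-foot illustrated section plus the two omitted 17-foot sections equals the same 160 feet as the two 80-foot structural spans.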

Memo, April 7, 2016. From Andy Johnson (HDR) and Ron Hughes (AECOM), to Mike Mason and John Makler (ODOT), Subject:  Broadway/Weidler Lid Structure Design Concept Feasibility Analysis.

File 2:  CAD-Design Files

No More Freeways obtained a set of ODOT Computer Aided Design (CAD) files showing the plan for the proposed freeway.  We opened this file in a CAD program and used the file’s internal scale tool to measure the total distance of the roadway section as it crossed under the Broadway and Weidler overpasses.  The roadway section is shown in green; the total width of the roadway is 160 feet.

File 3:  Landscape Plans

ODOT hired landscape architect Marianne Zarkin (as a subcontractor to Nelson Nygaard and HDR) to develop a landscape plan for a proposed freeway cover.  Her firm’s website contains a plan of the proposed hardscape and landscaping for the freeway cover, and an included cross-section diagram illustrates the width of the freeway.  The files are undated.  While the diagram itself lacks a scale, it does show the size and location of freeway lanes.  Based on the nominal 12-foot width of these lanes, the plan shows that the Broadway-Weidler overpass would span a distance of more than 150 feet.

Editor’s Note (February 25, 2021):  After this commentary was originally published, these images were removed from the publicly accessible location on Zarkin’s website. The link that directed to the image shown above now shows as “not found,” as shown below:

City Observatory retains copies downloaded from the website on 24 February 2021.

Why this matters:  More traffic, more pollution, an invalid environmental assessment

ODOT has attempted to minimize the traffic, environmental, health and noise effects of its freeway widening project by representing it as the addition of only two “auxiliary” lanes to the existing four-lane freeway.  These newly revealed plans show that ODOT is actually planning a ten-lane freeway, which would accommodate vastly more traffic, and as a result would have far different and much greater impacts on the area’s livability, safety, and environment.

Constructing additional lanes will induce additional traffic demand, leading to large increases in vehicle miles traveled, air pollution emissions and greenhouse gases.  A ten-lane freeway will, for example, increase the air pollution exposure of students at Harriet Tubman Middle School, which abuts the widened freeway.  The traffic from the ten lane freeway will flood adjacent city streets, making them more hazardous for cyclists and pedestrians.  A higher level of traffic through the Rose Quarter will also make sites on or near the freeway (like the proposed caps) noisier and more polluted than revealed in the Environmental Assessment.

 

Wile E. Coyote hits bottom: Portland’s inclusionary zoning

Portland’s inclusionary zoning requirement is a slow-motion train-wreck; apartment permits are down by sixty percent in the City of Portland, while apartment permitting has more than doubled in the rest of the region

Inclusionary zoning in Portland has exhibited a Wile E. Coyote pattern:  apartment starts stayed high initially, until a backlog of grandfathered units got built.  Since then Portland apartment permits have plummeted.

The Wile E. Coyote Inclusionary Zoning Story in Portland

In December 2016, Portland’s City Council enacted a strong inclusionary housing requirement.  Henceforth, all new apartment buildings in Portland would have to set-aside a portion of their units for low- and moderate income housing. Unlike other cities that either made compliance voluntary, or largely (or entirely) offset the cost of the added units with density bonuses or subsidies (or other quid pro quo), the Portland ordinance applied to nearly all apartment buildings larger than 20 units. The new requirement didn’t kick in until February 2017, and there was a land rush of developers who filed under the old rules.  That produced a temporary flood of new apartment buildings, that have, over the past four years, mostly been built.

Investment markets work with lags for a variety of reasons.  It takes time to plan, obtain permission for, and actually build new housing, and multi-family housing takes longer than single family housing.  As a result, there’s a multi-year pipeline.  When there are housing shortages, as there were in the early days of the recovery from the Great Recession, supply can’t expand as rapidly as demand, and rents get bid up.  The reverse is also true; a glut of building in good times produces new apartment supply that holds down rents, at least for a while.  That effect has concealed the negative consequences of Portland’s inclusionary zoning policy.

As we observed in May of 2019, the initial implementation of inclusionary zoning resulted in a kind of counter-intuitive acceleration of apartment construction.

. . .  the first two years of inclusionary zoning in Portland have been a game-theory win-win for housing affordability. The threat of tougher future requirements prompted a whole lot of investment to happen much earlier than it otherwise would have, and new developments, added to those already under construction, have helped deliver a lot more new apartments in Portland.

Wile E. Coyote hits bottom in Portland

Back in 2019, we said that Portland’s apartment market was in the midst of the “don’t look down” portion of its Wile E. Coyote experience. The momentum from pre-IZ housing applications filled the construction pipeline, and led to a steady increase in the number of new apartment completions.  But that initial surge of construction in response to the IZ grandfathering has petered out.

According to multifamily building permit data reported by Oregon state economist Josh Lehner, multifamily building permits in Multnomah County (mostly, but not entirely, Portland) have fallen from more than 5,000 a year in 2018 to about 2,000 a year today.

And that’s not because of a weak regional market for apartments.  Pretty much the opposite trend has been playing out in the Portland area suburbs.  Back in 2018, there were only about 2,000 units per year being permitted in the suburban counties surrounding Portland; today more than 5,000 apartments a year are being permitted. In the same broad metropolitan market, apartment permitting has fallen 60 percent in Portland while at the same time it has increased two-and-a-half fold in the surrounding suburbs.  

There’s other evidence as well. As we reported earlier, there’s been a collapse in demand for 20 to 30 unit apartment projects in Portland (these projects are just above the threshold for having to comply with the inclusionary housing requirement).  There’s also evidence that developers are under-building on available sites (something we’re stuck with for the life of the buildings, which is likely to be many decades).

By all measures, Wile E. Coyote has hit the desert floor.

A glass-half-full (actually 40 percent full in this case) optimist might point to the fact that apartment construction in Portland hasn’t fallen further.  Ultimately, inclusionary zoning won’t wipe out all apartment construction.  What it does mean, though, is that rents on market-rate apartments have to rise high enough to compensate developers for paying the costs of constructing the below-market-rate units required by inclusionary zoning.  And rents in Portland are high enough to make about 2,000 apartments a year pencil out for developers.  That’s far fewer than we need.  And it’s a symptom of the inherent contradiction built into the inclusionary requirement: it only works because it keeps rents high, which actually makes the overall affordability problem worse.  And make no mistake, housing affordability is a problem of scale:  a few hundred or even a few thousand discounted apartments do essentially nothing to ameliorate the affordability problem, and meanwhile, everyone whose rent is set by the market pays a higher rent to produce a few trophy units.  It’s as crazy and counterproductive as any of the ACME Corporation’s sure-fire roadrunner-catching gizmos.

 

Inclusionary Zoning: Portland’s Wile E. Coyote moment has arrived

Portland’s inclusionary zoning requirement is a slow-motion train-wreck; apartment completions are down by two-thirds, and the development pipeline is drying up

This will lead to slower housing supply growth and increasing rents for everyone over the next two to three years

Inclusionary Zoning (IZ) creates perverse incentives to under-utilize available land

In December 2016, Portland’s City Council enacted a strong inclusionary housing requirement.  Henceforth, all new apartment buildings in Portland would have to set aside a portion of their units for low- and moderate-income housing. Unlike other cities that either made compliance voluntary, or largely (or entirely) offset the cost of the added units with density bonuses, subsidies, or other quid pro quos, the Portland ordinance applied to nearly all apartment buildings larger than 20 units. The new requirement didn’t kick in until February 2017, and there was a land rush of developers who filed under the old rules.  That produced a temporary flood of new apartment buildings that have, over the past four years, mostly been built.

Investment markets work with lags for a variety of reasons.  It takes time to plan, obtain permission for, and actually build new housing, and multi-family housing takes longer than single family housing.  As a result, there’s a multi-year pipeline.  When there are housing shortages, as there were in the early days of the recovery from the Great Recession, supply can’t expand as rapidly as demand, and rents get bid up.  The reverse is also true; a glut of building in good times produces new apartment supply that holds down rents, at least for a while.  That effect has concealed the negative consequences of Portland’s inclusionary zoning policy.

As we observed in May of 2019, the initial implementation of inclusionary zoning resulted in a kind of counter-intuitive acceleration of apartment construction.

. . .  the first two years of inclusionary zoning in Portland have been a game-theory win-win for housing affordability. The threat of tougher future requirements prompted a whole lot of investment to happen much earlier than it otherwise would have, and new developments, added to those already under construction, have helped deliver a lot more new apartments in Portland.

Portland reaches its Wile E. Coyote moment

Back in 2019, we said that Portland’s apartment market was in the midst of the “don’t look down” portion of its Wile E. Coyote experience. The momentum from pre-IZ housing applications filled the construction pipeline, and led to a steady increase in the number of new apartment completions.  And, as we’ve noted, the increase in supply pushed up vacancy rates; rent increases, which had been in the double-digit range in 2015, fell to just 1-2 percent per year, according to Apartment List data.

But now, Wile E. Coyote has looked down, and seen nothing holding him up. Data compiled by local economic consulting firm ECONorthwest tracks the number of apartment permits issued in Portland over the past 15 years.  It shows the surge in new apartments in 2017 largely holding up in 2018 and 2019, but then plummeting by roughly two-thirds in 2020, from an average of 4,500 new apartments per year to fewer than 1,500.

The ECONW analysis of the building permit data is echoed by other market analysts.  Noel Johnson’s website, EnvisionPDXtrends, also has data on Portland’s development pipeline showing a diminished volume of new apartment construction activity since the inception of the city’s inclusionary housing requirements.

Portland apartment completions (EnvisionPDXtrends)

Similarly, Patrick Barry, of the Barry Apartment Report, says that there’s been a sharp fall-off in the number of new multi-family building permits applied for in Multnomah County (which contains the City of Portland).  New apartment permits in the county have fallen more than 60 percent in the past year, from 5,165 units in 2019 to 2,043 in 2020.
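As a quick sanity check, the decline implied by those two permit counts works out to about 60 percent; here’s the back-of-the-envelope arithmetic, using only the figures quoted above:

```python
# Back-of-the-envelope check of the permit decline cited above.
# Both counts are the Multnomah County figures quoted in the text.
units_2019 = 5165
units_2020 = 2043

pct_decline = (1 - units_2020 / units_2019) * 100
print(f"Decline: {pct_decline:.1f}%")  # Decline: 60.4%
```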

By all measures, Wile E. Coyote is plummeting to the desert floor.

Lean years ahead for apartment deliveries

What’s even more ominous, though, is a parallel decline in new projects in the application process.  It takes time (two or even three years) for a project to go from permit application to “for lease,” so much of Portland’s apartment supply for 2022 and 2023 (when, by all accounts the economy is expected to be booming again), is essentially already baked into the cake of filed permit applications.  Again, ECONW tracks these new permit applications using city data.  Entry into the pipeline is defined as “set up” activity, when an applicant pre-files or files for a new building permit.  The number of set ups for apartment units peaked in 2016 and 2017 at slightly more than 6,000 new units per year.  Since then, new setups have declined by a third in 2018 and 2019 to about 4,000 per year.  In 2020, new setups were about 2,600, less than half their 2017 level.

Just as the past two to three years have produced a kind of temporary win-win of greater apartment deliveries and slowing rent inflation, the next couple of years seem almost certain to have exactly the opposite:  declining numbers of new apartments, and rising rents.  Already, national forecasters like Zillow are predicting a rapid rebound in demand for urban markets in the post-pandemic period.

Developers will likely wait for the City of Portland to realize the devastating effects of these burdensome IZ requirements and to relax them, or wait for rents to rise enough to support the costs of building new apartments and covering the cost of subsidized units, or simply resign themselves to building smaller, 20-unit buildings, that do much less to expand housing supply and may mean permanently under-utilizing sites that are well-situated to accommodate even greater density.

The Mansard effect

Some of the effects of Portland’s IZ requirements will be quirky and permanent changes to the building stock.  For example, one key feature of Portland’s inclusionary zoning rule is that it exempts buildings with 20 or fewer apartments from the inclusionary housing requirement, apparently on the assumption that such a requirement would make these smaller projects uneconomical.  What that exemption has done is prompt many developers to shift to these smaller buildings.  Over the past couple of years, data from the city show that the number of 16-20 unit buildings has surged (red line) while the number of 21-25 unit buildings (green line) has disappeared.  The number of 16-20 unit buildings in 2020 was 143 percent higher than the 2014-2016 average; the number of 21-25 unit buildings was 100 percent lower (zero) than its 2014-16 average.

For reference, as noted above, total apartment completions declined about 67 percent over this time. The 20-unit and under exemption essentially incentivizes developers to build fewer apartments, and because residential structures tend to be long-lived, once a site is built out at 20 units when it could have been 25 or 30 units, that additional housing will be foreclosed for 50 or 100 years.

Also, developers have noted that the inclusionary provision applies on a building-by-building basis, so by dividing a development project into a series of 20-unit-or-smaller buildings, a developer can build many apartments without having to comply with the inclusionary requirement.  That shows up, for example, in a recent development in Northwest Portland, where developer Noel Johnson is building eight five-story residential buildings, with a total of 145 units, on a small urban infill site.  Again, because each building has 20 or fewer units, the inclusionary requirements don’t apply.

Northbound 30 (Jones Architecture and Waechter Architecture, via Next Portland).

Doing this development as eight different buildings may not be the most efficient arrangement, but (to this economist’s eye) the result isn’t unaesthetic.  Johnson points out that dividing the project into multiple buildings ensures that each apartment is a corner unit (dual-aspect, to you English housey types), which makes them more desirable.  It’s an example of how regulation can prompt innovation.

The effect of restrictive land use regulations on urban architectural form has a long history. Mansard roofs, now a cherished hallmark of Parisian architecture, soared to popularity as a dodge around city building restrictions.  Buildings in Paris were limited to just 20 meters (about 65 feet), but the rules provided that the height would be measured at the building’s cornice.  As a result, a floor or two stepped back behind a steeply angled mansard roof didn’t count against the height limit.  (Fun fact: back in the days before elevators, apartments on the top floors of buildings commanded lower rents because tenants had to walk up every flight of stairs; the mansard-enabled apartments essentially functioned as a kind of affordable housing bonus.)

Paris builders used Mansard roofs to evade the city’s 20 meter height limit

While the Mansard roofs are endearing, they’re a visible and enduring symbol of the power of regulation to alter the housing market.  More significant than the changes to the housing that gets built are the ways that regulations cause new housing not to be built at all; these impacts are even greater, but largely unseen.  The new apartments that aren’t being built now in Portland will almost certainly lead to higher rents and less affordability in the years ahead—exactly the opposite of the expressed intentions of those who enacted this policy.  It’s the apartments that aren’t being built that are the real legacy of inclusionary zoning.

Editor’s Note:  Thanks to Mike Wilkerson of ECONorthwest and Noel Johnson of EnvisionPDXtrends.com for sharing their tabulations of Portland apartment permit data.  Opinions and analysis presented here solely reflect the views of City Observatory.

The Fundamental, Global Law of Road Congestion

Studies from around the world have validated the existence of induced demand:  each improvement to freeway capacity in urban areas generates more traffic.

The best available science worldwide—in Europe, Japan and North America—shows a “unit-elasticity” of travel with respect to capacity:  A 1 percent expansion of capacity tends to generate 1 percent more vehicle miles traveled.

The fundamental law of road congestion requires us to fix broken traffic models and stop widening highways in a futile effort to reduce congestion.

Call it what you will:  Jevons Paradox, Braess Paradox, Marchetti’s Constant or Downs’ Triple Convergence, the science confirms them all.

Induced demand:  More road capacity produces more traffic

At City Observatory, we’ve related the classic example of North America’s widest freeway, the 23-lane Katy Freeway in Houston.  It’s been successively widened many times, most recently at a cost of $3 billion, and within three years of its expansion, commute times were even longer than before.

But there’s much more than anecdotes like the Katy Freeway to buttress the observation of induced demand.  Sophisticated, in-depth studies of transportation infrastructure and traffic levels, looking at entire nations and measuring traffic changes over decades, find what is now being called “the fundamental law of road congestion.”  An increase in road capacity directly generates a proportional increase in traffic, with the effect that congestion and travel times quickly return to (or worsen from) pre-expansion levels.  Simply put, expanding road capacity is a futile and self-defeating effort.  Urban highway expansion is the labor of Sisyphus.

Two recent and definitive studies are Duranton and Turner’s “The Fundamental Law of Road Congestion” and Kent Hymel’s “If you build it, they will drive.” Both studies use data for the US and find a unit elasticity of traffic with respect to roadway expansion. Hsu and Zhang found a nearly identical result for roadway expansion projects in Japan.

Europe:  Still more evidence Induced Demand and the fundamental law of road congestion

The latest evidence of the universality of the fundamental law comes from Europe.  Three researchers from the Universitat Autònoma de Barcelona use two decades of data for hundreds of European cities to replicate the methodology used by Duranton and Turner and by Hymel in the U.S.  They find very similar results, confirming the fundamental law of road congestion.  The best estimate is that the elasticity of travel with respect to capacity is essentially unitary:  a one percent increase in highway capacity generates a one percent increase in vehicle travel.

We use data for the 545 largest European cities to estimate the elasticity of a measure of congestion with respect to highway expansion. The results indicate that this elasticity is in the range close to 1. This suggests that expansion of the highway network induced the demand for car travel, and so, on average, the level of congestion remained roughly unchanged in the period 1985–2005. In other words, we show that investments in highways did not effectively relieve traffic congestion.

Congestion in Edinburgh (The Herald)

Tolling is the only way to avoid the induced demand trap

Garcia-Lopez, Pasidis and Viladecans-Marsal also examine how the prevalence of tolled roadways affects the induced demand effect.  Cities that toll a higher proportion of their highway system have a much smaller induced demand effect.  Their analysis concludes that traffic is highly elastic in response to capacity expansions in cities with no tolls (each 1 percent increase in capacity results in a nearly 2 percent increase in travel), while traffic is highly inelastic in cities with 100 percent tolled highways (a 1 percent increase in capacity results in a 0.3 percent increase in traffic):

(1) highway improvements increase congestion, (2) the effect is smaller in cities with tolls, and (3) the fundamental law is mainly related to cities without tolls or with a low percentage of tolled highways. In particular, and focusing on our preferred specification in column 3 (using a continuous interaction), a 1% increase in lane kilometers increases congestion by 1.9% in cities without tolls and by only 0.3% (=1.9-1.6) in cities with tolls in all their highways (100% share of tolled highways). Some simple computations show that the fundamental law applies to cities with a share of tolled highways below 56%. These results can be regarded as novel evidence in line with recent literature suggesting that the solution to traffic congestion is the adoption of ’congestion’ pricing policies.
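The arithmetic in the quoted passage can be sketched directly. The 1.9 baseline elasticity and the −1.6 toll interaction are the study’s own coefficients; the simple linear form below is our paraphrase of their preferred specification:

```python
# Sketch of the elasticity arithmetic in the passage quoted above.
# Coefficients (1.9 baseline, -1.6 toll interaction) are from the study's
# preferred specification; the function form is our paraphrase of it.

def traffic_elasticity(tolled_share):
    """Percent increase in traffic per 1% increase in lane-kilometers,
    given the share (0.0-1.0) of a city's highways that are tolled."""
    return 1.9 - 1.6 * tolled_share

print(round(traffic_elasticity(0.0), 1))  # 1.9 -- cities with no tolls
print(round(traffic_elasticity(1.0), 1))  # 0.3 -- fully tolled highways

# Share of tolled highways at which the elasticity falls to 1.0,
# the threshold below which the "fundamental law" applies:
threshold = (1.9 - 1.0) / 1.6
print(f"{threshold:.0%}")  # 56%
```

The 56 percent threshold is the same “simple computation” the authors describe: the tolled share at which induced traffic just matches added capacity.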

Policy implications:  Fix broken traffic models; Stop widening highways

The fundamental law of road congestion is a demonstrated scientific fact, with unambiguous implications for public policy.  First and foremost, the fundamental law signals that the folk wisdom (repeated by highway boosters) that we can somehow “build our way out” of traffic congestion is utterly false.  More roads simply generate more traffic and more sprawl.  Cities and states should stop spending money on road widening projects to reduce congestion.

There’s a second, more subtle and technocratic point as well.  The fiction that more capacity will somehow reduce congestion is actually hard-wired into many of the “four-step” traffic models highway departments use to plan (and justify) highway widening.  The models are calibrated in a way that either ignores or denies the existence of induced demand, usually by simply assuming that the level of traffic demand is fixed, and unaffected by journey times, delays or congestion.  At best, models may crudely re-route traffic in response to congestion, but they fail to alter aggregate trip demand, especially in the long run.  As a result, as Jamey Volker, Susan Handy and Amy Lee show, most existing travel models create the false illusion that a wider road will lead to faster traffic.  Transportation planners, and funding entities like the Federal Highway Administration, should insist that transportation models be updated to reflect scientific reality. The induced travel calculator shows how this can be done, now.

A truth with many names and discoverers

While this new study from Barcelona, and the similar papers by Duranton & Turner, Hymel, and Hsu and Zhang, all elaborate in great statistical detail on this finding, the basic concept is well understood, and has been for decades (or longer).  The scientific validation of the phenomenon of induced demand buttresses a series of related explanations:  the Jevons Paradox, the Braess Paradox, Marchetti’s constant, and Downs’ Triple Convergence.

Jevons Paradox holds that an increase in the efficiency of resource use will generate an increase in resource consumption rather than a decrease.  English economist William Stanley Jevons predicted in 1865 that greater efficiency in using coal would increase its use.  The result seems paradoxical only if one assumes that demand is unaffected by the lower effective price of a more efficient process; in fact, by making something more efficient, we generate additional demand.

Braess’s Paradox is the application of this general idea specifically to traffic.  German engineer Dietrich Braess postulated exactly this in 1968.

Marchetti’s constant is a corollary to these paradoxes:  It observes that the amount of time humans devote to daily travel remains constant regardless of improvements in transportation technology. Whether walking, riding horses or streetcars, or driving cars, we devote an average of about an hour a day to travel.  Marchetti’s constant means that we use improvements in transportation to travel further, rather than to save time, a result fully consistent with the fundamental law of road congestion. (The observation has been made independently by many observers, including Bertrand Russell as early as 1934.)

Downs’ Triple Convergence: in his 1992 book Stuck in Traffic, economist Anthony Downs described a “triple convergence” in which changes in road infrastructure prompt changes in the mode (i.e., transit to car), time, or destination of trips in ways that lead congestion to reappear even after an expansion of road capacity.

Miquel-Àngel Garcia-López & Ilias Pasidis & Elisabet Viladecans-Marsal, 2020. “Congestion in highways when tolls and railroads matter: Evidence from European cities,” Working Papers wpdea2011, Department of Applied Economics at Universitat Autonoma of Barcelona.

 

 

Oregon’s I-5 bridge costs just went up $150 million

Buried in an Oregon Department of Transportation presentation earlier this month is an acknowledgement that the I-5 bridge replacement “contribution” from Oregon will be as much as $1 billion—up from a maximum of $850 million just two months earlier.

The I-5 bridge replacement project (formerly known as the Columbia River Crossing) is a proposal for a multi-billion dollar freeway widening and bridge-expansion program between Portland and Vancouver.  The original CRC project died after costing nearly $200 million for staff and consultants in 2014, but has been revived in the past year.

The cost to Oregon of reviving this boondoggle just jumped to $1 billion.

Late last year, we took a close look at the project’s initial financial plans, which show the project could cost as much as $4.8 billion (and considerably more if more realistic inflation estimates are used). We also identified a fundamental math error in the estimation of the project’s financial gap, i.e., the difference between expected costs and potential revenues.  The Oregon and Washington transportation departments—ODOT and WSDOT—understated the maximum size of the funding gap (i.e., what happens if the two states realize the low end of expected revenues and incur the high end of expected costs) by more than $1 billion; the total gap the two states face is $3.4 billion.  That hole will have to be filled for the project to move forward.  While both states have indicated an interest in reviving the project, neither has committed funds, so a big question now is how much they will have to contribute.  The Oregon Department of Transportation was telling legislators one thing a couple of months ago, and something a good deal more expensive now.
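The gap arithmetic is simple. The $4.8 billion high-end cost and $3.4 billion maximum gap are the figures discussed above; the implied low-end revenue is our own back-of-the-envelope inference, not an official ODOT/WSDOT number:

```python
# Funding-gap arithmetic sketched from the figures cited above ($ billions).
cost_high = 4.8     # high-end cost estimate in the project's financial plan
gap_max = 3.4       # maximum gap: high-end costs paired with low-end revenues
revenue_low = cost_high - gap_max   # implied low-end revenue (our inference)
print(round(revenue_low, 1))  # 1.4
```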

December 2020:  Oregon contribution $650 to $850 million

ODOT has been including its estimates of Oregon’s share of these costs in its presentations to state legislators.  On December 10, 2020, ODOT testified to the Legislature that Oregon’s contribution to the I-5 bridge project would be $650 million to $850 million.  (The second colored bar on this chart is identified as “Interstate Bridge Replacement Contribution.”)

February 2021:  Oregon contribution $750 million to $1 billion

That estimate is no longer operative.  In a presentation to the Legislature on February 4, 2021, the Department included this diagram, showing the state’s contribution to the project was now $750 million to $1 billion.  The chart is almost identical to the chart presented in December; only the price tag of the I-5 bridge project has changed.

In presenting this chart to the Joint Transportation Committee, ODOT’s Brendan Finn made no mention of the increase in Oregon’s expected contribution.  Instead, he drew the committee’s attention to the timetable for implementation of tolling, and didn’t discuss any of the budgetary amounts listed on the chart.  No one on the committee commented on or questioned the budget amounts.

By comparison to financial plans for the original CRC, this represents essentially a doubling of the state contribution to project costs.  The adopted CRC finance plan called for Oregon and Washington to each chip in $450 million, with the balance of the project to be paid for by tolls, federal transportation funds, and hoped-for earmarks.  This change in Oregon’s contribution also implies that the total cost of the project has likely increased by $200 to $300 million since December, as project costs are divided evenly between the two states by agreement of their state transportation commissions.

Steadily escalating project costs, with underestimates early on and cost overruns later, are a routine feature of ODOT projects.  Just a year ago, after long telling the Oregon Legislature that the Rose Quarter freeway widening project would cost $450 million, ODOT raised the project’s price tag to as much as $795 million.  Cost overruns of 200 percent or more have been common on large ODOT highway projects like the Highway 20 Pioneer Mountain-Eddyville segment, the Newberg-Dundee bypass, and Portland’s Grand Avenue Viaduct.

While the allocation of much smaller amounts to bicycle, pedestrian and safety projects generates substantial debate and visible resistance from ODOT, the implied decision to increase the allocation for a major freeway-widening project doesn’t even merit a mention to the Legislature, and is accomplished, seemingly, with a deft change to a single PowerPoint slide.  This is typical of ODOT budget practices, which conveniently find “unanticipated” revenue whenever it’s time to fund a major highway project.

Even though Oregon and Washington have already spent nearly $200 million on the CRC, and committed another $50 million to the planning effort to revive the project, there’s still considerable uncertainty about the project’s actual dimensions, costs, and revenues.  All one can say with any confidence, based on long experience, is that the project’s total price tag is likely to go even higher.

Equitable Carbon Fee and Dividend

An equitable carbon fee and dividend should be set at the price level necessary to achieve GHG reduction goals; the dividend payment should be set so that 70% of people come out ahead after paying the carbon tax, or at least break even.

By Garlynn Woodsong

Editor’s note: City Observatory is pleased to publish this commentary by Garlynn Woodsong. Garlynn is the Managing Director of the planning consultancy Woodsong Associates, and has more than 20 years of experience in regional planning, urban analytics and real estate development. Instrumental in the development and deployment of the RapidFire and UrbanFootprint urban/regional scenario planning decision-support tools while with Calthorpe Associates in Berkeley, CA, his focus is on making the connections between planning, greenhouse gas emission reductions, public health, and inclusive economic development. For more information, or to contact Garlynn, visit this website.  Earlier, Garlynn wrote “A Regional Green New Deal for Portland” at City Observatory.

 

One thing that has been made abundantly clear during the pandemic of 2020 (and beyond) is the importance of making social payments to help people deal with a difficult and shared transition. During 2020, this transition was due to COVID-19, and involved large portions of the population ceasing to commute or otherwise engage in normal activities outside of the home that involved other people or indoor spaces. There was wide recognition that payments needed to be made to help everyone cope with the expense of the pandemic, as mitigation for a shared national social emergency.

As I write this, Congress is currently debating the right size and number of payments to send, and to whom they should be sent, in order to mitigate for some of the personal impacts of this long and drawn-out national crisis. There is wide recognition that, in this context, it is in everyone’s best interest to make sure all of us have the resources we need to survive; the debate is over the details.

We must now also contend with the transition from a fossil-fuel-dominated economy to a post-carbon economy in order to stave off the worst potential impacts of climate change. Climate change is, in many ways, like a much slower-moving pandemic, one where the majority of the harm still lies years, decades, or centuries in the future, rather than right now (with fears of potential consequences a few weeks from now flowing from any poor decision-making in the present). Yet, we are beginning to see that, even now, demands for some kind of restitution are being made on behalf of those who face potential job or gig losses due to fossil fuel pipelines being canceled under the Biden administration.

These demands are not wrong. 

If we are to be successful in enacting a just transition away from a fossil-fuel-powered economy to a carbon-free economy, there will be a steady decline in job opportunities in occupations tied to fossil fuels, such as coal mining, oil drilling, and fossil fuel pipeline building. We must therefore construct a framework to provide climate adjustment aid payments to individuals, to help pay for retraining, retooling, investments, and for equity reasons. I would argue that this should come in the form of a carbon dividend that is paid for by a carbon fee.

For decades, economists have recommended the use of a carbon tax to achieve the necessary GHG emissions reductions we need to prevent the worst impacts of climate change. Yet, within the United States, carbon taxes have not yet been deployed broadly. Most recently, a proposal for a carbon tax was defeated at the ballot box in Washington State, thanks to heavy spending by fossil fuel interests.

To date, however, we have not yet seen a proposal at the ballot box for an equitable carbon tax, which I would argue should instead be called a carbon fee-and-dividend program. This is what we need, however, to power our just transition away from fossil fuels, and to “build back better” the carbon-free economy of the future that we need — without leaving anyone behind.

A carbon fee will necessarily impose additional costs on households and businesses. To ensure that it is not regressive, it must be paired with a carbon dividend payment, so that lower-income households are not unfairly burdened by the expense of the fee. That’s the baseline: the dividend for the average person should cover (or more than cover) the cost of reducing their carbon emissions; they could either spend the dividend to lower their emissions and avoid the carbon fee, or, if they had no good way to do that, the dividend would at least cover the fee paid by a typical person. We might fund additional benefits, providing higher payments to folks within certain targeted communities, such as those that experience disproportionate burdens from an economic transition away from fossil fuels. These could include both a sort of extended unemployment payment to individuals, as well as a kind of climate loan (perhaps modeled after the Paycheck Protection Program [PPP]) to finance businesses that convert to carbon-free processes.

Affordable, zero-emission homes could be financed with low-interest loans.

Here are the elements that such a program should contain, whether enacted by local municipalities, regions, states, or at the national level:

  • The carbon fee should be applied upstream, that is, at the level at which fossil fuels or fossil fuel-derived products enter the economy of the taxing jurisdiction, based on the amount of embodied carbon they contain (their carbon emission potential), as well as the amount of carbon used to produce them.
  • The price of the carbon fee should be set at a meaningful level to begin with, such as $15 per ton of carbon dioxide equivalent emissions potential (a level proposed by Brookings); not so low as to be completely ineffective, and not so high as to provide a shock to the economy.
  • The price of the carbon fee should escalate steadily over time, at a rate that is estimated to deliver the emissions reductions we need to achieve our climate goals.  Estimates vary widely on how high the fee would need to go in order to put us on track to keep temperatures from rising above 1.5C by 2030.  The IMF estimates it would need to rise to $75 per ton of carbon dioxide equivalent emissions by 2030; other estimates are higher.  But once we are on track, the carbon fee (and payments) would flatten out automatically. 
  • A robust modeling and monitoring program would need to accompany the fee, to ensure that the rate is set properly to deliver the needed emissions reductions from all sectors of the economy. For instance, if vehicle miles traveled don’t decrease by an amount that, combined with the penetration of electric vehicles and the blending of renewable fuels, delivers the necessary carbon emissions reductions by a certain year, then the carbon fee should be automatically adjusted upwards by an agency with the authority to do so without political interference, to a level estimated (using the best available evidence on the price elasticity of demand for carbon-containing fuels) to deliver the necessary reductions.
  • Some of the revenue from the program should be returned to individual households on a sliding scale, with no revenue returned to households making 200% or more of median income (currently about $140,000 per year), but sufficient revenue returned so that 70% of households making less than 200% of median income will experience no net increase in household expenses due to the carbon fee during the first five years of its roll-out.
  • These payments to households should come in the form of a check from the government, delivered monthly or quarterly, so that there is a regular, continuing benefit that can be banked on by regular folks — the same regular folks whose support will be needed to pass an initiative at the ballot box, or support a politician taking a vote in a city hall, county seat, regional council chamber, or state or national capital.
  • The remainder of the revenue from the carbon fee should be used to finance the transition away from fossil fuels to a carbon-free economy. 
  • A public bank should be funded by the carbon fee and empowered to make low-interest loans, using a revolving loan fund, to finance investments that will reduce emissions. This could take the form of a secondary market for loans originated by private lenders; the point is to make the funding available at comparatively low interest rates. These investments could include anything from a household seeking to buy an electric vehicle and install solar/wind power generating facilities and energy storage solutions, to companies seeking to replace fossil fuel consuming processes with renewable processes. Repayment terms should be set to ensure that individuals and companies receiving financing will experience a net benefit, that is, a reduced operating cost level after participating in a bank-funded program in comparison to their previous expenditures for fossil fuel-based energy.
  • Certain public investments, such as high speed rail, electric-powered public transit, creating walkable neighborhoods, and similar public infrastructure, should be funded using carbon fee revenue through a grant program to fund the transition away from fossil fuels. We won’t be able to reduce emissions sufficiently to achieve our targets using electric cars and solar panels alone; we also need to transition to compact communities built around walking, rather than driving, in order to reduce the demand for energy to a level we can provide within the limited time available between now and 2035, and 2050. Public investments will need to be made to implement this transition, and the carbon fee can provide the funding.
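The sliding-scale dividend described above can be sketched in code. This is an illustrative simplification, not the program's actual design: it assumes a linear phase-out of the dividend between zero income and the 200%-of-median cutoff, and the median income and dividend figures are placeholders.

```python
# Illustrative sketch of a sliding-scale carbon dividend.
# Assumptions (not from the proposal): a linear phase-out, a US median
# household income of $70,000, and a hypothetical full dividend amount.
MEDIAN_INCOME = 70_000          # assumed median household income
CUTOFF = 2.0 * MEDIAN_INCOME    # $140,000: no dividend at or above this level

def annual_dividend(household_income: float, full_dividend: float) -> float:
    """Dividend phases out linearly from the full amount at $0 income
    to zero at the 200%-of-median cutoff."""
    if household_income >= CUTOFF:
        return 0.0
    share = 1.0 - household_income / CUTOFF
    return full_dividend * share

# Example: with a hypothetical $1,200/year full dividend, a household
# at exactly the median income would receive half the full amount.
print(annual_dividend(70_000, 1_200))   # 600.0
print(annual_dividend(140_000, 1_200))  # 0.0
```

A real program would likely use survey data on household energy spending, rather than a simple linear schedule, to hit the target that 70% of households below the cutoff see no net cost increase.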

The beauty of such a carbon fee and dividend program is that it provides households and businesses the funding mechanism to pay for a transition away from fossil fuel consumption. If a household currently owns something that requires fossil fuels, such as an internal combustion-powered automobile, then it should be able to obtain financing from the public bank to purchase an electric vehicle, with payments covered by the dividend (up to a certain reasonable amount). Then, if at any point a household wants to switch from using the dividend payments to pay for gasoline to using them to service the car payments for a new electric vehicle, it will have the mechanism to do so without affecting the balance of the household budget. Critically, the carbon dividend payments should be made first, before the carbon fee comes due, to ensure that the most vulnerable members of the community are not harmed by its initial roll-out.

Trees sequester large amounts of carbon. A carbon tax could fund activities to protect and replant forests.

If a business currently owns something that requires fossil fuels, such as a blast furnace to produce steel that is powered by coal coke, then it should be eligible for low-interest financing to replace the fuel source for the blast furnace with electric power or renewable energy.

Such a program won’t be sufficient, by itself, to achieve our total emissions reductions goals by 2050. However, as a part of a larger Green New Deal-esque program that includes investments in urbanism to reduce overall demand for energy in the transportation and building sectors, it could be the critical factor that provides financing to ensure the success of much of the balance of the policy package.

Critically, an equitable dividend as a part of a carbon fee initiative will ensure that a harsher burden is not imposed on households with lower incomes, and thus it will prove to be a progressive, rather than regressive, solution to climate change.  One valuable lesson of the Covid-19 pandemic has been that it makes sense, in the face of a dire crisis that threatens everyone, to make payments to help those most affected or who need to adapt for the benefit of all.  We should apply that lesson to the climate crisis.

How ODOT destroyed Albina: The I-5 Meat Axe

Interstate 5 “Meat Axe” slashed through the Albina Neighborhood in 1962

This was the second of three acts by ODOT that destroyed housing and isolated Albina

Building the I-5 freeway led to the demolition of housing well outside the freeway right of way, and flooded the neighborhood with car traffic, ending its residential character and turning it into an auto-oriented landscape of parking lots, gas stations and car dealerships.

New York City’s Robert Moses is cast—accurately—as the villain who routinely rammed freeways through city neighborhoods.  Freeways, Moses said “. . . must go right through cities, and not around them, . . . When you’re operating in an overbuilt metropolis you have to hack your way with a meat axe.” (Moses, 1954, quoted in Mohl, 2002).

And Moses, the subject of Robert Caro's epic biography The Power Broker, actually wielded his meat axe in Portland. The original route of Interstate 5, at the time called the "Eastbank Freeway," was recommended by none other than Moses, who came to the city in 1943 with a group of his "Moses Men" to draw up a public works program for the region, a program that called for the city to be carved up by a series of freeways.

As part of its efforts to sell an $800 million I-5 freeway widening project in Portland, ODOT, the Oregon Department of Transportation, has made quite a show of acknowledging its complicity in destroying the Albina neighborhood, which six decades ago was the segregated home of a plurality of the city's Black residents.  But its role didn't start with the construction of I-5 in the early sixties, nor did it end then.  ODOT has repeatedly hemmed in and destroyed Albina, starting more than seventy years ago.

In part I of this series, we unearthed the largely forgotten—and entirely unacknowledged—role the Oregon Department of Transportation played in triggering the downfall of Portland’s Albina neighborhood in 1951, with its decision to build a mile-long extension of Highway 99W (Interstate Avenue) along the Willamette River. Of all the public “investments” that dismantled Albina, this was the first, but not the last.

1962:  ODOT’s I-5 cuts through Albina

Less than a decade later, the Oregon State Highway Department was back to apply another meat axe to the Albina neighborhood, in the form of the construction of Interstate 5.  It chose a route for the new freeway largely parallel to, and less than a mile east of, Interstate Avenue, cutting through the heart of the Albina neighborhood.  And in true 1960s freeway fashion, the right of way wasn't just a narrow slice of land: the highway department condemned and demolished businesses and housing for several blocks on either side of the land eventually used for the roadway.

That’s apparent in this 1962 photo showing the project’s construction:

 

In its effort to sell a new $800 million widening of the I-5 freeway through what's now called the Rose Quarter (to build, as we've shown, a ten-lane freeway), ODOT has made a conspicuous show of apologizing for the original construction of the freeway.  But in our view, the apology has glossed over ODOT's role.  Its public relations materials have dramatically understated the damage done to the neighborhood.

Here's a diagram prepared by ODOT consultants to show how the freeway affected the Albina neighborhood as it appeared in 1954 (i.e., after ODOT had already built Highway 99W).  ODOT's historical map shows the homes and businesses as they existed in 1954, and then overlays the I-5 freeway itself as a pair of slender pink lines.  But this significantly understates the scale of the demolition in Albina.  The freeway's true footprint involved acquiring and demolishing property on both sides of the roadway, as shown in the solid red lines on the right.  Critically, I-5 disconnected much of the Albina street grid.

The area outlined in red on the right hand side of this diagram shows blocks where multiple structures that existed in 1948 had been demolished by 1962, as shown in aerial photographs (see below). This includes both the land occupied by the freeway itself, as well as land cleared as part of the construction process.

The I-5 Freeway Construction Footprint

To get a closer look at this reality, compare these pairs of aerial photographs taken before and after I-5 construction.  The reality is that the I-5 freeway leveled whole city blocks on either side of the right of way.  This pair of images allows you to see a "before" and "after" view of the neighborhood.  The before image, from 1948, shows the housing and businesses that existed prior to freeway construction; the after shows what had been demolished by 1962.  We haven't been able to obtain data showing a complete list of the properties ODOT acquired and demolished, so we've relied on photo interpretation to identify blocks where housing or buildings that existed in 1948 had been demolished by 1962. It may be that some privately owned homes adjacent to the freeway were abandoned by their owners and demolished.

Albina:  From the Steel Bridge to N. Cook Street

Our first pair of aerial photographs shows the entirety of Albina from the Steel Bridge on the south to N. Cook Street (just near the Boise-Eliot School) on the north.  (The original 1948 photograph is truncated on the east.)

1948                         ↔                          1962

 

Close Up:  The southern part of Albina

The damage done by the construction of the I-5 freeway is even more apparent when we zoom in to the southern portion of the neighborhood, the area between the Broadway and Steel bridges, and between the Willamette River and Martin Luther King Boulevard (called Union Avenue in 1962).


1948                           ↔                       1962

 

Freeway traffic, not just the roadway, is what doomed Albina

The result of ODOT's highway construction was to obliterate much of Albina and to isolate the remaining parts of the neighborhood. Predictably, the neighborhood's population collapsed between 1950 and 1970, as the area was given over to the automobile. Much of the decline in population in Albina happened years after the freeway was built. The flood of cars undercut neighborhood livability, and population steadily declined in the 60s, 70s and 80s. As people moved away, neighborhood businesses that served local residents, many owned by African-Americans, died. More cars, fewer people, fewer businesses, and a shrunken, impoverished neighborhood. Building the freeway clearly privileged the interests of those driving through the area, especially suburban commuters, over the people who actually lived there.

 

The construction of I-5 was the second act in a three-part tragedy that doomed the Albina neighborhood.  The first was the construction of Highway 99W in 1951, cutting the neighborhood off from the river.  The I-5 freeway construction in 1962 demolished a huge share of the neighborhood's housing, and irrevocably turned this residential area into an auto-dominated sea of parking lots, roadways, gas stations and car dealerships.  But as we'll see, there was a third act in the early 1970s that largely completed ODOT's encirclement and destruction of the neighborhood.

It's sometimes said that what's past is prologue.  The ODOT public relations campaign for the $800 million Rose Quarter I-5 freeway widening project aims to portray it as a mere minor tweak to the existing roadway, the addition of a couple of inconsequential "auxiliary lanes."  They're implying that if the footprint of the freeway isn't expanded much, there are no impacts.  Not only is that untrue (the hidden plan is to build a 10-lane freeway through the Rose Quarter), but the real impacts are driven by the flood of cars this would enable.  The real problem with the Rose Quarter freeway is not so much that the project increases the freeway's footprint—which it does, in ways that ODOT has actively concealed—but rather that by adding road capacity, the project repeats the damage to the neighborhood by injecting even more vehicles into this car-dominated environment.  If we learn anything from history, this is not an error we should allow to be repeated.

 

How ODOT destroyed Albina: The untold story

I-5 wasn’t the first highway that carved up Portland’s historically black Albina Neighborhood.

Seventy years ago, ODOT spent the equivalent of more than $80 million in today’s dollars to cut the Albina neighborhood off from the Willamette River.

ODOT's highways destroyed housing and isolated Albina, leading to a two-thirds reduction in population between 1950 and 1970.

Demolishing neighborhoods for state highways is ODOT’s raison d’etre.

As part of its efforts to sell an $800 million I-5 freeway widening project in Portland, ODOT, the Oregon Department of Transportation, has made quite a show of acknowledging its complicity in destroying the Albina neighborhood, which six decades ago was the segregated home of a plurality of the city's Black residents.  But its role didn't start with the construction of I-5 in the early sixties, nor did it end then.  ODOT has repeatedly hemmed in and destroyed Albina, starting more than seventy years ago.

In 1950, the Oregon State Highway Department built a mile-long extension of Highway 99W that cut Albina off from the Willamette River, and began the process of destroying the housing and businesses that made up the neighborhood.

It's lost to the living memory of all but a handful of Oregonians, but before 1950, there was no "North Interstate Avenue" between the Steel Bridge and North Tillamook Street (several blocks north of the Broadway Bridge).  In 1950, the Oregon State Highway Department leveled dozens of houses and removed city streets.  Here's a grainy contemporaneous news photo from the Oregonian showing the nearly completed Interstate Avenue highway.

(1951, December 23). Oregonian, p. 8

Before the highway was built, this whole area was mostly housing.  In 1950, the Oregon State Highway Department spent the equivalent of $80 million in today’s money to demolish the portion of the Albina neighborhood along the Willamette River to construct a new limited access highway.  The following map shows, bordered in red, the housing that ODOT demolished for Interstate Avenue.  This destruction has been acknowledged only in passing by ODOT in its Rose Quarter freeway widening work.*

 

Albina in 1948 and 1962

To get a sense of how the neighborhood changed, we’ve overlaid the 1948 image of the neighborhood with its 1962 appearance.  The entire area between the Memorial Coliseum and the River was cleared by ODOT for Interstate Avenue.  Albina was now cut off from the river by a state highway.

What ODOT hasn't acknowledged as part of the Rose Quarter discussion is that its demolition of the neighborhood actually began even earlier, in 1950, when the department built a highway extension from the Steel Bridge to Interstate Avenue.  Ironically, this highway (US 99W) was an extension of the westside Harbor Drive, which opened in 1943, was famously removed in 1974, and was transformed into Portland's Tom McCall Waterfront Park, replete with verdant lawns and cherry trees.  The city's tonier west side had its riverbank 99W highway turned into a park; the predominantly Black Albina neighborhood's segment of the 99W highway remains an auto-dominated arterial to this day.

Conspicuously, the ODOT narratives about its culpability for the destruction of Albina are generally confined to the current right of way of the I-5 freeway.  But in fact, its role in demolishing the Albina neighborhood began more than a decade earlier, with the construction of the Highway 99W/Interstate Avenue extension, and continued more than a decade later—with the construction of the Fremont Bridge and ramps, which further devastated the Albina community (and which is conveniently left out of ODOT project maps—more about that in an upcoming City Observatory commentary).

But in 1950, to speed the flow of traffic in and through Portland, the State Highway Department (the more accurately named predecessor of today's ODOT) condemned and demolished a strip of houses along the Willamette River for a mile-long highway project.

(1951, October 25). Oregonian, p. 40

In 1950, the project cost $3,500,000.  Inflated by the Engineering News Record’s Construction Cost Index, that’s a project that would cost over $80 million today.
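The inflation adjustment described here is a simple ratio of index values. The sketch below illustrates the calculation; the Engineering News-Record Construction Cost Index figures are approximations we've assumed for illustration, not the exact values used in the text.

```python
# Hedged sketch of the ENR Construction Cost Index adjustment.
# The index values are approximate assumptions for illustration.
COST_1950 = 3_500_000
CCI_1950 = 510       # approximate ENR CCI, 1950
CCI_TODAY = 11_900   # approximate ENR CCI, early 2020s

cost_today = COST_1950 * (CCI_TODAY / CCI_1950)
print(f"${cost_today:,.0f}")  # roughly $81.7 million
```

Any cost index works the same way: today's cost equals the historical cost scaled by the ratio of the index today to the index then.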

Prior to 1950, the dense neighborhood of Albina ran downhill from NE Grand and Union Avenues all the way to the Willamette River.  The neighborhood was a tight network of gridded residential streets, and its two Census Tracts (22 & 23) had more than 14,000 residents. By the time of the 1960 Census, the neighborhood's population had declined by more than a third, to a little over 9,000.  The construction of Interstate Avenue (Highway 99W), an extension of Harbor Drive, was just the first of a series of projects that systematically demolished most of the housing in the Albina neighborhood.  In 1960, the city cleared away housing next to Interstate Avenue, in part for the new Memorial Coliseum, but mostly to provide a swath of surface parking lots around the new arena.

 

The result of ODOT’s highway construction was to obliterate much of Albina, and to isolate the remaining parts of the neighborhood. Predictably, the neighborhood’s population collapsed between 1950 and 1970, as the area was given over to the automobile.

 

The real problem with the Rose Quarter freeway is not so much that the project increases the freeway's footprint—which it does, in ways that ODOT has actively concealed—but rather that by adding additional road capacity, the I-5 Rose Quarter freeway widening project injects even more vehicles into this car-dominated environment.  The local neighborhood association has come to exactly that conclusion, and they're correct.  The Eliot Neighborhood Association's land use chair has written:

The only real change the project would make to the surrounding area would be widening the highway, a car-capacity increase that will barely change travel times through the area. It would also serve to put more cars into our local street network, which has led to renderings showing even wider streets through the area than we have now. This would increase road noise and reduce the value of land around the project area.

 

* Editor’s Note:  The originally published version of this story incorrectly claimed that ODOT’s Rose Quarter analysis had not acknowledged the destruction of housing by the construction of Interstate Avenue in the early 1950s.  In fact, one table contained in the project’s environmental justice section concedes that this project demolished at least 80 homes.  Thanks to a regular reader who pointed this out.  City Observatory regrets this error.

 

How freeways kill cities

Freeways slash population in cities, and prompt growth in suburbs

Within city centers, the closer your neighborhood was to the freeway, the more its population declined.

In suburbs, the closer your neighborhood was to the freeway, the more it tended to grow.

It’s been obvious for a long, long time that the automobile is fundamentally corrosive to urban form.  Not only do roads, highways and parking lots devour urban space, they also cause the dispersion of people and activity in ways that make it impossible in many places to live without a car.  Cars, abetted by public policy, have remade cities in their image, and fostered car dependency, effectively acting as a paralytic toxin to urban living.

Even before we had good data, this was manifest to thoughtful observers, such as James Marston Fitch who wrote in The New York Times in 1960:

The automobile has not merely taken over the street, it has dissolved the living tissue of the city.  Its appetite for space is absolutely insatiable; moving and parked, it devours urban land, leaving buildings as mere islands of habitable space in a sea of dangerous and ugly traffic.

Now, six decades later, we can look at the historic record to measure just how true this observation was. Federal Reserve economists Jeffrey Brinkman and Jeffrey Lin have looked at the historical relationship between neighborhood growth and proximity to freeways.  Their data shows freeways have decimated city neighborhoods and propelled suburban population growth.

In the center of the region, more freeways are associated with population decline, and the closer you are to the freeway, the more population tends to decline.

Freeways are toxic to urban neighborhoods and a tonic to suburban sprawl

On the suburban fringe, freeways tend to stimulate population growth, and the positive effect is strongest quite close to the freeway, attenuating the further one gets from it.  The devastation wrought by urban freeways isn't limited to knocking down houses; it extends to undermining the vitality of the surviving portion of a neighborhood.  With fewer people and more traffic, a neighborhood loses the critical mass needed to support businesses and civic institutions, triggering a downward spiral.  As images like these, from Portland's Albina neighborhood, make clear, freeways have devastating effects.

The Moses meat axe hacks through North Portland (1962).

The authors' key findings, which aggregate data from 64 metropolitan areas for the period from 1950 to 2010, show the typical effects.

The authors have summarized their findings in one particularly dense graphic, which we present below.  It deserves a bit of explanation.  (For clarity, we’ve annotated it slightly with lines and shading to highlight key issues.)

First, the chart breaks all the neighborhoods (census tracts) in a metropolitan area into four columns, arrayed left to right, based on how close they are to the city center.  The closest-in urban tracts are shown on the left (above the legend “city center”) and in increasing order of distance from the city center are three other groups, 2.5 to 5 miles away, 5 to 10 miles away, and 10 to 50 miles away.  The vertical axis on this chart corresponds to the (log) growth rate of the population of neighborhoods between 1950 and 2010.  We’ve drawn a line at zero (no growth), and shaded negative values (population decline) as yellow.

Finally, each column is subdivided based on the distance from a tract to the nearest local freeway.  So, for example, the leftmost column shows neighborhoods within 2.5 miles of the city center, and the line within the column illustrates the change in population in those neighborhoods based on their distance to the nearest freeway.
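The quantities behind the chart can be sketched from tract-level data. This is a minimal illustration of the two dimensions just described (the log growth rate on the vertical axis and the distance-to-center bins); the tract data and column names are hypothetical, not the authors' actual dataset or code.

```python
# Sketch of the chart's two dimensions, computed from hypothetical tract data.
import math

tracts = [
    # (miles_to_center, miles_to_freeway, pop_1950, pop_2010) -- made-up values
    (1.0, 0.5, 4000, 900),
    (1.5, 3.0, 5000, 3500),
    (12.0, 1.0, 800, 6400),
]

def log_growth(pop_1950: int, pop_2010: int) -> float:
    """Log growth rate of tract population, the chart's vertical axis."""
    return math.log(pop_2010 / pop_1950)

def center_bin(miles: float) -> str:
    """The chart's four distance-to-center columns."""
    if miles < 2.5:
        return "city center"
    if miles < 5:
        return "2.5-5 miles"
    if miles < 10:
        return "5-10 miles"
    return "10-50 miles"

for d_center, d_freeway, p50, p10 in tracts:
    print(center_bin(d_center), round(log_growth(p50, p10), 2))
```

Within each column, the chart then plots the average log growth rate against distance to the nearest freeway, which is what the lines within the columns trace out.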

Looking at that left-most column, we see nearly a 100 percent decline in population in close-in neighborhoods a mile or less from the nearest freeway.  In an urban setting, freeways largely wipe out population.  Population declines are less severe, but still substantial, two and three miles away.  The period 1950 to 2010 was one of urban decline, but as this chart shows, the urban neighborhoods that fared best were the ones furthest from freeways.

Overall, the further you move from the center of the region, the more the effect of freeway proximity becomes positive for population growth.  Population growth is negative close to city centers, split between declining and increasing 2.5 to five miles away, and increasing fastest in tracts 5-10 and 10 or more miles from the city center.

The inflection point where freeways go from being a detriment to a stimulus to population growth seems to be about five miles from the city center.  Beyond five miles, the localized effect of freeways shifts from highly negative to positive.  From 5-10 miles, the highest levels of growth are in neighborhoods closest to freeways.  For areas beyond ten miles, growth peaks about 2-3 miles from the nearest freeway, but declines sharply thereafter.  Freeway access is a tonic to growth in suburbs and on the metro periphery.

Brinkman and Lin’s work adds additional depth to research done on this subject by other economists. Professor Nathan Baum-Snow found that each additional radial freeway constructed through a city reduced the city’s population by 18 percent.  In urban settings, freeways are toxic to population growth.  Neighborhoods close to freeways in and near city centers suffer the most severe population decline.
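Baum-Snow's 18 percent estimate compounds when a city gets multiple radial freeways. The back-of-the-envelope sketch below illustrates that compounding; treating the effect as multiplicative across freeways is our simplifying assumption, not part of the original estimate.

```python
# Back-of-the-envelope sketch: each radial freeway cuts central-city
# population by ~18%, assumed (by us) to compound multiplicatively.
def population_after(freeways: int, base_pop: float) -> float:
    return base_pop * (1 - 0.18) ** freeways

# Example: under this assumption, three radial freeways would leave
# roughly 55% of the original central-city population.
print(round(population_after(3, 100.0), 1))  # 55.1
```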

Jeffrey Brinkman and Jeffrey Lin, “Freeway Revolts!,” Federal Reserve Bank of Philadelphia Research Department Working Paper 19-29, July 2019, https://doi.org/10.21799/frbp.wp.2019.29

 

 

Covid Migration: Temporary, young, economically insecure

There’s relatively little migration in the wake of Covid-19

Most Covid-related migration is temporary, involves moving in with friends or relatives, and not leaving a metro area

It’s not professionals fleeing cities:  Covid-related movers tend to be young (many are students), and are prompted by economic distress

From the earliest days of the pandemic, pundits predicted that the Covid-19 virus would prompt rapid and permanent migration away from cities.  First came the concern that city residents were somehow more susceptible to the coronavirus—prompted by a weak correlation driven mainly by high infection rates in New York in the early days of the pandemic, but which completely reversed as rural areas came to have higher rates of cases and deaths.  Then the argument morphed:  now, thanks to Zoom and other web technology, we can all work at home, so there's no reason for companies to pay for expensive offices (or for workers to live in expensive cities).  In both cases, the theory has been fueled by anecdotes of highly paid professional workers decamping from big cities to smaller cities or rural areas.  We think the pessimism about cities is massively overstated for at least seven reasons.

But these anecdotes seriously misrepresent the nature and scale of migration.  First, as we've noted earlier, the migrants in these anecdotes don't usually leave metro areas at all (many examples are people who were already considering a move to nearby suburbs), but even when they do leave a New York or a San Francisco, it's for another, albeit smaller, tech center, like Seattle.

Many of these journalistic anecdotes suffer from what our friend Jarrett Walker calls "elite projection":  a tendency to view things from the perspective of the richest and most advantaged, rather than from the perspective of the average person. Early stories profiled Upper Eastside neighborhoods deserted by residents sheltering in the Hamptons or Adirondacks.  That genre continues in the vaccine era, as well.  Consider this gem from Bloomberg:

A new study sheds some light on who's actually moving in the post-Covid world and why.  It's not so much mid-career professionals moving permanently; it's really much more economically distressed younger adults moving in (or back in) with family and friends; movers are also disproportionately people of color.

Pew Research, which bills its work as that of a “fact tank” provides some insights into the actual extent, character and motivations of the Covid-induced movement.  Their recent survey of 15,000 adults nationally included a battery of migration-related questions.  They’ve shared some of the top-line results from the survey.  Here are the highlights:

Not many people moved due to Covid. All in all, about five percent of all Americans report having moved due to the Covid-19 pandemic and its related fallout. Even that number is smaller than it seems, because it includes temporary moves:  of the 5 percent who reported moving due to concerns about Covid (or related economic problems), only about one in six said their move was so permanent that they "bought or rented a new home on a long-term basis."  (Unhelpfully, the Pew migration question asks whether people moved either temporarily or permanently, but fails to define either term. If a family spent a week or a weekend away from New York at the height of the pandemic outbreak in March, would the answer be yes?)
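The arithmetic behind "smaller than it seems" is straightforward: combining the two survey shares shows how few Americans made a long-term move.

```python
# Quick arithmetic from the Pew figures: ~5% of Americans reported any
# Covid-related move, and about one in six of those was long-term.
movers = 0.05          # share of all Americans reporting a Covid-related move
permanent_share = 1/6  # share of those movers who bought/rented long-term

permanent_movers = movers * permanent_share
print(f"{permanent_movers:.1%}")  # 0.8%
```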

Economic reasons seem to dominate moves. Particularly in the past few months, it has been economic hardship, rather than concern about the Covid-19 virus itself, that seems to be prompting moves.  Financial reasons, including job loss, account for a third of all moves.  The more common story is not moving to work remotely, but moving because one has no work.

The demographics: Young, Hispanic, and lower income. Overall, about 5 percent of Americans report either a temporary or permanent move, but 18-29 year olds are twice as likely as other Americans to report a move, and Hispanics are nearly twice as likely.  How many of the press accounts of Covid migration speak to twenty-something people of color moving back in with their families because they've lost their jobs?  That's a far more representative migrant than the older, wealthier, mid-career professional who can afford to buy a new home and who's so secure in their work that they can work at a distance.

 

Students and schools figure prominently.  A significant share of recorded moves appears to be students unable to attend school. That's implied by the demographic data (those 18-29 are twice as likely to move), by the places they move to (a plurality of movers go to live with Mom, Dad or another relative), and by the direct answer to the question about why people moved: about one in seven who reported moving due to Covid through November said it was because their college campus or school had closed.

Few people cite more space or remote work as a motivation.  While the narrative about taking advantage of remote work to get a bigger home, and “zoom it in” figures prominently in journalistic accounts, it’s rare according to the survey.  Pew reports:

. . . people who moved due to the virus said the main reason was that they needed more space (2%) or were able to work remotely (1%).

None of these survey results seem to provide strong evidence for a re-ordering of America's locational preferences in the wake of Covid. The fact that most moves are among young adults with lower incomes suggests it's not higher-income, mid- to late-career professionals abandoning cities for suburbs or rural areas that accounts for much of the reported migration.  Whether in the midst of the pandemic's rising caseload, or waiting impatiently for the vaccine roll-out, it's easy to make overblown predictions about urban flight.  Cities have long weathered such crises, and rebounded in their wake.  They will again.

Albina Then and Now

Albina then and now

Basically, Albina was wiped out by:

  • Interstate Ave 99W (ODOT), 1951
  • Memorial Coliseum (City), 1958
  • I-5 (ODOT), 1962
  • Emanuel Hospital (PDC), 1970s
  • Blanchard Center (PPS), 1980
  • Convention Center, 1990 (expanded 2003)
  • Moda Center/Rose Garden, 1995

But ODOT's two highways cut all this off from the rest of the city.  99W/Interstate cut the neighborhood off from the River; I-5 cut it off from the rest of N/NE Portland.

Overview of Albina, 1948 versus Today

 

Memorial Coliseum 1975


 

 

I-5


 

The I-5 Freeway’s Construction Footprint

 

Fremont Bridge Ramps

 

How housing segregation reduces Black wealth

Black-owned homes are valued at a discount to all housing, but the disparity is worst in highly segregated metro areas

There’s a strong correlation between metropolitan segregation and black-white housing wealth disparities

More progress in racial integration is likely a key to reducing Black-white wealth disparities

It’s long been known that US housing markets and policy have combined to produce a huge disparity in housing wealth between Black and white families in the US.  Andre Perry and his colleagues at the Brookings Institution, for example, have estimated that owner-occupied homes in Black neighborhoods are undervalued by $48,000 per home on average.

The racial/ethnic home value gap

A new report from real estate analytics firm Zillow drills down on the racial/ethnic home value disparity.  Using a combination of home sales data and information from the American Community Survey, they estimate the average home value in different markets around the country for several racial/ethnic groups—Blacks, Latinx, Asian, non-Hispanic whites, indigenous people and Pacific Islanders.  Home values are systematically lower for most people of color, and Black households experience the biggest gap:  their homes are valued about 15 percent lower than all homes in the US.

The Zillow data also track the trends in the racial home value disparity over time.  The collapse of the housing bubble caused racial disparities to widen, but they’ve narrowed a bit in recent years.  Zillow explains:

Prior to the Great Recession, the gap between Black-owned home values and all home values was about 15% — if the typical U.S. home at the time was worth $1, the typical black-owned home was worth $0.85 — according to a Zillow analysis of home values in communities with different racial compositions. The gap grew to 20% by March 2014 after years of job losses and elevated foreclosures. Similarly, the ratio of Latinx home values to all home values hit bottom in May 2012 at 86% — down from 88% before the housing bubble. It has taken almost a decade for the typical home owned by a Black or Latinx homeowner to roughly get back to where it was relative to the standard U.S. home in 2007.

While the overall average is a 15 percent devaluation for Black-owned homes relative to all homes, there are wide variations among metropolitan areas.  As the Zillow report notes, the disparity is as little as 1 percent in some metropolitan areas (Riverside, CA) and more than 40 percent in others (Birmingham, Buffalo and Detroit).  It’s apparent that the pattern of variation across metro areas isn’t random.  Cities in the West, as a rule, have lower disparities than cities in the Northeast and Midwest.

How segregation drives the housing value gap

One thing we know about race and US cities is that there is a wide variation in the level of housing segregation.  As we noted last year, some US cities have much lower levels of white/non-white segregation than others.  We investigate the relationship between segregation and racial home value disparities by looking at data for large US metro areas.  We draw on metropolitan level estimates of the black-white dissimilarity index computed by the Brookings Institution, and compare them to Zillow’s estimate of the gap between Black-owned home values and all home values for those same metro areas.
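The dissimilarity index used in this comparison is a standard segregation measure: half the sum, across a metro area’s census tracts, of the absolute difference between each tract’s share of the metro’s Black population and its share of the white population. A minimal sketch of the computation (the tract counts below are hypothetical; Brookings computes the actual values from Census data):

```python
def dissimilarity_index(black_counts, white_counts):
    """Black-white dissimilarity index on the 0-100 scale.

    Interpreted as the share of one group that would have to move to a
    different tract for the two groups to be identically distributed.
    """
    total_black = sum(black_counts)
    total_white = sum(white_counts)
    return 100 * 0.5 * sum(
        abs(b / total_black - w / total_white)
        for b, w in zip(black_counts, white_counts)
    )

# Two hypothetical four-tract metros: one fully segregated, one fully integrated
print(dissimilarity_index([900, 100, 0, 0], [0, 0, 500, 500]))        # 100.0
print(dissimilarity_index([250, 250, 250, 250], [500, 500, 500, 500]))  # 0.0
```

On this scale, the highly segregated metros discussed below score 70 or more, while the least segregated large metros score in the 40s and low 50s.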

This chart shows the level of Black-white segregation on the horizontal axis (with higher levels of segregation corresponding to higher values on the index) and the relative value of Black-owned homes to all homes on the vertical axis (all the values are negative because, in every market, Black-owned homes are valued at a discount to all homes).  These data show a clear negative relationship between segregation and the home value gap:  The more segregated a metro area, the greater the gap in housing values.  Black households who own homes in more segregated metro areas suffer a greater housing value gap than Black households who live in less segregated metro areas.

These data show that there are seven metro areas—Birmingham, Buffalo, Chicago, Cleveland, Detroit, Milwaukee and St. Louis—that have particularly high levels of segregation and a particularly wide racial housing value gap.  These metros are clustered in the lower right of our diagram; all have Black-white segregation index values of 70 or more, and in every one, Black-owned housing is valued at a discount of at least 37 percent to all housing.

At the other end of the spectrum, metro areas with low levels of segregation have very small racial housing value gaps.  San Antonio, Riverside, Portland, and Virginia Beach have Black/white segregation scores of less than 51 and have racial housing gaps that are less than 6 percent.

But even excluding these extreme cases, there’s still a noticeable relationship between the racial housing value gap and segregation among less segregated metro areas:  In general, the less segregated a metro area, the smaller the racial housing value gap.

We are rightly concerned about the wealth gap between people of color and the nation’s non-Hispanic white population.  We have a system where homeownership is a large fraction of wealth for most households, especially those who do not have high incomes.  These data suggest that making continuing progress in promoting neighborhood integration is key to ameliorating the housing value gaps that underlie the observed wealth gap.

America’s K-shaped housing market

Home prices are soaring, rents are falling

The disparate impact of the recession on high income and low income households is driving the housing market in two directions at once.

Job losses have been concentrated among the lowest earning workers, who are disproportionately renters. Meanwhile high earning workers have seen no net job losses, and they are disproportionately homeowners and home buyers.

The K-Shaped Recession

When the Covid-19 virus struck in early 2020, it abruptly plunged the nation into the sharpest economic downturn we’ve ever recorded.  But job losses weren’t evenly distributed across the economy.  Some workers, especially those in front-line service work, were much more likely to lose their jobs (and to be unable to work at home), while others have pretty much kept their jobs, despite disruptions to commuting and work routines.  What we’ve observed is what many are calling a “K-shaped recession” with devastating economic consequences for some, and no change in earnings for others.

Harvard economist Raj Chetty and his team at Opportunity Insights have assembled an impressive array of high-frequency big data from private sources to provide an unusually detailed look at the change in the economy in the wake of the Covid-19 outbreak.  Their website, Track the Recovery, has a compelling chart that shows the very different employment trajectories of highly paid and low paid workers.  They’ve broken up all US workers by earnings quartile; this chart shows the employment levels for those in the lowest quartile (annual earnings under $27,000) and in the highest quartile (over $60,000).  While employment for high wage workers is actually higher now than before the pandemic, employment for low wage workers has plummeted by 21 percent since January of 2020.  The recession is essentially over for high paid workers, but lingers on for those with the lowest earnings.

A principal reason for this divergence has to do with the differential effects of lockdowns and business closures on different occupations.  High paid professional workers have vastly more opportunities to continue their jobs by working at home.  Service and retail workers at essential businesses, meanwhile, can’t “zoom it in” and have experienced much greater layoffs and reductions in hours of work.

The K-Shaped Housing Market

The K-shaped trajectory of the overall economy is mirrored in the housing market.  Home prices and rents have moved in opposite directions since the start of the pandemic.  Home price inflation (shown in blue), as measured by the Case-Shiller National Home Price Index, had been in the 4-5 percent range prior to the pandemic, and has essentially doubled to more than nine percent.  At the same time, the BLS estimate of rents (shown in red) paid by US city residents has fallen from a little under 4 percent year over year to barely two percent.  (The yellow shaded area is the recession.)

Other sources of housing market data confirm the K-shaped divergence in rents and housing prices since the advent of the Covid-19 pandemic and recession.  Here we’ve charted monthly year-over-year changes in housing prices (from Zillow) and apartment rents (from Apartment List).  These data show that the rate of home price inflation, which had been ebbing for the previous two years, has essentially doubled from 4 percent annually to 8 percent annually since the onset of the pandemic.  Meanwhile rent inflation, which had been steady at slightly over 2 percent per year, has turned negative, and is declining at about a 1.5 percent annual rate.
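The year-over-year figures in these charts are simply a 12-month percent change applied to a monthly index. A minimal sketch of that computation (the index values below are made up for illustration):

```python
def yoy_percent_change(monthly_index):
    """Year-over-year percent change for a monthly index series.

    Returns one value per month, starting with the 13th observation.
    """
    return [
        (current - year_ago) / year_ago * 100
        for year_ago, current in zip(monthly_index, monthly_index[12:])
    ]

# A flat index that steps from 100 to 108 after a year shows 8% YoY growth
print(yoy_percent_change([100] * 12 + [108] * 12)[0])  # 8.0
```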

Low wage workers are renters; high wage workers are homeowners (and home buyers).

There’s an obvious explanation for the different trajectories of house prices and rents:  Low income workers rent; high income workers own and buy homes. High income households have been barely grazed by the Covid-19 recession.  In fact, the combination of low interest rates and enforced savings (because many kinds of consumption spending, including dining, entertainment, travel and even much retail have been constrained by lockdowns), mean higher income households may find housing a much more attractive spending item.  If you can’t go out to dinner, or take a vacation, you have more money to spend on a new home.  Low wage workers are in the opposite situation.  Low wage workers have borne the brunt of the recession; they are also much more likely to be renters than higher income households.

According to data from the 2019 American Community Survey—via the indispensable IPUMS* website—among working-age American households with incomes of less than $40,000 a year (roughly the bottom quartile), about 66 percent are renters.  In contrast, among households in the top quartile (with incomes of more than $100,000 per year), 78 percent are homeowners.  The decline in employment during this recession has been concentrated on those households most likely to rent.  Meanwhile, employment among those households most likely to be homeowners has actually increased.  This divergence clearly explains why rents are falling, while home prices are rising:  In the aggregate, renters are bearing the brunt of job losses while homeowners have largely avoided a decline in employment.

The distinctly K-shaped nature of the recession, and of the housing market, are closely related. This sharp and sudden divergence in the fates of high income and low income households, and rental and for-sale housing markets is clearly a product of the Covid-19 recession. Over the course of the coming year, as the Covid-19 vaccine rolls out, and the economy recovers, it seems likely that we’ll see employment gains among low income workers.  As their economic condition improves, that’s likely to diminish substantially the downward pressures we’ve seen on rents for the past year.

* – Steven Ruggles, Sarah Flood, Ronald Goeken, Josiah Grover, Erin Meyer, Jose Pacas and Matthew Sobek. IPUMS USA: Version 10.0 [dataset]. Minneapolis, MN: IPUMS, 2021. https://doi.org/10.18128/D010.V10.0.

Calculating induced demand at the Rose Quarter

Widening I-5 at the Rose Quarter in Portland will produce an additional 17.4 to 34.8 million miles of vehicle travel and 7.7 to 15.5 thousand tons of greenhouse gases per year.

These estimates come from a customized calibration of the induced travel calculator to the Portland Metropolitan Area.

It’s scientifically proven that increasing freeway capacity in dense urban environments stimulates additional car travel.  The explanation is simple:

Attempts to address traffic congestion commonly rely on increasing roadway capacity, e.g. by building new roadways or adding lanes to existing facilities. But studies examining that approach indicate it is only a temporary fix. They consistently show that adding roadway capacity in congested areas actually increases network-wide vehicle miles traveled (VMT) by a nearly equivalent proportion within a few years, reducing or negating the initial congestion relief. That increase in VMT is called “induced travel.”

The phenomenon of induced demand is so well-demonstrated that it’s known as the “fundamental law” of road congestion.  As the experience with Houston’s 23-lane Katy Freeway shows, no matter how many lanes you add to a freeway in a dense urban setting, added capacity simply prompts more driving.

Transportation experts at the University of California Davis National Center for Sustainable Transportation have developed an induced travel calculator, based on the best available scientific information on the effect of added freeway capacity on vehicle travel.  The developers of the calculator—Jamey Volker, Amy Lee and Susan Handy—have published a peer-reviewed article describing the empirical estimates in the literature showing the connection between capacity and VMT.  In their journal article, the authors find that highway departments usually only address induced demand in response to public comments, rarely apply state-of-the-art modeling to their analysis, and routinely underestimate the effects of induced demand, by as much as an order of magnitude.

The purpose of this model is to provide an independent, scientifically sound means of measuring the environmental effects of major transportation projects.

Induced Demand Calculator; Results shown for Sacramento-Davis metropolitan area.

In addition, because greenhouse gases are directly related to vehicle miles of travel—each thousand miles traveled by a typical automobile produces about 0.466 tons of greenhouse gases—the calculator can also be used to show the climate change impact of freeway expansion projects.

A Portland-calibrated version of the Induced Travel Calculator

After consulting with the authors of the California calculator, we calibrated the calculator for use in the Portland metropolitan area.  The key variables in the calculator include the number of interstate freeway lane miles in the urbanized portion of the metropolitan area, and the number of vehicle miles of travel on those roadways.  The literature on induced demand shows that vehicle travel exhibits a unit elasticity with respect to roadway capacity:  a one percent increase in road capacity tends to result in a one percent increase in vehicle miles traveled.

Using data from the US Department of Transportation (vehicle miles traveled) and from the Oregon and Washington departments of transportation (interstate freeway lane miles) inside urbanized areas, we created a Portland-metro specific version of the California calculator.

We computed the induced travel impact of two possible scenarios for the proposed I-5 Rose Quarter freeway widening project using the Portland-calibrated version of the calculator.  Option one was expanding the current 1.5 mile stretch of freeway from 4 lanes to 6 (which is how the Oregon Department of Transportation describes the project, although it misleadingly calls the added lanes “auxiliary” lanes).  Option two is expanding the same stretch of freeway to eight lanes, which is what would easily fit in the 126-foot-wide right of way that Oregon DOT proposes to construct.

The calculator suggests that the expansion to six lanes would add about 17.4 million vehicle miles per year to travel in the Portland metropolitan area, and that the expansion to eight lanes would add 34.8 million vehicle miles of travel.
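Mechanically, the calculator applies the unit elasticity described above: induced VMT equals existing freeway VMT multiplied by the percentage increase in lane miles, and emissions follow from the 0.466 tons-per-thousand-miles factor quoted earlier. A simplified sketch of that arithmetic (the region-wide inputs below are placeholders for illustration, not the actual Portland calibration values):

```python
def induced_annual_vmt(existing_lane_miles, added_lane_miles,
                       existing_annual_vmt, elasticity=1.0):
    """Induced VMT from added capacity, assuming unit elasticity:
    a 1 percent increase in lane miles yields a 1 percent increase in VMT."""
    return existing_annual_vmt * elasticity * (added_lane_miles / existing_lane_miles)

def annual_ghg_tons(annual_vmt, tons_per_thousand_miles=0.466):
    """Greenhouse gas emissions implied by added vehicle travel."""
    return annual_vmt / 1000 * tons_per_thousand_miles

# Placeholder region: 1,000 urbanized freeway lane miles carrying
# 5 billion VMT per year; widening a 1.5-mile segment by two lanes
# in each direction's worth of capacity adds 3 lane miles.
added_vmt = induced_annual_vmt(1_000, 3, 5_000_000_000)
print(added_vmt)                   # 15 million additional miles per year
print(annual_ghg_tons(added_vmt))  # roughly 7 thousand tons of GHG per year
```

The real calculator substitutes the FHWA lane-mile and VMT figures for the metro area; the structure of the computation is the same.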

These additional vehicle miles of travel have many negative effects.  As the “fundamental law” suggests, they’ll entirely offset the congestion reduction benefits of freeway widening.  In addition, there will be an increase in air pollution proportionate to the increase in vehicle travel.  Additional VMT translates directly into increased greenhouse gas emissions:  at nearly a half ton of greenhouse gases per thousand miles, the widened freeway can be expected to increase greenhouse gas pollution in Portland by between 7.7 and 15.5 thousand tons per year.

The Volker/Lee/Handy model represents the latest independent, state-of-the-art method for estimating the greenhouse gases associated with freeway expansion projects.  It, and not the self-serving and incorrect estimates generated by the Oregon Department of Transportation, should be used to define the environmental impacts of this project.

References

Jamey M. B. Volker, Amy E. Lee, and Susan Handy, “Induced Vehicle Travel in the Environmental Review Process,” Transportation Research Record, Vol. 2674, No. 7 (July 2020), pp. 468-479.

The author wants to thank Dr. Jamey Volker for reviewing the methodology and data used in the Portland version of the calculator.  Any errors are solely the responsibility of City Observatory.

 

Congestion Pricing: ODOT is disobeying an order from Governor Brown

More than a year ago, Oregon Governor Kate Brown directed ODOT to “include a full review of congestion pricing” before deciding whether or not to do a full environmental impact statement for the proposed I-5 Rose Quarter freeway widening project.

ODOT simply ignored the Governor’s request, and instead is delaying its congestion pricing efforts, and proceeding full speed ahead with the Rose Quarter with no Environmental Impact Statement that would include pricing.

ODOT has produced no analysis of the effects of pricing as part of its Rose Quarter environmental review, and has said “congestion pricing was not considered.”

Congestion pricing could dramatically reduce congestion at the Rose Quarter according to ODOT’s own studies (which are not included in the project’s Environmental Assessment).  Pricing is exactly the kind of effective and also reasonably foreseeable alternative that the National Environmental Policy Act (NEPA) requires be considered.  ODOT has both disobeyed the Governor and violated NEPA.

A little background.  In 2017, the Oregon Legislature passed HB 2017, transportation finance legislation that raised the state gas tax and vehicle licensing fees, authorized several freeway widening projects, and directed the Oregon Department of Transportation to implement congestion pricing on Portland area freeways. Pricing the I-5 freeway, rather than expanding it, could reduce or eliminate traffic congestion faster, and at far lower cost. Under the National Environmental Policy Act, that’s exactly the kind of alternative that ODOT and the Federal Highway Administration are required to evaluate and discuss in the environmental review of a project.  And ODOT’s own studies have shown that pricing the I-5 freeway would dramatically reduce traffic congestion.  But there’s simply no mention of congestion pricing in the Rose Quarter freeway widening Environmental Assessment.

Governor Brown:  “Include a full review of congestion pricing.”

In December 2019, Oregon Governor Kate Brown instructed the Oregon Department of Transportation to include a review of congestion pricing in its decision on how to proceed with an environmental review of the proposed $800 million I-5 Rose Quarter freeway widening project. In her December 16, 2019 letter to the Oregon Transportation Commission, Governor Brown asked for a “full review of congestion pricing, and how its implementation would impact the Rose Quarter,” before the OTC made a decision on the environmental review path.

The environmental review path, in this case, consisted of a decision as to whether to move forward with a full Environmental Impact Statement, one which included a full and complete assessment of the effects of road pricing.  Despite the Governor’s explicit instruction to undertake a “full review of congestion pricing,” ODOT simply ignored this instruction, and said it would not undertake an Environmental Impact Statement at all.

ODOT:  We’re not going to look at congestion pricing in the Rose Quarter environmental review

When ODOT and the Federal Highway Administration released their FONSI—Finding of No Significant Environmental Impact—they simply ignored the Governor’s instruction, and claimed that they weren’t required by federal law to consider tolling (untrue), and that they would look at the effects of tolling later—only after they move forward with the Rose Quarter project.

On October 30, 2020, ten months after the Governor’s letter, having neither published nor provided any additional information about the impacts of congestion pricing, ODOT and its federal partners adopted a “Finding of No Significant Impact” (FONSI).  In the FONSI, ODOT made it clear that it would not look at the impacts of tolling on the Rose Quarter, saying that congestion pricing would be the subject of “further study” with analysis “expected by the end of 2022.”

Tolling: Tolling (also referred to as congestion pricing or value pricing) on I-5 was not considered to be reasonably foreseeable at the time the Environmental Assessment was being prepared because tolling on I-5 was not included in the financially constrained project list in the 2014 Regional Transportation Plan (RTP), nor is it currently included in the financially constrained project list in the 2018 RTP. Congestion pricing on I-5 is currently (as of October 2020) being studied by ODOT, consistent with Legislative direction to the OTC in House Bill 2017 “to pursue and implement tolling on I-5 and I-205 in the Portland metropolitan region to help manage traffic congestion.” During the 2018 ODOT Value Pricing Feasibility Analysis, the I-5 corridor segment between SW Multnomah and N Going was identified for further study. Managing traffic congestion and mobility through tolling on this I-5 segment could provide one of the largest benefits to the most regional travelers and the state-wide economy. Further, additional traffic and mobility analysis will be initiated that will help identify where tolling would begin and end on I-5 and the type of tolling to be utilized; this planning work and technical analysis is expected to be completed by the end of 2022. The results of this analysis will inform the starting timeframe and alternatives for a formal environmental review process.

[Emphasis added.]

In short, instead of including congestion pricing in the Rose Quarter environmental review, ODOT simply announced that it would proceed with the Rose Quarter as is, and address road pricing only after the Rose Quarter project moves forward.

ODOT is delaying action on congestion pricing until after Rose Quarter starts construction

ODOT is dragging its feet on a 2017 legislative mandate to implement congestion pricing. And contrary to the claims made in the FONSI, ODOT has no plans to even finish planning for congestion pricing before 2023.  Just weeks after issuing the FONSI, on December 10, 2020, ODOT Director Kris Strickler and ODOT Manager Brendan Finn presented a schedule to the Oregon Legislature showing that the congestion pricing planning phase would continue until the end of 2023.  The schedule also shows that the agency plans to commence construction on the I-5 Rose Quarter project a year before it even completes planning for congestion pricing in the Portland area.

Under this schedule, Rose Quarter construction (the diagonally shaded section) would start as early as 2022, while the planning for congestion pricing would not be complete until 2024.  There’s also an ambiguous “design/build, test and implement” phase that lasts until 2027.

Congestion pricing is highly foreseeable:  It’s mandated by law

Notice that ODOT’s explanation simply ignores the Governor’s explicit instruction, and instead asserts that, due to federal regulations, ODOT need not address pricing because somehow it was not “reasonably foreseeable.”  That, of course, is nonsense:  congestion pricing has been mandated by state law since 2017, well before the completion of ODOT’s Environmental Assessment.  It’s simply false to claim that it isn’t foreseeable. Whether or not a project is listed in the Regional Transportation Plan does not determine whether it is “reasonably foreseeable.”  The legal standard under NEPA is much broader, as the Environmental Protection Agency says:

The critical question is “What future actions are reasonably foreseeable?”. Court decisions on this topic have generally concluded that reasonably foreseeable future actions need to be considered even if they are not specific proposals. The criterion for excluding future actions is whether they are “speculative.” The NEPA document should include discussion of future actions to be taken by the action agency.

ODOT has been directed by law to adopt congestion pricing; it is not in any sense speculative, and it is plainly a “future action to be taken by the action agency” that needs to be addressed in the environmental review, whether or not it’s part of the Regional Transportation Plan.

ODOT’s other studies show pricing would reduce congestion at the Rose Quarter

The studies undertaken by the Oregon Department of Transportation conclude that congestion pricing could measurably reduce traffic congestion on I-5. The analysis concludes that pricing would reduce congestion and improve travel time reliability on I-5.  It would save travel time for trucks and buses.  It enables higher speeds and greater throughput on the freeway, because it eliminates the hyper-congestion that occurs when roads are unpriced. Here’s an excerpt from page 17 of the report.  We highlighted in bold the most salient bits of the analysis:

Overall, Concept 2 – Priced Roadway, will reduce congestion for all travelers on the priced facility. This will produce overall improvement in travel time reliability and efficiency for all users of I-5 and I-205.  [Concept 2 is] Likely to provide the highest level of congestion relief of the initial pricing concepts examined. [It] Controls demand on all lanes and, therefore, allows the highest level of traffic management to maintain both relatively high speeds and relatively high throughput on both I-5 and I-205. Vehicles 10,000 pounds and more (such as many freight trucks and transit vehicles) would benefit from travel time improvements on the managed facilities.  Pricing recovers lost functional capacity due to hyper-congestion, providing greater carrying volume with pricing than without. This means that diversion impacts may be minimal, but still warrant consideration and study.

This concept is relatively inexpensive to implement, and significantly less expensive than concepts that include substantial physical improvements to the pavement and bridge infrastructure.

Oregon Department of Transportation (2018). Portland Metro Area Value Pricing Feasibility Analysis: Final Round 1 Concept Evaluation and Recommendations, Technical Memorandum #3. [Emphasis added.]

Why parking should pay its way instead of getting a free ride

Hartford Connecticut considers a pioneering move to make parking pay its way

A higher parking tax works much like a “lite” version of land value taxation (LVT)

Surface parking lots are highly subsidized polluters

As Donald Shoup lays out in exhaustive detail in his 733-page masterpiece, The High Cost of Free Parking, the subsidies we provide for car storage have shredded the fabric of America’s urban areas.  By giving over so much land to cars, we weaken and undermine the things that make cities work well:  the opportunities for easy interaction. There’s evidence that the effects of parking are causal:  from 1960 onward, an increase in parking provision from 0.1 to 0.5 parking spaces per person was associated with an increase in automobile mode share of roughly 30 percentage points, according to a study of nine cities.  We have too much parking for many reasons:  we’ve subsidized highway construction and suburban homes, we’ve mandated parking for most new residential and commercial buildings, and we’ve decimated transit systems. But a key contributor to overparking is the strong financial incentives built into tax systems.

Parking is subsidized by our current tax system

In effect, our system of local property taxation plays a key role in subsidizing parking and car use.  In nearly all US cities, the property tax is assessed equally on the value of both land and improvements, so if one improves a piece of property (by constructing or enlarging a building), the owner’s property taxes go up.  The contrary is also true:  if a property is unimproved, or just covered in gravel or asphalt, the owner typically pays lower taxes based only on the value of the bare land.  In cities with high vacancy rates, the property tax actually rewards landowners who demolish buildings. The perverse incentives created by raising taxes on those who improve their land with active uses like offices, stores and homes led Henry George in the 19th century to propose a “single tax” on land, what is now generally called the “land value tax” (LVT).

The land value tax fixes the anti-development incentives built into the property tax:  Constructing a new building doesn’t cause the owner’s taxes to rise.  And those who own valuable property can’t avoid or minimize taxes by leaving it fallow; if a downtown block is zoned for office use, for example, it pays high taxes even if it’s a vacant lot.  But despite its appeal, there’s been little enthusiasm for land value taxes in the US; only one large city, Pittsburgh, has seriously flirted with the idea, for a time taxing land more heavily than improvements.  Elsewhere, it’s been a non-starter.
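The incentive difference is easy to see in a toy comparison: under a conventional property tax, improving a parcel multiplies the owner’s bill, while under an LVT two parcels of equal land value owe the same tax regardless of what sits on them. (The parcel values and the 1 percent rate below are hypothetical, chosen only for illustration.)

```python
def property_tax(land_value, improvement_value, rate=0.01):
    """Conventional property tax: land and improvements taxed alike."""
    return rate * (land_value + improvement_value)

def land_value_tax(land_value, improvement_value, rate=0.01):
    """Land value tax: improvements are untaxed."""
    return rate * land_value

# Hypothetical downtown parcels on land of identical value:
# a surface parking lot vs. an office building.
lot = dict(land_value=1_000_000, improvement_value=0)
office = dict(land_value=1_000_000, improvement_value=9_000_000)

print(property_tax(**lot), property_tax(**office))      # 10000.0 100000.0
print(land_value_tax(**lot), land_value_tax(**office))  # 10000.0 10000.0
```

Under the property tax, putting up the office multiplies the tax bill tenfold; under the LVT, the bill is unchanged, so holding land as a parking lot no longer earns a tax discount.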

Higher fees for parking as a “lite” land value tax

The City Council of Hartford Connecticut is considering an expanded fee on private commercial parking lots and structures that mimics some of the important features of a land value tax:  Call it LVT-lite.  In Hartford, as in many US cities, much of the downtown area is given over to car parking, and surface parking lots pay lower rates than those lots with improvements.  The low rate of taxation on parking lots lowers the holding costs for landowners, and makes parking a more profitable use than developing these lots for other more intensive uses. One way to change that dynamic is to raise taxes or fees on parking, which is exactly what the city’s proposed ordinance would do.

As University of Connecticut engineering professor Norman Garrick has shown, Hartford’s downtown has been hollowed out by the construction of parking.  As Garrick explains:

Since 1960, the number of parking spaces in downtown Hartford increased by more than 300 percent — from 15,000 to 46,000 spaces. This change has had a profound and devastating effect on the structure and function of the city (see accompanying maps) as one historic building after another was demolished.

Not surprisingly, the proliferation of freeways and surface parking lots (shown in red) has coincided with a dramatic decline in the city’s population.

The Hartford ordinance would establish a sliding scale of fees for parking lots and structures based on the number of parking spaces.  The fee would start in 2022, and be phased in over a period of years.  When fully implemented, for large parking lots, the incremental fee works out to $125 per parking space every two years (the fee is biennial), which is about 25 cents per working day per parking space.  Hartford’s proposed policy would put in place incentives to better use urban space, and to discourage excessive car travel.  As the Parking Reform Network‘s Tony Jordan told us:

. . . parking stall fees are good policy because they would contribute simultaneously to several important policy objectives. Parking stall fees, particularly surface stalls, will encourage better uses of urban space, which I think is a big consideration in Hartford. Per stall fees internalize more of the costs of someone’s decision to drive and raise revenue that can and should be used to encourage and subsidize other modes. The environmental and traffic benefits from mode shift are obvious.
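For readers checking the arithmetic: at full phase-in the marginal fee is $125 per stall assessed every two years, which spread over roughly 250 working days a year comes out to 25 cents per stall per working day:

```python
fee_per_stall = 125          # dollars per stall, biennial, at full phase-in
years_per_billing = 2        # the fee is assessed every two years
working_days_per_year = 250  # assumption: roughly 250 working days in a year

cost_per_working_day = fee_per_stall / (years_per_billing * working_days_per_year)
print(cost_per_working_day)  # 0.25
```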

Is taxing parking fair?  A parable of parking subsidies:  Stormwater

We’ll bet you didn’t know that Hartford had a multi-billion dollar subway system.

Superficially, it might seem like raising fees on parking is somehow “picking on” cars and car ownership.  There should be a rational basis for any tax, and when it comes to parking there are very good reasons to raise fees and taxes to offset the car subsidies built into our current systems of public finance.

It’s worth stepping back a bit and considering just how much subsidy is extended to parking, and to car travel in general.  Some of the biggest subsidies are not on anyone’s radar.  Take for example the cost of dealing with stormwater.  Hartford, like many US cities, has an antiquated system of combined storm and sanitary sewers; when there’s a heavy rainfall, water from streets, roofs and—wait for it—parking lots, flows into storm sewers, overwhelms the sanitary sewer system, and produces combined discharges of untreated sewage and stormwater into, in Hartford’s case, the Connecticut River.  The city is under a court order to fix the system, and is in the process of spending $2.5 billion to solve the problem.  The solution includes building a four-mile long 18-foot diameter tunnel—essentially a subway for stormwater.  (That’s not hyperbole; single-track subway tunnels, like San Francisco’s central subway are 20 feet in diameter).

Hartford’s 4-mile cross-town subway—for stormwater

To pay for the project, Hartford and surrounding cities are charging their residents a “Clean Water Project Charge” on their water bills.  Local residents actually pay more for the stormwater system, per gallon, than for their domestic water:  $4.05 per hundred cubic feet for the domestic water they consume, plus a charge of $4.10 per hundred cubic feet to pay for stormwater.  But keep in mind that the stormwater doesn’t result from domestic water use—it results from runoff from roofs, roads and parking lots.

While some cities charge based on the amount of a user’s “impervious surfaces,” Hartford does not.  As a result, neither cars nor parking lots pay anything toward the stormwater problems they cause.  And it is these impervious surfaces, like parking lots, that contribute enormously to the quantity and (bad) quality of stormwater.  Since they are large and impervious, they create the huge peak flows that cause overflows. And it’s also the case that cars—via pollution from tires, leaking engine oil and gasoline, and brake linings—produce some of the most toxic elements in stormwater runoff.

Charging residential water users, but not car users and parking lots, is, in turn, an equity issue:  a Hartford resident who may not even own a car (per the US Census, roughly 23 percent of Hartford residents live in households that don’t own a car) has to pay for this problem through their water bill.  Meanwhile, a suburban commuter from outside the water district area would pay nothing toward Hartford’s stormwater system.  In short, there are good reasons of efficiency and fairness for asking parking lot owners to pay more toward dealing with the costs they impose.

For cities, imposing fees on parking makes fiscal and land use sense.  For too long, we’ve subsidized the assault on urban living by cars, and nothing has been more detrimental to cities than dedicating scarce and valuable urban land to car storage. Many of these subsidies are buried—literally and figuratively—in the way we pay for urban infrastructure, like stormwater runoff.  In Hartford, and many other cities, we have the perverse situation where carless households are taxed to clean up runoff from streets and parking lots, while road users pay nothing for the damages they cause. Ultimately, a full-fledged land value tax would help correct the perverse incentives in the current property tax, but until then, charging higher fees for parking lots is more efficient and fairer.

Editor’s note:  This post has been revised to correct a math error in the originally published version.  We originally reported the fee worked out to approximately 50 cents per day when fully implemented, but failed to note that the fee is biennial, rather than annual. The fee at the margin, when fully implemented in several years, will be 25 cents per working day.  The fee is phased in over a period of years.

A regional green new deal for Portland

by Garlynn Woodsong

Editor’s note: City Observatory is pleased to publish this commentary by Garlynn Woodsong. Garlynn is the Managing Director of the planning consultancy Woodsong Associates, and has more than 20 years of experience in regional planning, urban analytics and real estate development. Instrumental in the development and deployment of the RapidFire and UrbanFootprint urban/regional scenario planning decision-support tools while with Calthorpe Associates in Berkeley, CA, his focus is on making the connections between planning, greenhouse gas emission reductions, public health, and inclusive economic development. For more information, or to contact Garlynn, visit this website.

  • The Portland region should rethink its transportation vision and explicitly pursue only projects that reduce greenhouse gas emissions.
  • The climate emergency demands a new regional transport authority, with a clear and objective climate test.
  • This should be paired with a new regional infrastructure bank featuring a revolving loan fund for both transportation and housing projects, to provide pathways to opportunity for historically oppressed populations.

With the failure of Metro’s 2020 transportation measure at the ballot, it has become clear that the Portland region needs to take a clear, hard look at its transportation system, and how it can manage it to reduce GHG emissions, build out incomplete transit, walking and bicycling infrastructure networks, ensure that critical regional infrastructure maintenance needs are fulfilled, and address historic inequities.

The solutions that could work for Portland could certainly point the way towards models replicable elsewhere; many other regions are experiencing the same pressures and needs as Portland and Oregon.

Need for Visionary Leadership

First off, what each place needs to implement these solutions is visionary leadership: leaders who have the vision to see the enormity of the challenges we face, and to respond appropriately.

These challenges include those related to climate change, such as the need to reduce GHG emissions, to sequester carbon, and to engage in managed retreat from areas at risk of sea level rise or wildfire burn; those related to inequality, such as historic racial and ongoing economic segregation; those related to the economy, such as the impact of COVID-19 on our local businesses, homelessness, and the housing crisis; and those related to sustainability, including the need for municipal fiscal solvency, and access to clean air, water, and energy.

 

These challenges must be clearly stated as problem statements for strategic planning events where political leaders and senior staff from local jurisdictions and partner agencies are empowered to develop consensus around strategies to achieve solutions as quickly as possible, given the urgency of the overlapping crises we face. These strategies should guide legislative agendas, work programs, and all other related work, so there is no question as to strategic direction or how day-to-day work fits into the puzzle of implementing change for goal achievement.

(Garlynn Woodsong)

The vision that is needed to meet the moment and guide the path forward towards a metropolitan region in 2050 where our goals have been achieved must involve three basic elements:

  • A well-managed, adequately funded multi-modal transportation system that produces less than 10% of the emissions of our current system by right-sizing roadway capacity to match the lower total VMT that will be needed to attain goals, allowing for greater mobility through zero-emission transit, complete and safe bicycle and pedestrian networks, and the deployment of hi-tech mobility management solutions including on-demand transportation and ride-sharing tools
  • An equitable land use system that reduces demand for transportation system elements and energy in ways that are conducive to meeting our climate goals for 2050, by providing plentiful housing and economic opportunities for people of all backgrounds to live in 15-minute neighborhoods.
  • A sense of urgency that powers the quick implementation of temporary solutions that can evolve over time in response to lessons learned through deployment, and is facilitated by visionary leadership engaging in enlightened decision-making to deploy strategies and tools quickly that are supportive of every aspect of the vision, such as a carbon tax with an equity kicker that can serve to provide income to populations in need while taxing activities that produce carbon emissions at levels sufficient to bring down those emissions.

We need leaders who can quickly pivot to articulating and implementing this vision, and abandon the efforts that so many current leaders engage in of defending and perpetuating status quo programs that are not designed to deliver components of the vision, and indeed suck resources away from other efforts that could more quickly implement aspects of the vision. We must stop efforts to defend business as usual for government programs, such as highway expansion, that are antithetical to achieving this vision.

Declare a Regional Climate Emergency

One part of the solution is to declare a climate emergency, and, using emergency powers, to re-consider all committed TIP and RTP projects, ranking each one by its estimated effectiveness. Any project that results in a net potential increase in GHG emissions must be eliminated from further consideration for funding. Of those that remain, those that should be prioritized for funding include those that will provide the most benefit to historically underserved communities, those that change the design of facilities to ensure safer outcomes, and those that will produce the greatest decrease in GHG emissions, including the build-out of historically under-built modal networks for bicycles, pedestrians, and zero-GHG-emission transit.

The project assessments that are needed to rank projects by their potential effectiveness at reducing GHG emissions and attaining other elements of the vision must be independent of agencies that have a history of pushing a single agenda, such as state transportation departments that evolved from origins as highway departments without ever shedding a single-minded focus on highways as the only solution worth the lion’s share of transportation expenditures. Project assessment must be science-based, evidence-based, and outcomes-based, and not based on results from travel models originally built for the purpose of sizing freeways to accommodate growing automobile traffic. Empirical evidence must replace model results as the gold standard, meaning that places that already embody the principles and elements of the vision should be studied and replicated.

The vision of places that achieve our goals for climate action and for equitable development is of places that are walkable, with most regular destinations accessible by foot within 10 to 15 minutes, and transit available as a pedestrian trip extending tool that allows destinations further away to be accessed. We can achieve this vision by building more such neighborhoods, by building more housing within the 15-minute neighborhoods we already have, and by ensuring that housing is provided that is affordable for people of all incomes, not just those that can be provided for by the profit-dependent sectors of the market. We can cost-effectively achieve no part of this vision by continuing to spend hundreds of millions of dollars widening and expanding our already-over-built highway system.

Regional Transportation Authority

The Portland metro region, like any metropolitan region, contains a network of arterial streets and railroads that are not currently being managed for their highest and best use: there are many ways to potentially measure this, including the maximum amount of movement of people and goods, or the greatest amount of economic activity, or the most general benefit for the most people, for the lowest amount of carbon emissions. However you measure it, the current system is weighted too heavily towards providing room for vehicles that emit large amounts of carbon emissions to transport relatively few people or goods per square foot per hour.

We can look to Translink in Vancouver, B.C., and to the Bay Area Toll Authority (BATA) and the Golden Gate Bridge, Highway, and Transportation District in the San Francisco region, for models that show the way towards raising revenue from transportation system operations to pay for the operations, maintenance, expansion, and system completion of bridges, arterial transportation system elements, transit, and regional bicycle and pedestrian facilities including on and off street trails and facilities.

A proposal that is suggested by these examples would be to create and/or vest regional government with the authority to price and manage the regional arterial road, bridge, and trail system, as well as the regional heavy rail system, so that the right of way and facilities could be upgraded using funds from congestion pricing, tolls, and other sources to provide for a complete regional transit and bicycle/pedestrian system, while reducing the automobile capacity of facilities in line with anticipated VMT reduction goals, under a policy framework that places the highest priority on reducing carbon emissions.

The simplest path forward in the Portland region would be to create a new authority, managed by Metro, and given functions now split between ODOT, TriMet, Metro, and the transportation departments/bureaus of the cities and counties within the region. These functions would include those related to operations, management and pricing of the regional arterial road, transit, bicycle, and pedestrian systems.

Regional Infrastructure Bank

To ensure that transport projects play their part in reducing GHG emissions without causing harm to existing communities, a new regional transportation authority should be paired with a complementary new regional infrastructure bank that is empowered with seed funding for a revolving loan fund for building both new links in the transportation system and new homes nearby, to provide pathways to opportunity for historically oppressed populations to build equity and avoid displacement.

Projects needed to achieve regional goals that would be eligible for funding by a new regional Bank could include housing, transportation, or other eligible projects, such as providing seed funding for Local Improvement Districts and Tax Increment Financing efforts that otherwise would face multiple gap years following their initiation until their tax base began producing expected returns, or seed money for a regional revolving housing loan fund accessible by community land trusts, housing cooperatives, and non-profit housing developers.

Urgent Times Require Urgent Action

Finally, Portland’s Rose Lanes project points the way towards a new way of thinking about quick-build-and-deploy projects that can deliver the most benefit as quickly as possible for the least amount of up-front capital.

(NACTO)

Could regional Bus Rapid Transit (BRT) systems be quickly deployed using existing rolling stock (at least to start off with), paint, signs, and temporary platforms, so as we all come out of COVID we emerge into a re-structured world that is hyper-focused on reducing GHG emissions as a part of a comprehensive community-building program?

Could regional protected bikeway networks similarly be completed as a part of the same or similar initiatives?

Could regional commuter rail systems be deployed on existing tracks to quickly add center-to-center zero-emission express transit connectivity, such as downtown Portland to downtown Vancouver, WA, much sooner than would otherwise be possible if dependent on traditional 8-12 year capital project delivery schedules?

A Regional Green New Deal

We need to act like our house is on fire, and this thinking needs to permeate every aspect of our collective decision-making processes. We need a Green New Deal locally, regionally, and nationally. Let’s work together to articulate what that means, and how to achieve it. If capital funds are holding us back from achieving that vision, then let’s put forth a funding proposal, such as an equitable carbon tax, that truly matches the moment.

In Portland and Oregon, much of our current success rests on the shoulders of decisions made by visionary leaders of generations past, including the vision to abandon freeway building in favor of light rail, to institute and enforce metropolitan Urban Growth Boundaries, and to embrace downtown revitalization rather than continued suburban sprawl. Portlanders are lucky to have had such visionary past leaders. It is time for the leaders of the current moment, in the Portland region and everywhere else too, to be visionary in articulating solutions to today’s problems, to build on the vision of leaders from years past in ways that advance the goals of protecting farm and forest land, reduce GHG emissions sufficiently to meet our 2050 targets, and build more equitable communities that offer housing opportunities affordable for and economic opportunities available to all residents.

Portland carbon tax should apply to all big polluters

By all means, Portland should adopt its proposed healthy climate fee, a $25 per ton carbon tax

But make sure it applies to the biggest and fastest growing sources of greenhouse gases in the region

The healthy climate fee should apply to freeways and air travel, not just the 30 firms that produce 5 percent of regional GHG emissions.

The City of Portland has proposed a new “healthy climate fee” on a range of businesses and organizations that it says discharge more than 150 tons of greenhouse gases in the city.  Like most economists, we’re all on board with the idea of pricing carbon.  The reason we have a climate crisis is that we allow everyone to use the common atmosphere as a dump for carbon without compensating society for the damage done.  Economists are unanimous that pricing carbon is essential to avoiding climate catastrophe:  it simultaneously discourages bad behavior, rewards low polluting activities, creates incentives for cleaner investments and spurs the innovation needed to make all this happen.  And, as we’ve pointed out, the price needed to trigger these changes is modest—less, on a per pound basis, than the fee we charge for grocery bags, the deposits charged on soda cans, or indeed the charge for disposing of solid waste.

The only objection we have to this proposed fee is that it doesn’t go nearly far enough.  It exempts more than 95 percent of all the region’s carbon pollution.  The proposed climate fee is targeted at those who own or operate facilities that generate 2,500 tons of carbon dioxide or more per year.

The city’s inventory lists 35 such facilities, including steel mills, bakeries, oil storage depots and other manufacturing facilities, but also universities, hospitals, and a wastewater treatment plant.  The city has used air pollution permit data to estimate the tax liability of the firms it thinks will be subject to the tax.  Collectively, it expects them to pay “healthy climate fees” of about $9.2 million annually on 370,000 tons of carbon emissions.

That seems like an impressive number, but it amounts to only about 5 percent of the city’s greenhouse gas emissions.  By its own reckoning, the City of Portland (Multnomah County) produces more than 7.7 million tons of greenhouse gases per year.  So this measure taxes a small but visible fraction of the area’s total emissions.  What it leaves out is revealing.
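The arithmetic behind that share is easy to check; a quick sketch using the figures above, not official inventory data:

```python
# Rough check: what share of local emissions does the proposed fee cover?
# All quantities are metric tons of CO2e per year, taken from the text above.
taxed_emissions = 370_000      # emissions at the ~35 facilities subject to the fee
total_emissions = 7_700_000    # Multnomah County total, per the city's inventory

share = taxed_emissions / total_emissions
print(f"Share of county emissions covered by the fee: {share:.1%}")  # just under 5 percent
```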

As we’ve pointed out repeatedly at City Observatory, Portland is failing in its effort to meet its climate goals because of the big increase in carbon emissions associated with transportation, particularly from increased driving.  Transportation accounts for 41 percent of county greenhouse gas emissions, and unlike emissions from residential, commercial and industrial uses, which are all significantly lower than in 1990, transportation emissions are increasing.  In essence, the key reason the city is failing to achieve the goals laid out in its 2015 climate plan is growing transportation emissions. The bulk of the increase is due to increased driving since 2014, when gas prices declined, so this is actually an area where some economic incentives might do some good.

If the region is going to take climate change seriously, it should focus not just on a handful of institutions that account for less than a twentieth of greenhouse gases, but on the biggest and fastest growing part of the climate problem.  If we think about transportation as a system, it’s clear that the region’s transportation facilities, its freeways and airports, are a major source of greenhouse gas emissions.  If we’re going to charge such a fee at all, we ought to include in our list of large “facilities” two that are even bigger sources of greenhouse gases: the city’s interstate freeways (owned and operated by the Oregon Department of Transportation) and Portland International Airport (operated by the Port of Portland).

Carbon emissions from ODOT Interstate freeways in Portland:  550,000 tons per year

So how much greenhouse gas pollution is produced by ODOT’s interstate freeways in the City of Portland?  We don’t have exact data, but we can triangulate a reasonably good estimate.  According to the Federal Highway Administration, on an annual (pre-COVID) basis, there are about 2.9 billion vehicle miles of travel on Interstate freeways in the Oregon portion of the Portland metropolitan area.  We apportion this traffic in proportion to lane miles of interstate freeway in each county; Multnomah County includes about 53 percent of the tri-county area’s inventory of interstate lane miles, according to ODOT.  We further assume that 80 percent of Multnomah County mileage is in Portland.  This means that driving on Portland’s interstate freeways amounts to about 1.2 billion miles per year, which, at a very conservative passenger car average of 0.445 kilograms of carbon per mile, produces about 550,000 tons of carbon emissions per year.
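That triangulation can be sketched in a few lines; the traffic shares and the emissions factor are the estimates described above, not official ODOT figures:

```python
# Estimate carbon emissions from driving on Portland's interstate freeways,
# following the apportionment described in the text.
regional_interstate_vmt = 2.9e9    # annual VMT, Oregon side of the metro (FHWA)
multnomah_lane_mile_share = 0.53   # Multnomah County share of interstate lane miles
portland_share_of_county = 0.80    # assumed share of county mileage inside Portland
kg_co2_per_mile = 0.445            # conservative passenger-car emissions factor

portland_vmt = (regional_interstate_vmt * multnomah_lane_mile_share
                * portland_share_of_county)
tons_co2 = portland_vmt * kg_co2_per_mile / 1000   # kilograms -> metric tons
fee = tons_co2 * 25                                # at $25 per ton

print(f"Portland interstate VMT: {portland_vmt / 1e9:.2f} billion miles/year")
print(f"Emissions: {tons_co2:,.0f} tons/year")     # roughly 550,000 tons
print(f"Healthy climate fee at $25/ton: ${fee / 1e6:.1f} million/year")
```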

Freeways are fossil fuel infrastructure. These ODOT facilities produce more carbon emissions than all the industries taxed under Portland’s proposed “Healthy Climate Fee.”

If ODOT’s interstate freeways were treated as a “facility” and paid the $25 per ton fee for the emissions from the operation of the facility, it would pay the City of Portland about $14 million per year.

Carbon emissions from Portland International Airport:  1.5 million tons per year

Air travel is a major contributor to global greenhouse gas emissions, and Portland International Airport (aka PDX), the region’s principal air terminal, is the facility in Portland most closely associated with those emissions. In 2019, according to Federal Aviation Administration data, nearly 20 million passengers flying out of Portland International Airport logged about 11 billion “revenue passenger miles” of travel.  Air travel produces about 88 grams of greenhouse gases per revenue passenger kilometer, which means that PDX air travel generated about 1.5 million tons of emissions per year (88 grams × 11 billion miles × 1.6 kilometers per mile ÷ 1 million grams per ton).  This calculation suggests that PDX is responsible for about three-fourths of one percent of all US aviation greenhouse gas emissions, which total about 200 million tons per year, a figure roughly consistent with the region’s share of US economic activity.
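Spelling out that parenthetical arithmetic (the 88-gram figure is an average for air travel generally, and miles are converted to kilometers at the rounded factor of 1.6 used above):

```python
# Estimate emissions attributable to PDX air travel, per the calculation above.
revenue_passenger_miles = 11e9   # PDX passengers' travel in 2019, per FAA data
km_per_mile = 1.6                # rounded conversion, as in the text
g_co2_per_rpk = 88               # grams of GHGs per revenue passenger kilometer

tons_co2 = revenue_passenger_miles * km_per_mile * g_co2_per_rpk / 1e6  # g -> tons
fee = tons_co2 * 25              # at $25 per ton

print(f"PDX air travel emissions: {tons_co2 / 1e6:.2f} million tons/year")
print(f"Healthy climate fee at $25/ton: ${fee / 1e6:.1f} million/year")
```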

Air travel is a large source of global greenhouse gases.

If Portland International Airport  were treated as a “facility” and paid the $25 per ton fee for the emissions from the operation of the facility, it would pay the City of Portland about $38 million per year.

Parenthetically:  It has to be said that the Port of Portland has a profound blind-spot when it comes to its greenhouse gas footprint.  According to the Port’s environmental report, it only counts emissions from its own vehicles, utility plant and purchased electricity, and not from the travel to and from the airport.  The port owns up to just 50,000 tons per year in greenhouse gas emissions, just 3 percent of the amount attributable to air travel.  Neither does the port count the emissions from the cars that drive to park in its garages.  That’s rather like a gun manufacturer counting as “gun deaths” only those people who were pistol-whipped and excluding those killed by bullets.

Let’s have a fair and inclusive Healthy Climate Fee

Together, these two facilities (ODOT’s interstate freeways and PDX) account for more than five times as much carbon pollution as the 35 facilities inventoried by the City of Portland and included in its proposed ordinance.  (We believe the city’s own inventory of Portland greenhouse gas emissions significantly undercounts greenhouse gases associated with air travel in and out of PDX.)

It’s clear that the city believes it has the authority to impose this fee on local, state and federal governments.  It’s proposing to charge the city-owned wastewater plant more than $1 million annually, and also charge state entities (Oregon Health & Science University, the Port of Portland and Portland State University), and even the federal government (the Veterans Administration, which operates a large hospital).  The Port of Portland and the Oregon Department of Transportation both own and operate facilities that emit more than 150 tons of greenhouse gases per year.  They ought to be subject to the healthy climate fee, too.

In our view, it makes perfect sense for the city to implement a carbon tax or “healthy climate fee.”  But if it does, it should do so in a way that applies it to most or all carbon pollution, not just a twentieth of all emissions.  The city’s approach to a carbon tax is indefensibly narrow.  It gives a pass to the biggest and fastest growing source of carbon pollution in Portland:  transportation emissions.  It’s a kind of “carrotism,” the notion that climate change can be dealt with without inconveniencing anyone, when in fact it is all of our daily decisions, especially how much to drive and fly, that are principally responsible for greenhouse gases.  It’s politically convenient to tag a few large, visible polluters, but collectively they produce less pollution than the two entities we’ve identified here, and they’re not the areas where we’re failing to make progress.

A carbon tax makes a huge amount of sense, but exempting 95 percent of all the emissions in the region from the tax, and loading its cost on just a few entities is just arbitrary.  If we’re really in a climate emergency, we should apply the carbon fee broadly.

Notes:

  1.  An empty soda can weighs 17 grams; several states impose a 5 cent deposit per can; that works out to a deposit of about $2,900 per ton of can weight.  Metro will charge you about $100 a ton for the non-recyclable garbage you dispose of, about four times what Portland would charge you if you put that same amount of carbon in the atmosphere.
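The implied per-ton prices in note 1 work out as follows (a sketch of the comparison, using the figures in the note):

```python
# Implied price per ton of the 5-cent can deposit, versus garbage and carbon fees.
deposit = 0.05                 # dollars per can
can_weight_tons = 17 / 1e6     # 17 grams, expressed in metric tons

deposit_per_ton = deposit / can_weight_tons
print(f"Can deposit: ${deposit_per_ton:,.0f} per ton")   # about $2,900
print("Metro garbage fee: $100 per ton")
print("Proposed carbon fee: $25 per ton")
```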
  2. Carrotism is a term coined by economist Giulio Mattioli; it is the idea that “climate change policy should consist entirely of enticing incentives (carrots), avoiding any restriction, regulation or even monetary disincentives (sticks); i.e., we will just glide smoothly into zero-carbon without anyone being inconvenienced ever.”

Building more housing lowers rents for everyone

A new study from Germany shows that added housing supply lowers rents across the board

A 1 percent increase in housing is associated with a 0.4 to 0.7 percent decrease in rents

Housing policy debates are tortured by the widespread disbelief that supply and demand operate in the market for housing. In our view, it’s been a growing demand for cities and urban living, running headlong into a relatively fixed, or at best slowly growing, supply of urban housing that’s been the principal reason for affordability problems in many cities.  But many housing advocates refuse to believe that increasing housing supply will have any beneficial effect on rents.

A new study by Andreas Mense, an economist at the University of Erlangen-Nuremberg, uses detailed data on housing construction and rents in Germany to document the direct and widespread effects of new market rate construction on rent levels. The paper uses variations in the completion rates of new housing units over time to tease out the effects of increments to supply on rent levels.  Here’s a typical chart showing how rent increases vary in response to additional housing completions in the month of December (the red line on the chart).  In the wake of completions (the period to the right of the red line) rent changes are negative.  The core finding is that a one percent increase in housing completions tends to be associated with a 0.4 percent to 0.7 percent decrease in rents.
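To see what an elasticity in that range implies, consider a hypothetical city; the 5 percent supply figure below is illustrative, not from the paper:

```python
# Apply Mense's estimated range: a 1 percent increase in housing completions
# is associated with a 0.4 to 0.7 percent decrease in rents.
def predicted_rent_change(completions_increase_pct, elasticity):
    """Percent change in rents for a given percent increase in completions,
    under a constant-elasticity approximation."""
    return -elasticity * completions_increase_pct

# Hypothetical: a city boosts completions 5 percent above trend.
for elasticity in (0.4, 0.7):
    change = predicted_rent_change(5.0, elasticity)
    print(f"elasticity {elasticity}: rents change {change:.1f}%")
```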

 

 

Importantly, Mense’s work shows that added supply influences rents across the entire housing market.  At best, market-skeptics will aver that new market rate units will reduce rents at the top-end of the market, but can’t conceive of how that will affect lower rent units. But Mense’s work shows that it isn’t just high end rents that decline.

To the contrary, they suggest that new housing supply shifts the rent distribution as a whole. When considering the point estimates, it seems that the lower parts of the rent distribution reacted more strongly in the first months after the new units came on the market, while the upper part reacted more strongly several months later. Overall, none of the two main forces — substitutability of housing units, and moving costs —, seems to dominate. The key implication is that new housing supply provided by private developers effectively lowers rents throughout the rent distribution, shortly after the new units are completed.

(emphasis added)

One of the most helpful aspects of Mense’s paper is that it has quantitative estimates of how much additional housing a city might need to build to stave off rent increases. For example, he estimates that Munich would need to increase the number of new units built over the past seven years by about 20 percent above the actually completed levels in order to hold rent increases to zero.  Berlin, where markets are tighter and rent increases greater, would need an even bigger increase in production; most German cities would need to produce about 10-20 percent more new housing units than they actually built to hold rent inflation in check.  (Note that the required increase is not a 10-20 percent increase in total housing stock, but a 10-20 percent increase in the number of new units built compared to actual new construction).

One of the most common objections to “supply side” approaches to promoting affordability is that somehow building market rate housing only affects the price of higher priced housing.  This paper adds to a growing body of evidence that new market rate construction triggers a chain-reaction of moves and price adjustments that rapidly propagate through an entire housing market and ultimately benefit low income households. Building new housing sets off a chain of moves—the kind of musical chairs progression modeled by the Upjohn Institute’s Evan Mast—that yields increases in the supply of existing units at various prices elsewhere in the region.  The vacancy of these units not only creates additional housing opportunities at lower price points, but puts downward pressure on rents.

Housing costs of the population as a whole can be reduced effectively by letting developers provide enough market-rate housing. Consequently, denser development has great potential to reduce the housing cost burden of low-income households—in addition to other possible benefits such as shorter commuting distances and larger productivity spillovers.

This is an important finding for housing policy.  Too much of the housing debate is a kind of myopic particularism, looking at whether a single housing unit is affordable, with no attention given to affordability across the market spectrum.  Too often, policies are obsessed with highly visible but microscopically small interventions, which may provide affordability for a few lucky tenants, but which do little or nothing to lower costs and increase affordability across the entire housing market. The truly pernicious part of some strategies, like inclusionary housing requirements, is that the negative effects of such requirements on new housing supply (which operate across an entire market) outweigh the benefits in terms of a few dedicated affordable units. Ultimately, housing affordability is about scale, and increasing housing supply is a necessary precondition to making sure affordability is achieved for many, rather than just a few.

Mense, Andreas (2020). The Impact of New Housing Supply on the Distribution of Rents, Beiträge zur Jahrestagung des Vereins für Socialpolitik 2020: Gender Economics, ZBW – Leibniz Information Centre for Economics, Kiel, Hamburg

The only reason some people drive is because we pay them to

Here’s an insight from tolling:  A substantial portion of the people driving on our roadways are only there because we’re subsidizing the cost of their trip.

When we charge a toll to use a road, suddenly many of those using it find they don’t value it enough to pay even a fraction of the road’s cost.

Our most egregious subsidies go to those drivers who demand to travel at the peak hour, when roadway space is scarcest and most expensive to provide.

The biggest problem with transportation is that we don’t price road use correctly, to reflect back to road users the costs their decisions impose on society and everyone else traveling.  In essence, we have daily urban traffic jams for the same reason Ben & Jerry’s has long lines on its annual Free Cone Day: when you don’t price something valuable, it gets rationed by queuing and patience.  What’s worse, many of the people driving on our roads at rush hour are there essentially because we are paying them to do so.

Freeways are only “free” in the sense that, no matter when you drive, you pay the same marginal price:  Nothing.  Of course,  we all pay for roads through things like the gas tax and vehicle registration fees (as well as a host of other general taxes), but the point is, whether you use a road at the peak hour or not has no effect on how much you pay to drive.  Whether you use it at 2 am when there are literally no other cars on the road, or you use it when it’s jammed at rush hour, makes no difference.  And because peak hour capacity is in short supply, and is damned expensive to increase, “free” roads are a massive subsidy from the general public to the relatively few of us who use roads at the peak.

When we ask road users to pay even a modest fraction of the cost of providing that expensive peak hour road capacity, many of them tell us, by voting with their feet (or their tires), that they simply don’t value the roadway enough to pay for even a tiny fraction of its cost.

The latest example of this comes from Seattle, where Washington State has spent about $3.3 billion tearing down the old elevated Alaskan Way highway viaduct that marred the city’s waterfront for decades, and replacing it with an expensive new tunnel bored under downtown.  After the tunnel was completed, it was “free” for a while, but a year ago, the state asked users to pay tolls, which vary from $1.25 to $2.25 per trip, plus a $2 surcharge if you don’t have a transponder.

Now keep in mind that tolling will cover less than 10 percent of the cost of the tunnel, about $200 million of the more than $3 billion cost. When the tunnel was first opened, it carried more than 75,000 cars per day.

The Alaskan Way Viaduct (Wikimedia Commons)

As soon as the state asked those using the tunnel to make a modest contribution to its cost, that number dropped to 55,000.  Here’s the official analysis from WSDOT:

Of the 20,000 trips that diverted from the tunnel after tolling began, data indicates that roughly 10,000 trips used other routes (primarily Alaskan Way, Elliott Avenue, and I-5) or switched from driving to riding transit. Ferries and water taxi ridership were largely unchanged. The remaining 10,000 trips are not accounted for using existing roadway sensors and automated passenger counters on buses, however, anecdotal data suggests that some trips were discontinued as people embraced teleworking, bicycling and other active forms of transportation, or by using an unmonitored route.

What this really means is that more than a quarter of users (20,000 of 75,000) used the tunnel only because somebody else paid for more than ninety percent of the cost of their trip.  And contrary to the usual doomsday scenarios of highway agencies, most of that traffic isn’t displaced onto city streets or alternate routes; it simply disappears.  Again, the evidence from Seattle’s tunnel has been that neither abrupt changes in capacity nor tolling produced noticeable worsening of traffic elsewhere in the city.

Here’s another example, also taken from the State of Washington, although it spills over—as we’ll see—into the state of Oregon.  To set the scene: Vancouver, Washington, sits just north of the Columbia River, which forms the border between the two states, and is part of the Portland metropolitan area. Washington has a sales tax; Oregon doesn’t.  There are just two bridges—I-5 and I-205—connecting the 400,000 Vancouver-area residents with jobs and stores in Oregon.  Washington residents who shop in Oregon avoid state and local sales taxes of more than 8 percent, meaning a $200 shopping trip saves a resident more than $16 in sales taxes.  Washington is, in effect, paying its residents to drive to Oregon to shop.

Washington residents fill the parking lots at Oregon stores to avoid their state’s eight percent sales tax; they’re essentially paid to drive to Oregon to shop.

We’ve estimated that the average Vancouver household saves more than $1,000 in sales taxes per year by shopping in Oregon and that shopping trips account for between 10 percent and 20 percent of the traffic on the two Interstate bridges.  Not incidentally, this shopping traffic is a key reason why the two states are considering spending more than $3 billion to widen the I-5 bridge.
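For the arithmetically inclined, the subsidy is easy to sketch. The figures below are the ones cited above; the 8 percent rate is the approximate combined state and local sales tax, and the implied annual purchase volume is a simple inference from the $1,000-a-year savings estimate:

```python
# Back-of-the-envelope sketch of the cross-border shopping subsidy.
SALES_TAX_RATE = 0.08  # approximate combined WA state and local rate

def per_trip_savings(purchase):
    """Sales tax avoided by buying in Oregon rather than Washington."""
    return purchase * SALES_TAX_RATE

trip = per_trip_savings(200)   # a $200 shopping trip avoids about $16 in tax

# Saving $1,000 a year at 8 percent implies roughly $12,500 in annual
# Oregon purchases per Vancouver household.
annual_purchases = 1000 / SALES_TAX_RATE
```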

Want less traffic and pollution? Stop paying people to drive

This has to be the key insight for transportation policy:  The reason we have traffic congestion is essentially because we’re paying people to drive.  As soon as we ask drivers to pay even a fraction of the cost of the roads they’re using, large numbers of them find other routes, or simply take fewer trips.  In a world where we’re losing the fight against climate change principally because we’re driving more and more each year, a logical first step is to stop paying people to drive their cars.  That’s exactly what policies like road pricing, and parking pricing do:  they ask those who benefit from using their cars to pay for a larger fraction of the cost of providing the services they enjoy.

One of the fascinating, and least widely understood aspects of the science of traffic jams is that it takes only a small reduction in traffic levels to keep roads flowing freely.  As long as traffic levels stay below a tipping point where the road becomes saturated and loses capacity, the road works well. Adding (or subtracting) just a few cars when traffic is at this tipping point makes a huge difference both in travel times and how many cars a road can carry. This is just what happened during the height of the pandemic, when reduced demand kept Portland-area freeways below their tipping point, and enabled them to carry more cars at the rush hour than they did in “normal” times. Pricing a roadway causes a minority of travelers to change their trips, but it’s enough to keep the roadway flowing freely. A common argument against pricing is that some people simply don’t have alternatives to driving, or driving at a particular time; and that’s true.  But the data show that a significant minority of those on the road will immediately change their behavior given even a small financial incentive—and that segment of the population staying away is enough to make the road system work much, much better for everyone else who doesn’t have that flexibility.
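The tipping-point dynamic can be illustrated with the classic Greenshields speed-density model, a textbook simplification (the speeds and densities below are illustrative, not figures from any Portland study): flow rises with the number of cars on the road only up to a saturation point, after which adding cars actually reduces throughput.

```python
def greenshields_flow(density, free_flow_speed=60.0, jam_density=200.0):
    """Vehicles per hour past a point in the classic Greenshields model:
    speed falls linearly from free-flow speed (mph) to zero at jam
    density (vehicles per lane-mile), so flow = density * speed first
    rises, then collapses as the road saturates."""
    speed = free_flow_speed * (1 - density / jam_density)
    return density * speed

# Throughput peaks at half the jam density...
peak = greenshields_flow(100)            # 3,000 vehicles per hour
# ...and adding cars BEYOND that point moves fewer vehicles:
oversaturated = greenshields_flow(140)   # ~2,520 vehicles per hour
# So pricing off a small share of peak demand raises throughput
# for everyone who remains on the road.
```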

The old saying “you get what you pay for” applies with a vengeance to transportation:  We pay people to drive, and so they do, with the result that we get chronic congestion, air pollution, and aggravated climate change.  A sensible policy of charging road users based on when and where they travel would both dramatically reduce traffic congestion—by encouraging those who are only on the road because someone else is paying for the trip to choose another time or destination—and also reduce pollution and greenhouse gases.  Those who paid would be getting what the British call “value for money”: their tolls would buy them a quicker and more predictable trip than is possible at any price now.

The truth about Oregon DOT’s Rose Quarter MegaFreeway

The Oregon Department of Transportation (ODOT) desperately wants to build a mega-freeway through NE Portland, and is planning to double the freeway from 4 lanes to 8 or 10 lanes.  But it has hidden its true objective, claiming only to add two “auxiliary” lanes to the existing 4 lane freeway, and arguing (falsely) that these lanes won’t increase traffic.  But when you look closely at the sketchy information revealed in its Environmental Assessment, it’s actually building a 160 foot right of way, easily wide enough to accommodate 8 or 10 full traffic lanes. Recent modifications to the I-5 project design on a viaduct section of the project, where ODOT will re-stripe the freeway rather than widen the viaduct to add a lane, show that ODOT can easily squeeze 8 or even 10 lanes into the project.  That’s important because the project’s environmental assessment has neither considered the traffic and pollution from an eight lane freeway, nor explained why the right of way needs to be vastly wider than necessary to accommodate the supposed 6 lane configuration the agency says it is building.

  • The Rose Quarter freeway widening is now narrower in one section, which shows ODOT has wide discretion to re-stripe lanes and shoulders to add capacity
  • To avoid intruding on the Eastbank Esplanade, ODOT has dropped its plans to widen the viaduct overhanging the pathway, but will squeeze in another lane of traffic by re-striping the existing 83′ wide viaduct.
  • At the project’s key pinch point, the Weidler overpass, ODOT is engineering a crazy-wide 160 foot roadway ostensibly for just six travel lanes.  But this roadway is more than enough to fit 8 or even 10 lanes of traffic, just by striping it as they are now planning to stripe the viaduct section of the project.
  • ODOT’s environmental analysis has violated NEPA by failing to consider this “reasonably foreseeable” eventuality that ODOT will re-stripe the project to 8 or more lanes, which would produce even more traffic, air pollution and greenhouse gases, impacts that are required to be disclosed.
  • ODOT has also violated NEPA by failing to consider a narrower right of way: 96 feet would be sufficient to accommodate its added “auxiliary lanes” and would have fewer environmental impacts and lower costs.
  • ODOT’s safety analysis for its narrowing of the viaduct section shows that narrower lanes and shoulders make almost no difference to the crash rate, and further shows that the project’s claims that it would reduce crashes on I-5 by 50 percent are exaggerated by a factor of at least seven. The analysis also shows that the project has a safety benefit-cost ratio of about 1 to 200, meaning it costs ODOT $2 for 1 cent of traffic crash reductions, about 2,000 times less cost-effective than typical safety projects.
  • The safety analysis also confirms that ODOT made no allowance for the effects of induced demand:  It makes clear that the project assumed traffic levels would be exactly the same in 2045 regardless of whether the freeway was expanded or not.

In a seemingly innocuous appendix to its Finding of No Significant Impact (FONSI), ODOT has tipped its hand:  It’s really building the right of way for an 8- or 10-lane freeway at the Rose Quarter.  The reason this project’s budget has ballooned from $450 million in 2017 to $1.45 billion today is because they’re not simply adding a single so-called “auxiliary” lane, but instead are doubling the freeway’s footprint as it cuts through North Portland.

ODOT narrows part of its proposed I-5 freeway to protect a park

ODOT had a problem with the Eastbank Esplanade, a linear parkway that runs between the Willamette River and I-5.  The original proposal for the freeway widening project called for this section of freeway, which rises on a viaduct, to be widened into the park.  Unlike other requirements of NEPA, which are essentially procedural and only require disclosure of impacts, provisions pertaining to parks have real teeth.  Expanding highways onto public park lands has to comply with tougher requirements under what is called Section 4(f).

ODOT initially tried to conceal the intrusion of the widened freeway viaduct onto the Esplanade.  The agency didn’t publish detailed design drawings as part of the Environmental Assessment.  It produced these preliminary engineering plans only after threat of legal action (and after denying that they even existed). The plans became available only days before the deadline for public comment on the project EA. Even then, it took sleuthing by project opponents to discover and document the widened viaduct.

A widened viaduct would have extended over the Eastbank Esplanade. Illustrations produced by Cupola Media from data suppressed by ODOT.  (Bike Portland)

Recognizing that they were unlikely to meet the 4(f) requirements, or would be dramatically delayed by trying, ODOT project engineers decided to simply lop off the viaduct-widening portion of the project.  Instead of widening the viaduct section to 105 feet, ODOT will simply re-stripe the existing 83 foot wide viaduct with narrower lanes and narrower shoulders. (Lane width calculations are shown in the Appendix, below.) And apparently, they decided this very recently:  The safety analysis memorandum for the new, narrower viaduct section is dated October 6, 2020 and was last revised on October 23, 2020, only a week before the FONSI decision (dated October 30, 2020).  (This document now appears in the Rose Quarter Library here.)

Because it is not widening this 1,200 foot stretch of the I-5 freeway, but wants to add another travel lane to I-5, ODOT has now decided that it can have both narrower shoulders and narrower vehicle travel lanes on this stretch of the freeway. Here is its description of current conditions (the No Build-83′), the revised plan (“Scenario 1”-also 83′), and the widened viaduct originally proposed in the EA (“Scenario 2”-105 feet).

This decision and its supporting analysis proves two key points:

First, that ODOT can and will re-stripe existing roadways to provide more capacity, and there’s nothing about its regulations, engineering practice or concerns about safety that precludes it from doing so.  The memorandum essentially argues that there’s only a minor increase in crashes as a result of the lane and shoulder narrowing, and that this is acceptable.  It also notes that a “design exception” will be required, but that’s in no way a barrier to undertaking this action.  (In essence, if it’s necessary to avoid an environmental impact, ODOT believes it has complete flexibility to make an exception to its design standards to narrow both lanes and shoulders.)

Second, the safety analysis presented here undercuts all of the overblown claims that widening I-5 is a “safety project”—time and again, ODOT officials have cited the inadequate width of the freeway and its narrow shoulders as safety problems.  All this was a big lie, as we and other independent observers have documented.  And ODOT has even conceded that we were right in our critique.  But this new analysis shows that the crash reduction from wider roads is trivial.  The analysis estimates the value of crash savings from the project at about $400,000 per year. (Never mind, for a moment, that these estimates are almost certainly wrong because they are computed using an ISATe model that explicitly says it cannot be used to compute crash rates for freeways with ramp metering, which this section of I-5 has.)  Even relying on ODOT’s own calculations, the safety benefits of this project are negligible, so much so that ODOT has no trouble narrowing both lanes and shoulders in a significant section of the project.

Why is this 6 lane freeway 126 feet wide? ODOT’s real plan for an 8-lane freeway

Nearly two years ago, we called out ODOT for failing to disclose that what it was proposing to build at the Rose Quarter was really a massive, eight-lane freeway. It was engineering a broad right-of-way that, with a few gallons of paint, could easily be striped to provide 8 full travel lanes, four in each direction, essentially doubling the capacity of the roadway through the Rose Quarter.

Forget for a moment the labels ODOT attached to lanes, and look simply at the actual width of the highway right of way they are proposing to build. Take out your tape measure, and follow along.  From curb to curb at the NE Weidler overpass, they’re planning to widen the freeway’s footprint from 83 feet to 126 feet. It’s actually a challenge to figure out how wide the freeway is and how much wider it’s going to be, because ODOT consciously chose to exclude that data from most of the project’s Environmental Assessment.

Currently, most of the corridor is 83 feet wide.  This image, taken from Google Maps, shows a cross-section of the existing I-5 freeway just south of the key Weidler overpass. As the Google Maps scale indicates, this cross-section is approximately 83 feet wide.

I-5 at NE Weidler Overpass. Width 83 feet (Google Maps)

ODOT has not released detailed cross-sectional plans of what it proposes to build.  The project’s Right of Way technical report has this crude illustration of existing and proposed widths. Note that in the “Existing conditions” section, ODOT has left out the width of inside and outside shoulders (“varies”), which obscures the exact width of the overall cross section, which we know from Google Maps to be 83 feet at the Weidler overpass.

The proposed width of the I-5 freeway can be calculated (roughly) from the ODOT diagram.  If we add together the travel lanes, the shoulders and an allowance for the width of the median barrier, we get roughly 126 feet—six 12-foot travel lanes, four 12-foot shoulders and an additional six feet for the width of the barrier (72+48+6=126).  (See Appendix below for additional detail).
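The width arithmetic can be checked with simple addition. The sketch below uses the lane and shoulder figures from ODOT’s diagram, plus a hypothetical eight-lane restripe using the viaduct-style shoulder dimensions discussed later in this post (the exact mix of 11- and 12-foot lanes is our illustrative assumption):

```python
# Proposed Rose Quarter cross-section, per the EA diagram (feet):
# six 12-foot travel lanes, four 12-foot shoulders, 6-foot median barrier.
six_lane = 6 * 12 + 4 * 12 + 6   # 72 + 48 + 6 = 126 feet

# Hypothetical restripe of the same 126 feet using viaduct-section
# dimensions: 6.5 ft outside shoulders, 4 ft inside shoulders, the
# median barrier, and eight lanes (an illustrative mix of 11 and 12 ft).
eight_lane = 2 * 6.5 + 2 * 4 + 6 + 4 * 12 + 4 * 11
# eight_lane comes to 119 feet, fitting in 126 with about 7 feet to spare.
```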

Why, might you ask, has ODOT selected 12-foot inside and outside shoulders?  No explanation is provided for this exact choice in the EA.  (ODOT asserts that wider shoulders are needed for “safety”—more about that claim below—but that doesn’t explain the choice of 12 feet.)  In fact, 12 feet is vastly wider than any shoulder width on an urban stretch of I-5 anywhere in the City of Portland.  Typical shoulder widths are 4-8 feet.

The real reason for these excessively wide shoulders is to allow ODOT to re-stripe this section of freeway at some future date to allow eight lanes of traffic.  Here’s how.

A 126 foot right of way can easily accommodate 8 lanes of traffic.  Here we’ve juxtaposed ODOT’s illustration of the proposed freeway cross-section with six traffic lanes from the EA, with a revised version that adopts the shoulder widths ODOT has adopted for the revised viaduct section.

These shoulder and lane width measurements for this potential future re-striping are taken directly from ODOT’s own memorandum describing its revised design for the viaduct section of the I-5 freeway.  We’ve simply applied these same measurements—6.5 foot outside shoulders, 4 foot inside shoulders and a mix of 11 and 12 foot travel lanes—to this 126 feet of right of way.  We even have about 8 feet of roadway left over.  The point is that ODOT has proposed to construct a roadway that could easily be reconfigured to carry eight or ten lanes of traffic with a few gallons of paint.

Re-striping for 8 or 10 lanes is a reasonably foreseeable event

Not considering the possibility that a wider I-5 could be re-striped for 8 or 10 lanes of traffic is a violation of the National Environmental Policy Act’s requirement that the agency disclose and evaluate the cumulative impacts of reasonably foreseeable future events.

ODOT’s analysis of the environmental effects of the Rose Quarter project is based on the premise that there are only six lanes of freeway capacity through the Broadway-Weidler section of I-5, and that widening the road from four to six lanes in this area will have no effect on freeway traffic volumes. (This flies in the face of what we know about induced demand.)

But more to the point:  the EA failed completely to consider the traffic and environmental effects of an eight- or ten-lane freeway.  And because the project is constructing a 160-foot right of way, it has more than enough room to increase the freeway to 8 or 10 full travel lanes with even wider shoulders than it is proposing on the re-striped viaduct section.  This is important because, having built such a wide right-of-way, it is in fact “reasonably foreseeable” that in the future, ODOT will fire up its paint trucks and re-stripe this section of the freeway—just as it has effectively done with the revised plan for the viaduct section.

It doesn’t matter whether ODOT has “no current plans” to widen the freeway to 8 or 10 travel lanes or not.  The agency’s intent is not the legal standard. The standard instead is whether it’s reasonably foreseeable that an agency might do so.  Given that they designed a right-of-way easily wide enough to handle eight or ten lanes; that the agency’s own analysis shows neither lane widths nor shoulder widths make much difference to safety; that the agency is willing to re-stripe an existing section of the same roadway to accommodate more traffic; and that the sponsoring agency, FHWA, regards such restriping projects as a “best practice”—even holding up ODOT’s work as a national example—it’s quite clear that one can reasonably foresee that, once this project is built at its much larger scale, with wider shoulders than found anywhere else on the urban sections of I-5, the agency may in the future re-stripe it to eight or ten through travel lanes.

What NEPA then requires is that ODOT evaluate the likely environmental impact of an eight- or ten-lane freeway in this area—something it simply hasn’t done.  If it analyzed the impact of an 8- or 10-lane roadway here, it would get much higher traffic counts, many more vehicle miles traveled, more air pollution and greenhouse gases.  These are exactly the kinds of impacts that ODOT is required to reveal as the likely “cumulative impact” of their chosen project—and they have not done so.  This omission alone should force the agency to conduct a full Environmental Impact Statement.

Why build a 160 foot freeway?  Why not much narrower and less disruptive?

ODOT has also violated NEPA by failing to consider a narrower right of way through the project area.  What remains completely unexplained and unexplored in the EA is why, assuming it needed only six lanes, the agency elected to build a 160 foot wide right of way.  This will be extremely expensive, both for additional land acquisition and construction costs.  If it were truly interested in only a six-lane cross-section, the right of way, using ODOT’s own standards (from the viaduct section), would only need to be 96 feet wide (see Appendix for calculation).  That smaller cross-section would be cheaper to build and would have fewer environmental impacts, both because it would be less disruptive to existing adjacent uses like Harriet Tubman Middle School, and because it would foreclose the future risk of the freeway being widened to eight lanes.  ODOT’s own engineering consultants, ARUP, have said that the project could be at least 40 feet narrower than what ODOT is proposing if the need is for a total of six travel lanes.  Again, the purpose of NEPA is to force agencies to consider a range of options, and to present objective information on the relative environmental consequences of each. By not looking at a 96-foot (or similarly narrower) right of way for the widened freeway, ODOT has violated NEPA.

In addition, the much wider freeway right of way makes it much more difficult and expensive to construct freeway covers that would support buildings, as some in the community are calling for.  The reason is that a 160 foot freeway footprint would require spanning an 80 foot clear area (assuming that supports are built on the sides and in the median of the freeway).  If the freeway were only wide enough for three lanes in each direction, plus the type of shoulders ODOT is now proposing for the viaduct, the clear span distance would be about 50 feet. A narrower footprint would also reduce the cost of the project: less excavation would be required, and overpasses and other structures would not have to be as large.
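The span arithmetic is straightforward, under the stated assumption that cover columns can be placed at the edges and in the median of the freeway:

```python
def clear_span(footprint_ft):
    """Longest distance a freeway cover must bridge when columns can be
    placed at the edges and in the median: half the paved footprint."""
    return footprint_ft / 2

wide_span = clear_span(160)   # 80 feet for the proposed footprint

# With three lanes per direction and viaduct-style shoulders, each
# direction needs roughly 6.5 + 4 + 3 * 12 = 46.5 feet, consistent
# with the roughly 50 foot span cited above.
narrow_half = 6.5 + 4 + 3 * 12
```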

Widening the I-5 Rose Quarter does virtually nothing for safety

In theory, one reason that ODOT could argue that they need a 160 foot right of way for the Rose Quarter is safety.  But this new safety memorandum disproves that point.  It argues that shoulder widths of as little as 4 feet and narrower travel lanes make almost no difference to the safety level of I-5.

ODOT’s consultants say they used the “Enhanced Interchange Safety Analysis Tool (ISATe)” model to estimate crash rates under the No-Build scenario, and for the original and modified build alternatives (i.e., with and without a widened viaduct section).  This analysis claims that narrowing the lanes on the viaduct section would actually increase crashes on that section of the highway, increasing annual damages by about 6 percent over No-Build levels.  Implicitly, ODOT is saying that when there are environmental concerns—in this case the intrusion of the freeway onto the Eastbank Esplanade—safety here is no big deal, and it’s perfectly justifiable to shrink the freeway shoulders and lane widths.

It’s also important to note, in passing, that the ISATe model is not valid for making estimates of crash rates or crash changes on I-5.  The model’s methodology explicitly states that it does not apply to freeways with ramp-metering.  Ramp meters smooth traffic flow and tend to dramatically reduce crash rates.  ODOT and its consultants erred in using this model to estimate and make claims about the I-5 project.

Overall, the safety memo concludes that the $1.45 billion project reduces the dollar value of crashes in the area by about 7.5 percent, from about $5.9 million per year to about $5.4 million per year.   Not only is this a very small (and speculative) reduction in the economic cost of crashes on I-5, the magnitude of these crash losses is tiny, especially relative to the cost of the project.  Total savings of the project with the modified design are about $432,000 per year, compared to a project cost of roughly $1.45 billion.  At a 5 percent discount rate, the $432,000 in annual crash savings has a present value of about $6.6 million, meaning that this project has a benefit-cost ratio of less than 1 to 200, i.e., you have to spend 2 dollars to get one cent in benefits. This is an extraordinarily poor return on investment.
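That present-value figure can be reproduced with the standard annuity formula. The 30-year horizon below is our assumption (the memo doesn’t state one), chosen because it roughly reproduces the $6.6 million figure:

```python
def present_value(annual_benefit, rate, years):
    """Present value of a level annual benefit stream (ordinary annuity)."""
    return annual_benefit * (1 - (1 + rate) ** -years) / rate

pv = present_value(432_000, 0.05, 30)   # roughly $6.6 million
cost = 1_450_000_000

# cost / pv comes out around 218: you spend about $2 for each
# cent of crash-reduction benefit, a ratio worse than 200 to 1.
ratio = cost / pv
```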

This analysis shows that ODOT made false claims about the safety effects of the project.  Project materials claim that the project will reduce crashes by up to 50 percent. The Project’s website claims that the project’s added lanes and wider shoulders will speed the flow of traffic and concludes:

Adding these upgrades is expected to reduce crashes up to 50 percent on I-5 . . .

ODOT has gotten this same claim repeatedly published in Willamette Week, The Columbian, and the Portland Tribune, and its officials have testified to the same effect before the Oregon Legislature.

These data show that the change in crashes will be less than 10 percent. This means that ODOT’s claims that the project will reduce crashes by as much as 50 percent overstate the reduction by a factor of seven (a 7.4 percent reduction is roughly one-seventh of a 50 percent reduction).

In addition, these crashes are overwhelmingly non-injury fender-bender crashes.  They’re not the crashes that regularly occur on ODOT highways in the Portland area that maim and kill hundreds of Oregonians each year.

Judged as a safety project, the Rose Quarter is an incredible waste of resources.  ODOT runs a competitive safety grant program “All Roads Transportation Safety (ARTS)“.  Typically the projects that are awarded funding have benefit cost ratios well above one.  The top nine Region 1 projects awarded in 2018 had benefit/cost ratios of more than 10 to 1, meaning they are about 2,000 times more cost effective than the Rose Quarter freeway widening as a “safety” project.

Proving ODOT failed to consider induced demand

The ODOT safety analysis sheds some light on an issue that ODOT has long concealed:  whether the project considered the effects of induced travel.  A robust and growing body of scientific literature has confirmed the “fundamental law of road congestion”: expanding urban roadways tends to encourage more driving.  The best estimates are that a one percent increase in freeway capacity leads to a one percent increase in vehicle miles traveled.  ODOT has made vague claims that its modeling somehow dealt with this issue, but the safety report shows that ODOT either completely discounted or ignored induced travel.  The report states that traffic levels on the viaduct section of the Rose Quarter project will be the same whether one lane is added or the freeway stays its current width.  In three separate locations, the memorandum states:

AADT [annual average daily traffic] is the same for all scenarios

Here’s a screenshot that explicates this assumption:

What the safety analysis is saying is that whether the freeway has five lanes or four in this cross-section it will carry exactly the same number of vehicles.  That’s contradicted by the induced travel science, which suggests that a 50 percent increase in freeway capacity (from two lanes to three) will likely lead to a 50 percent increase in vehicle travel.  This is important, because if traffic volume increases, that will more than offset the gains from the small reduction in the rate of crashes.  By ignoring induced travel, the safety analysis understates the number of crashes associated with the build alternative.
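The offsetting effect is easy to quantify. In the sketch below, total crashes scale with the crash rate times vehicle miles traveled; the 7.4 percent rate reduction is ODOT’s own figure from the safety memo, and the 50 percent VMT increase follows from applying a unit capacity elasticity to a two-to-three lane widening:

```python
# Total crashes scale with (crash rate per mile) x (vehicle miles traveled).
rate_reduction = 0.074   # ODOT's estimated drop in the crash rate
vmt_increase = 0.50      # unit elasticity applied to a 2-to-3 lane widening

relative_crashes = (1 - rate_reduction) * (1 + vmt_increase)
# relative_crashes is about 1.39: once induced travel is counted,
# total crashes RISE by roughly 39 percent instead of falling.
```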

Appendix:  Analysis of Freeway Width, I-5.

The following table shows the estimated width of the I-5 freeway, as currently built, and as proposed to be constructed as part of the I-5 Rose Quarter freeway widening project.  These data are estimated from the EA Right of Way Report, the Predictive Safety Analysis Memorandum and Google Maps as described in the text above.

While ODOT has done this analysis for the viaduct section, the EA contains no analysis of an alternative with a narrower cross section through the Broadway-Weidler section of the project.  If ODOT were to narrow this section of the project to 96 feet from 126 feet, it would reduce the impact on adjacent properties.

Phoenix: Climate Hypocrisy

You can’t be a climate mayor—and your city can’t be a climate city — if you’re widening freeways

Phoenix says it’s going to reduce greenhouse gases 90 percent by 2050, but the city’s transportation greenhouse gases have risen 1,000 pounds per person since 2014, and it’s planning to spend hundreds of millions widening freeways.

Around the country, and around the world, leaders are pledging to solve the climate problem—someday, in the distant future.  Typically, these pledges claim that a city (or other organization) will be “net zero” by some year ending in zero (2040, 2050), or that it will reduce its emissions (or usually, some carefully selected fraction of the greenhouse gas emissions it is responsible for), by a stated percentage. These multi-decade pledges aren’t really pledges that these leaders will be responsible for achieving:  it will be their successors, several steps removed, who will be in charge when the day of reckoning comes.

One of the world’s leading climate activists is calling “BS” on this phony and deceptive strategy.  Greta Thunberg is challenging leaders to  commit to change now, rather than waiting:

“When it’s about something that is in 10 years’ time, they are more than happy to vote for it because that doesn’t really impact them. But when it’s something that actually has an effect, right here right now, they don’t want to touch it. It really shows the hypocrisy.”

“They mean something symbolically, but if you look at what they actually include, or more importantly exclude, there are so many loopholes. We shouldn’t be focusing on dates 10, 20 or even 30 years in the future. If we don’t reduce our emissions now, then those distant targets won’t mean anything because our carbon budgets will be long gone.”

“. . . we can have as many conferences as we want, but it will just be negotiations, empty words, loopholes and greenwash.”

One doesn’t have to look far to find a city that is pledging to be much, much better (in a few decades), while its current efforts are perceptibly failing, and it’s actively spending money that will make the problem worse.  Today, we’ll pick on Phoenix, but much the same story could be told about many other cities.  Mayors are proclaiming loudly that there’s a climate emergency, and very visibly endorsing the Paris Accords, but at the same time are planning to put vast sums of scarce public resources into building more roads that will only make the problem worse. What makes this particularly egregious is that nearly everywhere, increased driving is now the single largest and fastest growing source of greenhouse gas emissions.  The one thing cities can do to fight climate change is reduce the need to travel by car; widening freeways does just the opposite:  it subsidizes driving, and promotes sprawl and car dependence.

Phoenix is developing a new climate action plan, because doing so is a requirement of joining the C40 Cities organization:

The city’s goal is to complete CAP updates by year’s end in part due to Phoenix having joined C40 Cities Climate Leadership Group earlier this year.

“One of those things that C40 cities require is completion … or an updated climate action plan by the end of this year,” Environment Program Coordinator Roseanne Albright . . .”

Like a lot of cities, it has goals of reducing climate pollution . . . someday.  Here’s the provisional goal according to the city’s website.

Headed in the wrong direction

All well and good to have a reduction goal for the next couple of decades.  But what climate data show is that Phoenix, like most cities, is utterly failing to make progress in reducing its greenhouse gas emissions.  Data from the national DARTE database of transportation greenhouse gas emissions show that metropolitan Phoenix is rapidly going in the wrong direction.  Its per capita GHG emissions, which had been flat to trending downward in the first half of the last decade (even as the economy was recovering from the Great Recession), grew rapidly after the big drop in gas prices in 2014.  Today, the average Phoenix resident emits about 1,000 pounds more greenhouse gases from transportation than in 2014.

 

Going faster—in the wrong direction:  A wider freeway

Even as they proclaim their climate goals, Phoenix is embarking on a massive $700 million freeway widening project, with the full support of its mayor.  The plan would widen an eleven-mile stretch of I-10 through Phoenix to as many as 16 lanes.

According to Planetizen, Phoenix’s Mayor is all on board:

Phoenix Mayor Kate Gallego is quoted in the article touting the safety benefits of the proposed project, along with its potential economic benefits. On that latter score, Mayor Gallego cites the potential for $658 million in new economic activity.

The project’s video notes that there could be as many as three pedestrian and bike overpasses, but makes it clear that these are currently only conjectural:  “multi-use bridges for pedestrians and bicyclists could span the freeway.”  Plus, as we’ve noted at City Observatory, this kind of “pedestrian” infrastructure is really primarily designed to serve cars and doesn’t contribute to a more walkable city.

Finally, much of the cost of the measure is being subsidized from a regional sales tax.  So in essence, the region’s residents will have to pay for the wider freeway whether they use it or not.  This amounts to a massive subsidy to more driving, and predictably will lead to even more sprawling development, longer commutes and more greenhouse gases.

In a way, it’s unfair to pick on Phoenix. (For the record, we’ve been unstinting at City Observatory in our critique of Portland’s failed climate efforts.)  Other cities around the country that ostensibly care about climate change or have endorsed the Paris accords are pursuing their own massive freeway widening projects, as James Brasuell has chronicled. The list includes projects in Houston, Los Angeles, Akron, Indiana, and Maryland.  As Brasuell says:

Some of the politicians and agencies behind these plans claim to be climate warriors without being held accountable to their promises. Others are climate arsonists, who are not being held accountable to the consequences of these actions.

Tragically, these efforts are even being promoted by the National Science Foundation, under whose auspices the Transportation Research Board published a report calling for billions more in subsidies for road construction to facilitate literally trillions more miles of driving, building a true highway to hell.

Why—and where—Metro’s $5 billion transportation bond measure failed

Portland voters resoundingly defeated a proposed multi-billion dollar payroll tax to pay for transportation projects

The two areas slated for the biggest benefits voted against the measure:  The Southwest Corridor and East Portland both opposed the measure

A generous electorate didn’t want to spend billions on transportation

A few months back, we laid out the case against a $5 billion transportation bond measure proposed by Portland’s regional government, Metro. In the November election, voters rejected the measure by a 58 percent to 42 percent margin.

It’s difficult to argue that the measure failed because of widespread anti-tax sentiment:  Voters in Portland and Multnomah County, in the center of the region, approved every single money measure on the ballot except for Metro’s transportation tax.  Portland voters approved a $400 million parks levy (64 percent yes), a $1.2 billion school bond (75 percent yes), and a $400 million library bond (60 percent yes). They voted to impose a new high earner tax to fund tuition-free preschool education.  These votes come on top of other measures they approved strongly in the May primary election, including a multi-billion dollar regional tax to fund homeless services and an extension of a 10 cent a gallon Portland city gas tax, which passed with a 77 to 23 percent margin of victory.

Opponents of the measure, chiefly the area’s business community, mounted a $2 million campaign that mostly emphasized a series of anti-tax messages. But many in the community shared our concerns that the measure did nothing to reduce greenhouse gases. More generally, it appears that the public wasn’t convinced of the benefits of this proposed spending package, which would have been the most expensive single such effort in the region’s history. You’d think that $5 billion worth of projects, chosen after a year-long involvement process, would produce strong support, especially among the likely beneficiaries of these projects. But the evidence suggests that the measure failed to generate much support in two sets of neighborhoods that were singled out for major expenditures, in Southwest Portland and in East Portland.

The pattern of support and opposition in Multnomah County

While there hasn’t been any post-election polling to lay out the reasons for the measure’s crushing defeat, there are some clear clues in the geography of election returns. In particular, election returns for Multnomah County, which are available at the precinct level, show the distinct geography of support for and opposition to the Metro measure.  (Precinct data for Clackamas and Washington counties weren’t available as of publication date.)

This map shows the overall pattern of votes in Multnomah County, with areas supporting the measure shaded green, those opposed shaded yellow, and those extremely opposed (fewer than 35% “yes” votes), shaded red.

Support came from a green core. Within the county, the precincts that voted in favor of the measure were a handful on Portland’s westside, in and near downtown, and a swath of close-in urban neighborhoods in North, Northeast and Southeast Portland.  Outside these areas, voters were opposed. The supportive neighborhoods tend to have the highest levels of transit ridership and bike commuting in the region, as well as being politically liberal. The pattern of voting in favor of the measure closely resembles the split between incumbent Mayor Ted Wheeler (who was re-elected) and his challenger, Sarah Iannarone; Iannarone, the progressive, did best in the same precincts that voted for the measure.

Source: Multnomah County Elections.

The map on the left shows the electoral split for the Metro bond, with the light colors representing “Yes” votes and the darker colors “No” votes.  The map on the right shows the plurality winner of the Portland mayor’s race, with moderate incumbent Ted Wheeler in light blue, and progressive challenger Sarah Iannarone in dark gray.  The Metro measure did best in those precincts most favorable to Iannarone.  More liberal-leaning neighborhoods supported the measure, but it wasn’t enough to produce a majority even in Multnomah County.

Neighborhoods that stood to get projects voted against the measure

In theory, at least two geographic constituencies should have been big beneficiaries of the measure.  The biggest single project in the package, the Southwest Light Rail, earmarked for nearly $1 billion of Metro money, would have built a new light rail line with stations in neighborhoods in Southwest Portland.  But apparently this area didn’t care:  All of the Multnomah County precincts through which the project would have run voted against the measure.

Another constituency that was supposed to benefit from the measure was residents of East Portland. Metro stressed that the measure was designed to improve equity by investing in transit, safety and pedestrian improvements, especially in low income neighborhoods like those along 82nd Avenue in East Portland.  This area, formerly part of unincorporated East Multnomah County, has some of the region’s weakest transportation infrastructure, with many unpaved streets and relatively few sidewalks. In recent decades, income levels in this area have slipped relative to the region, and advocates of the measure made a particularly strong point that spending in East Portland would address serious equity concerns. But the election returns show that there was no outpouring of support from these neighborhoods. East Portland is generally regarded as the area east of 82nd Avenue.  The map below zooms in on Portland’s east side, and shows a clear dividing line at 82nd Avenue (the red line on the map). Only two precincts east of 82nd Avenue voted in favor of the measure; nearly all were opposed. Many of the precincts in the area were strongly opposed, with fewer than 35 percent “Yes” votes. Claims that this measure would advance equity and improve safety generated no support in the very communities that would ostensibly benefit from it.

No support in the suburbs, either

The Metro electorate includes voters in three counties:  Multnomah (which includes nearly all of Portland) and two suburban counties, Clackamas and Washington.  The measure lost in Multnomah County by a 54/46 margin, and was defeated by an even larger margin in each of the two suburban counties. Neither of those counties had released precinct-level returns through the Secretary of State’s website as of publication, so we haven’t analyzed their geography here.

 

Limits of the “Christmas Tree” approach

From the outset, the Metro measure was fashioned as a kind of giant Christmas tree, with a raft of projects each designed to appeal to a specific constituency: a billion dollars for Tri-Met’s next light rail line, a freeway interchange for the Port of Portland’s airport, new highways in suburban Clackamas County, a big contribution to Multnomah County’s plan to replace the Burnside Bridge, sidewalk and safety projects to appeal to pedestrian advocates, and transit fare subsidies for students.  Seemingly everyone participating in Metro’s process got a little something and endorsed the package. Metro’s effort seemed dominated by an at times cynical political perspective. Early on, it fielded misleading push-polls falsely claiming that measures to reduce traffic congestion would reduce greenhouse gas emissions.

It was all well and good to come up with a list of projects, but Metro’s political calculus doomed this effort when it came to deciding how to pay for it. Supposedly because a gas tax didn’t poll well, Metro’s leaders tried to stick it to “someone else,” the someone else being the region’s larger employers. The actual funding mechanism, a selective payroll tax, was decided only at the last minute, with little public debate, and Metro chose to exclude itself and other local governments from paying the tax. In theory, a measure that only taxed big businesses, exempted politically popular small businesses, and was pitched as a tax that someone else would pay should poll well. But in the end, it generated considerable ire in the business community, which opposed it, and the “no” campaign’s messages clearly resonated with the voters.

In our view, it was unwise from a policy standpoint to pay for this package with a revenue source that has little or no relation to transportation. The payroll tax amounted to the equivalent of a 30 cent a gallon subsidy to gasoline purchases, compared to the more conventional way of paying for road improvements through fuel taxes. Ironically, as the 77 percent “yes” vote for extending the City of Portland’s 10 cent per gallon gas tax just six months ago demonstrates, voters and the business community had little problem, when asked, with paying for transportation in a direct and visible way. Moreover, Portland passed that gas tax with a margin of more than 100,000 votes, a bloc that would have dramatically bolstered the chances for regional success even if the suburbs were intransigent.
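To see roughly how a tax package translates into a per-gallon equivalent, here's an illustrative back-of-the-envelope sketch in Python. Every input (the collection horizon, the regional population, and per capita gasoline consumption) is an assumption chosen for illustration, not a figure from the measure itself; the point is the method, dividing annual revenue by annual gallons consumed.

```python
# Illustrative arithmetic for a per-gallon equivalence of a non-fuel tax.
# All inputs below are rough assumptions for the sketch, not figures
# taken from the Metro measure.

revenue_total = 5_000_000_000      # $5 billion package
collection_years = 20              # assumed collection horizon
population = 2_500_000             # rough Portland-area population
gallons_per_capita = 350           # assumed annual gasoline use per person

annual_revenue = revenue_total / collection_years
annual_gallons = population * gallons_per_capita

# Equivalent per-gallon rate if the same money were raised at the pump
equivalent_cents_per_gallon = 100 * annual_revenue / annual_gallons
print(f"~{equivalent_cents_per_gallon:.0f} cents per gallon")
```

Under these assumed inputs, the result lands in the neighborhood of the 30-cents-a-gallon figure cited above; different assumptions about the horizon or regional fuel consumption would shift it somewhat.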

For whatever reason, 2020 was a year when Portland area voters were willing to vote for big tax increases for a wide range of services, from dealing with homelessness, to parks, to libraries, to schools and pre-school education. The failure of the Metro measure in this environment shows considerable miscalculation, both in terms of what the measure proposed to fund, and how it proposed to pay for it.

 

 

 

Frog Ferry: The slow boat to nowhere

A proposed Portland area ferry makes no economic or transportation sense.

Why the Frog Ferry is a slow boat to nowhere

A ferry between Vancouver and Portland would take 20 minutes longer than existing bus service

From flying cars to underground tunnels to ferry boats, there’s always an appetite for a seemingly clever technical fix for urban transportation problems.  But in the end, transportation is about performance, cost and geometry, and nearly all of these inventive ideas fail dramatically on one, two or all three of these criteria.  When it comes to making urban transportation better, what is needed, as Jarrett Walker told a recent YIMBYtown audience, is less charisma and more scale.

The latest of these transport fantasies is a Portland proposal for a “Frog Ferry”—the cute name is a dead giveaway that this idea is long on hype and short on substance.  The pitch for the ferry goes like this:

Wouldn’t it be cool if, instead of taking a bus or a car, you could take a boat from Vancouver, Washington, or Lake Oswego, Oregon, to downtown Portland?  Wouldn’t it be great if it was faster than driving or taking a bus?

With a keen eye to PR, the proponents of the Frog Ferry have produced a glossy video implying the ferry would be faster than car or transit service.

Source: Frog Ferry video.

You can imagine it, for sure.  But it is imaginary.  It’s not possible for a regular ferry service to travel faster between Vancouver and Portland than a car, or even today’s bus service. In the real world, boats are slower than both cars and buses. The circuitous water route between Vancouver and Portland, the slow speeds of even “fast” ferries, the need to minimize damaging wakes at higher speeds, and the relative remoteness of docks from actual destinations all mean that ferries in Portland are an unwise, uneconomic folly.

The devilish details

What, exactly, is the Frog Ferry?  It is a proposal to establish water ferry service on the Columbia and Willamette Rivers, connecting downtown Portland with Vancouver, Washington, and with Lake Oswego and/or Oregon City.  The plan is to operate six boats (four carrying up to 100 passengers and two carrying sixty).  With the help of the Oregon Department of Transportation and the City of Portland, ferry proponents have gotten hundreds of thousands of dollars in public funds to undertake a series of feasibility studies. So far, they’ve mostly figured out that the project would need at least $40 million to pay for boats and docks, as well as continuing subsidies to support operating costs.

In recent days, the Frog Ferry proposal has seemed to be sinking fast.  Tri-Met, the region’s transit agency, charged with being fiscal agent for a grant from the Oregon Department of Transportation, disputed and refused to pay invoices from the Frog Ferry organization.  The Portland City Council declined to bail out the organization.  The ferry advocates are trying to revive the project, but should anyone care?  Let’s take a close look at what’s proposed and see if it makes any sense.

Frog Ferry:  The slow boat to nowhere

As the marketing materials make clear, a faster trip is a big part of the promise here.  But would a ferry, particularly between Vancouver and Portland, be very fast?  Downtown Portland and downtown Vancouver, Washington are only about seven miles apart as the crow flies, but not as the salmon swims.  To get from Vancouver to Portland by water, you have to go west on the Columbia River five miles to its confluence with the Willamette, and then make a U-turn and go southeast on the Willamette River almost ten more miles to get to downtown Portland.  The trip by water is twice as long as the straight-line distance, making for a lot of out-of-direction travel.

In New York City, ferry operating subsidies are costing the city $50 million a year.

There isn’t a lot of traffic on the river, but there are still real limits to how fast ferries can go. As a practical matter they’re generally limited to top speeds of 20 to 25 knots, and often go much slower.  A big problem with fast ferries is that they create a big wake, which can be disruptive to other water users, and damaging to sensitive riverbanks.  Limiting damaging wakes is a key reason that ferries travel as slowly as they do.  For example, in Vancouver, British Columbia, the SeaBus ferries that cross the wide Burrard Inlet travel at about 11-12 knots.  These catamarans are capable of higher speeds, but are forced to travel more slowly to avoid excess wake.  The Frog Ferry’s proponents assume their vessel will be able to cruise at 24 knots, but they’ve yet to address the damage that this high speed wake will do along the confines of the Columbia and Willamette Rivers.  In addition, the ferries would have to make a 130 degree turn around Kelley Point, which would be an amusement-park-ride experience at 24 knots.

Today’s buses are already faster

The ferry system is actually likely to be much slower than implied by the advocates’ estimates.  The 15-mile river trip from Vancouver, with a stop in St. Johns, would take around 45 minutes according to their own very optimistic estimates, as much as 10 minutes longer than current rush hour bus service.

That’s not surprising:  by road, the distance traveled is more than a third shorter (9.3 miles), and an off-peak car trip takes about 15 minutes according to Google.

The Frog Ferry marketing claims that the ferry would be faster, but it wouldn’t be.  Not only is it not faster than driving, it’s actually slower than today’s transit.  How long does it take to get by bus from Vancouver or Lake Oswego to Salmon Street Springs, the proposed downtown Portland terminus?  Let me Google that for you.

For bus service from Vancouver to Portland, it’s 34 to 38 minutes:

And the ferry is just the line-haul portion of the trip:  since almost no one lives or works right at these stops, riders will have to walk (or drive or take a bus) onward from the place the ferry docks to their destination.  The Vancouver, St. Johns and Lake Oswego sites are remote from housing, and aren’t currently served by transit.  It’s a half-mile, 10-minute walk up or down a steep hillside from downtown St. Johns to the riverside in Cathedral Park.  Downtown Portland’s Salmon Street Springs is a four-minute walk from the nearest light rail stop, and about seven minutes from Portland’s transit mall; and these times are for travel to Salmon Street Springs, not to someone’s work or shopping destination.  The C-Tran and Tri-Met buses that ply these routes actually stop at multiple locations, like downtown Portland’s transit mall, which are much closer to offices, stores and other destinations.  So if anything, these estimates understate the travel time advantage of today’s buses.

It’s almost certain that Frog Ferry advocates have overestimated how fast their ferry will be.  They told Metro to assume that the ferry had an average speed of 24 knots (27.6 miles per hour).  While high speed ferries can cruise that fast (or a few knots faster), during much of their travel cycle they’re going much slower:  maneuvering into and away from docks, accelerating or slowing, or restricted by traffic or low-wake zones.  As a result, average speeds are far less than cruising speeds.  Moreover, the 24 knot average speed is higher than the 22 knot “cruising speed” assumed in the Frog Ferry’s own operational feasibility plan.

Those with real world experience report much slower service speeds.  New York, which has broad experience with ferries, estimates much different speeds than the Frog Ferry’s advocates.  Its Citywide Ferry Study concluded that an eight-mile leg with a medium-sized catamaran ferry would take close to half an hour in total, including time for dwell, maneuvering and cruising; the Frog Ferry’s estimates don’t seem to allow for maneuvering to and from the dock.  Seattle’s proposed 10.5-mile Kenmore ferry route, operated with a ferry with a 28-knot cruising speed, similarly requires close to 40 minutes one-way when allowance is made for loading, unloading and maneuvering.

Between Portland and Vancouver, the Frog Ferry would travel more than 15 miles in two nearly equal segments, stopping at St. Johns.  That would put its travel time close to two eight-mile legs by the medium catamaran (Type E) in the table above, or one full hour.
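The arithmetic here is simple enough to check. A minimal sketch in Python, using the distances and speeds cited above; the 10-minute-per-leg allowance for docking, loading and maneuvering is our assumption, loosely informed by the New York and Seattle figures, not a Frog Ferry number:

```python
# Back-of-the-envelope ferry travel time, using figures cited above.
# The per-leg overhead is an assumed allowance for docking, loading and
# acceleration/deceleration, not a Frog Ferry figure.

KNOTS_TO_MPH = 1.15078

def leg_minutes(distance_miles, cruise_knots, overhead_min):
    """Minutes for one leg: cruise time plus a fixed overhead allowance."""
    cruise_mph = cruise_knots * KNOTS_TO_MPH
    return 60 * distance_miles / cruise_mph + overhead_min

# Vancouver -> St. Johns -> downtown Portland: two roughly equal legs
legs = [7.5, 7.5]    # miles (the 15-mile river route, split at St. Johns)
cruise = 22          # knots, per the ferry's own operational feasibility plan
overhead = 10        # minutes per leg (assumed)

ferry_total = sum(leg_minutes(d, cruise, overhead) for d in legs)
bus_total = 36       # midpoint of the 34-38 minute bus times cited above

print(f"Ferry: ~{ferry_total:.0f} min; bus: ~{bus_total} min")
```

Even with these generous assumptions, the ferry comes out close to an hour, roughly twenty minutes slower than the existing bus; using the real-world per-leg times from the New York study pushes it to a full hour.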

The real world performance of ferries shows that the proposed Frog Ferry simply would not be time competitive with existing transit service in Portland.  A ferry ride between Vancouver and Portland is going to take nearly an hour, twenty minutes or more longer than a current-day bus ride.  And the ferry will pick up and deposit its customers at the river’s edge, not their final destinations, unlike transit buses, which actually traverse widely patronized destinations.  As a practical matter, a boat that takes considerably longer than a bus moving in mixed traffic just isn’t going to attract very many riders.

The romance of water travel and the apparent illusion of escaping traffic have certainly charmed some into pursuing the Frog Ferry idea, but as a practical form of transportation it is far less efficient and direct than the far more humble and prosaic transit bus, which, despite its boring character, is still faster than the fastest feasible water transport.

The fact that ferry service would dramatically under-perform existing bus service ought to be enough to reject the Frog Ferry idea out of hand, but that’s just the most obvious problem with ferry service.  It also turns out that ferries are enormously expensive, highly polluting, and tend to principally serve higher income commuters, points we’ll explore in future commentaries.  If a community is interested in efficient, equitable and sustainable mass transportation, the Frog Ferry shouldn’t be part of its plans.

It may not float or have charisma, but it will get you to Portland at least ten minutes faster than a Frog Ferry.

 

Equity and Metro’s $5 Billion Transportation Bond

Advocates for a $5 billion transportation bond that Portland area voters will be deciding in November are making a specious argument that it is an equity measure.

Its largest single project, a multi-billion dollar light rail line, serves some of the region’s whitest and wealthiest neighborhoods and has as its destination a suburban lifestyle mall.

The bulk of the money in the measure supports projects in highway corridors, including a subsidy to cars driving to the Portland airport.

Because the measure, according to the advocates’ own estimates, does nothing to reduce greenhouse gases, it’s inequitable to the frontline communities that bear the burden of climate change.

Some of the proponents of a $5 billion tax measure being presented to Portland’s voters on November 3 are claiming that it’s a way to right historic wrongs to the poor and people of color.  CityLab published one such commentary last week, with a headline asserting that the measure will help communities of color; it features a picture of one of Portland’s light rail lines.

But if you look closely at the measure, it’s really just transportation pork barrel politics, like the days of old, and the biggest shares of the money go to projects that disproportionately benefit whiter communities and higher income households. Despite a relative handful of measures—like free and reduced price transit for school aged kids—that make sense on their own, it’s a package that consists mostly of road projects that could and should be paid for by road users through the gas tax.  Instead, car users get the equivalent of a 30 cent a gallon subsidy for driving. What’s worse is that the measure cannibalizes the payroll tax—which the region has used for 50 years to subsidize transit operations—to fund capital expenditures, at a time when the local transit system is facing desperate financial conditions.

But let’s focus on the biggest single project in the Metro package:  roughly a billion dollars toward the local share of the costs of a $3 billion expansion of the region’s light rail system.  On paper, that seems good, but as long-time Tri-Met planner GB Arrington pointed out, this particular light rail project makes no sense as a transit or urban development measure.

The claim in the CityLab article is that the measure “invests in Black and Brown communities.” But when it comes to the single biggest project in the package, the proposed Southwest Light Rail benefits the whitest, wealthiest part of the region. The same is true of other major components of the package, which chiefly invest in highway corridors, not neighborhoods. Here are the facts.

SW light rail would disproportionately serve Portland’s whitest neighborhoods

At City Observatory, we’ve extensively studied the racial and ethnic diversity of the nation’s largest metro areas, including Portland. While Portland has fewer Black residents than most large metros, it has proportionately more Hispanic and Asian residents, and it is overall, one of the least racially and ethnically segregated metro areas in the nation. But like every large US metro area, it has a share of neighborhoods that are not diverse, and that are disproportionately composed of white, non-Hispanic residents.

As part of our study, America’s Most Diverse, Mixed Income Neighborhoods, we identified all of the non-diverse predominantly white neighborhoods in the nation’s 50 largest metro areas.  We computed racial diversity using the Racial and Ethnic Diversity Index (REDI), and flagged those tracts that were among the 20 percent least diverse of all tracts in large metro areas nationally, and in which the majority racial/ethnic group was white, non-Hispanic.  Here’s a map of the Portland area’s non-diverse, white neighborhoods.

Red-shaded areas are low-diversity white, non-Hispanic neighborhoods. (Source:  America’s Most Diverse, Mixed Income Neighborhoods)

The area with the largest concentration of these non-diverse white neighborhoods is the southwest quadrant of the city of Portland, an area running from the city’s West Hills to the city of Lake Oswego. The route of the proposed Southwest Corridor light rail line bisects this large concentration of non-diverse neighborhoods.  Here’s a close-up of the same data, with the route of the light rail line superimposed on these low-diversity white neighborhoods.

Red-shaded areas are low diversity, majority white neighborhoods. (Source:  America’s Most Diverse, Mixed Income Neighborhoods).

 

Light Rail to Portland’s high income suburbs

Not surprisingly, these predominantly white neighborhoods are also among the wealthiest in the region. The neighborhoods of southwest Portland and the suburbs in this part of the region are hardly distressed communities.  Don’t take our word for it:  look at the Distressed Communities Index just released by the Economic Innovation Group.  It ranks all the zip codes in the US on a 100-point scale of economic distress, based on a combination of income, poverty, and employment indicators.  The proposed SW Light Rail project runs through four zip codes outside of downtown Portland:  97239, 97219, 97223 and 97224.  Three of these four are rated “prosperous,” the highest of the five categories in the EIG ranking, and one is rated “comfortable.”  None are mid-tier, at-risk, or distressed.

Source: Economic Innovation Group

The average incomes of these neighborhoods are higher than for Multnomah County, the region’s most central, urban county.  Median household incomes in zip code 97219 are among the highest in the region, at just slightly less than $100,000.  The following chart shows the county-wide median and the median incomes in the four zip codes directly served by the proposed light rail line.

Destination:  Suburban shopping mall

The southern terminus of the proposed Southwest Light Rail line is a “lifestyle center” shopping mall called Bridgeport Village.

Bridgeport Village is home to a host of national chain stores catering to high-end households.  The mall’s owners describe it as:

. . . an outstanding and enviable array of exclusive, internationally renowned stores and boutiques which include The Container Store, Anthropologie, GAP, Lululemon, Apple, Crate & Barrel, Brandy Melville, Madewell, Sephora, Sundance, Eileen Fisher, MAC Cosmetics, Tommy Bahama, Soft Surroundings and the largest Regal IMAX Theatre in the state.

They, too, note the area’s high income demographics.  The mall’s primary trade area has an average household income of $89,000, compared to $74,000 in the rest of the metropolitan area.

Other projects also chiefly benefit white, wealthy populations

Another project subsidized by the bond measure is a massive freeway interchange serving Portland International Airport.  The interchange will make it quicker and easier for people to drive to the airport (and not incidentally, undercut the relative attractiveness of the already existing light rail service that goes direct to the airport terminal). And the users of facilities like the airport have higher incomes than the rest of the region’s population.  According to Statista, a person with an income of $80,000 a year is about six times more likely to be a frequent air traveler than someone with an income under $40,000 per year.  Ironically, because the Port of Portland charges market rates for parking ($3 an hour; $24 a day), it’s the one place where car users actually shoulder something approaching the cost of their trips, and the Port could easily fund this road improvement out of the fees it charges users; but it prefers to ask the general public to subsidize car trips to and from the airport.

The project proponents claim that the proposal will support investments in safety and pedestrian improvements in the region.  But the bulk of these monies are focused on highway corridors. As we’ve discussed at City Observatory, as a practical matter, “pedestrian safety” improvements in and along highways are of dubious value in creating more walkability, and are in many cases, actually highway improvements—designed to facilitate more and faster car travel.

What this tells us about equity

Urban leaders around the nation are grappling daily with the question of how to fashion policies that achieve “equity.”  There’s a cacophony of voices calling for greater equity, but no definitive yardsticks to say what this means, or to measure whether we’re making progress, or even moving in the right direction.  If a light rail line through the richest, whitest portion of a region contributes to “equity,” then arguably pretty much any investment could.  The same goes for projects that expand capacity on suburban highways, or that make it easier to drive to the airport rather than take the light rail system we’ve already built.

There are pieces of the Portland measure that do support equity, like free and reduced price transit fares. But the most expensive items in the package hardly serve disadvantaged front-line communities. Beyond that, we know that low income communities and people of color are those most likely to bear the brunt of the effects of climate change, which means that the Metro bond measure’s abject failure to reduce the region’s transportation greenhouse gases is, itself, highly inequitable.

The point is that without a clear definition of what constitutes “equity,” this criterion is meaningless. Given our current approach to the subject, “equity” is as vague, personal, and subjective as “beauty.”

Our discussions of equity need to be far more specific and quantifiable:  Who benefits, and how much?  And how do we address the underlying inequities that are built into the existing institutional arrangements for transportation, and that are never questioned:  like virtually universal free parking, or policies that prioritize the faster movement of cars over the safety and livability of urban neighborhoods? Is there anything more equitable than adequate funding for bus operations to assure greater frequency?  We should make equity a key criterion for guiding our transportation policies, but we should do so in a way that systematically makes our overall society more equitable, rather than treating it as a subjective talking point for a particular pet project.

It’s worth imagining what a real, equity-driven regional agenda would look like. For starters, it wouldn’t be based on the premise that everything has to be viewed through the lens of transportation. While inequity manifests itself in the transportation system, the problem is much broader and more fundamental, and is rooted in land use and housing policies, like the prevalence of single family zoning in the Southwest corridor that this proposal does nothing about. An equity-focused agenda might also try to learn from and adapt based on the lessons of the Covid pandemic. Arguably, right now, widespread and free or inexpensive access to high-speed Internet is a more salient equity issue. Framing this single largest investment in the region’s history solely as a transportation issue forces all communities to play a transportation game rather than consider more broadly the range of investments that would provide the livability and opportunity communities are asking for.  Finally, a real equity measure should pay as much attention to how monies are raised as it does to how they’re spent, and be funded by assessing the costs to users.  People are adaptable; transportation systems much less so.  Investing in steel and concrete before investing in new patterns of cost allocation and usage is short-sighted and should be unthinkable.


The Great Disconnect: The perverse rhetoric of gentrification

The Great Disconnect

By Jason Segedy

City Observatory is pleased to publish this guest commentary from Akron’s Jason Segedy.  It originally appeared on his blog.



As this decade draws to a close, the story of urban America is increasingly about the great disconnect between a small number of large cities that are thriving, and a much larger number of cities of all sizes that are continuing to fall behind.

What’s true for a handful of large cities is increasingly untrue for the majority of cities in the vast middle of the country. Nowhere is the great disconnect more apparent than in the debate about gentrification.

Gentrification is a hot topic of conversation in coastal cities, like New York, Washington, and San Francisco, with expensive living costs that are also home to influential journalists and academics.

Writing about gentrification has become a cottage industry for many pundits and urban policy wonks.  Many of the earlier pieces penned on the topic were important, thought-provoking, and well-reasoned.

But what started as the airing of thoughtful, reasonable, and understandable concerns about displacement and inequality in a handful of coastal cities, has turned into intellectual dishonesty, irrational hysteria, and even self-parody, particularly when it is applied to the long-suffering cities of the Rust Belt.

Peter Moskowitz’s How to Kill a City, which Josh Stephens accurately calls “an ideological rant in the guise of journalism,” makes it clear that no matter how many times he mentions Detroit, the New Yorker simply doesn’t understand the place.  He says: “The new Detroit is now nearly a closed-loop…It is possible to live in this new Detroit and never set foot in the old one.” I’ve got news for him.  Detroit has been like that for 50 years.  It’s just that the closed loop was called 8 Mile Road.  Gentrification didn’t kill Detroit.  Urban decline did.  And we can be confident that more decline won’t resurrect it.

A recent New York Times piece on Climate Change warns us that although Duluth may benefit from “climate refugees”, new growth raises the specter of (you guessed it) gentrification.  In case you were wondering, Duluth has been steadily losing population since 1960.

Then there’s Samuel Stein’s Capital City, which at least gets points for originality by dispensing with blaming hipsters or developers for gentrification, and aims its sights squarely on my overwhelmingly leftward-leaning profession of urban planning, even going so far as to say that “proto-planners” (whatever that means) were responsible for Native American genocide as they “enabled the country’s murderous westward expansion, and mapped the rail networks and other infrastructure that made it possible.“

There is even a movement called “Just Green Enough,” which is premised on the idea that parks in poor neighborhoods shouldn’t be made “too nice,” in order to prevent displacement by gentrification. Precious energy and effort are expended on endless worry about and discussion of (and, in some cases, active opposition to) a nice park, a new ice cream shop, or a new grocery store, because it could potentially displace someone.

Meanwhile, the poor themselves continue to languish in disinvested and actively-avoided neighborhoods, without any of the amenities or conveniences that the activists and academics have (and take for granted) in their own neighborhoods.

However well-intentioned, these efforts end up doing the same thing – ensuring that people living in poor neighborhoods continue to have the worst of everything, confined to separate and unequal places with substandard facilities and amenities, all “for their own good”.

How elitist, patronizing, and sad.

For those interested in separating data-driven fact from ideologically-driven fiction, a new report, American Neighborhood Change in the 21st Century: Gentrification and Decline, provides a welcome corrective.

Anyone who is serious about understanding urban public policy, equity, and neighborhood change, should read this report.  It is an easy read.

The report examines the ways in which neighborhoods in the 50 largest U.S. metropolitan areas are growing or shrinking; getting richer or poorer; rebuilding or disintegrating.  It quantifies the degree to which neighborhoods are experiencing economic growth, displacement of low-income people, concentration of poverty, and abandonment.

It finds that the most common form of American neighborhood change, by far, is poverty concentration, rather than wealth concentration.  Low-income residents are exposed to neighborhood decline far more than gentrification.  In fact, there was no metropolitan area in the nation where a low-income person was more likely to live in an economically expanding neighborhood than in an economically declining neighborhood.

The findings mirror what Alan Mallach says in his must-read book, The Divided City: gentrifying areas are rarely the most distressed areas of a city, particularly where demolition has unraveled a neighborhood’s fabric, and where few attractive homes or buildings of any kind remain; and predominately African-American neighborhoods are less, not more, likely to experience gentrification than largely white, working-class neighborhoods.

Instead, gentrification typically follows a pattern of black neighborhood avoidance.  Rather than being subject to displacement by gentrification, urban residents who are both black and poor are far more likely to be left behind in neighborhoods experiencing widespread vacancy, abandonment, and disinvestment.

Instead of displacement by gentrification, what we are seeing in most cities in my part of the country, including Detroit, could be described as displacement by decline – as middle class residents, African-Americans in particular, frustrated by the continued social and economic disintegration of their neighborhoods, are moving to safer and more attractive neighborhoods in the suburbs.

While the urban renaissance in a handful of neighborhoods gets all the headlines, it is the rapid concentration of poverty and urban decline that is far more prevalent – and troubling.

I’ve lived my entire life in Akron, which, like Duluth and Detroit, has been losing population and wealth for 60 years now.  Those of us who work on behalf of (and love) these places do our best to fight poverty, abandonment, and urban decline every single day.  Living here, it is hard for me to understand getting worked up in anger at someone with some money in their pocket renovating an old house in an urban neighborhood, opening a brewery, or leasing a brand-new apartment downtown.

I hope that this new report’s findings serve as a wake-up call to the people who worry so much about the potential downside of urban revitalization that they are overlooking the far greater challenges of inter-generational poverty, uneven economic growth, disinvestment, abandonment, urban sprawl, and pervasive and entrenched racial and economic segregation.

I see a lot of people, even here in the Rust Belt, who are energized about gentrification, and convinced that it is the enemy.  It’s considered a sexy topic for activism.

But I don’t see the same level of passionate activism being applied to fighting the spread of concentrated urban poverty, neighborhood abandonment, or the yawning racial and economic chasm between older cities like Akron, Cleveland, Detroit, and their newer suburbs.

And let’s be honest.  Those are big, messy, complicated, systemic, extremely intractable problems, and there is nothing sexy about them.  They don’t lend themselves to clever yard sign slogans or quick-take podcasts.  Most people would rather not think about them, because there is not a lot that the average person can even do about them.

But they are the urban problems we need to face.  They are the existential challenges to our cities and to the people who live in them.

New development does not always mean displacement, and revitalization is not always a synonym for gentrification.

Gentrification has become a useless word.  Words lose their value when they no longer have an agreed-upon meaning.  No one knows what the hell that word means anymore.  It’s time to retire it.

Parking and equity in cities

The median price of a monthly parking permit in cities is $2.25, compared to $77.00 for a transit pass.

Everything you need to know about equity and privilege in urban transportation is reflected in how much we charge for parking compared to transit

The triumph of asphalt socialism is reflected in providing unlimited free or underpriced private car storage on public streets (a scarce and valuable commodity) while charging people to make use of transit, (a public good with positive externalities, and plenty of excess capacity).

The benefits of free private car storage on city streets accrue to those wealthy enough to own cars; those who can’t afford cars get no benefit, plus they have to pay to use the only feasible alternative for many trips:  transit.

No one should invoke the term “equity” in urban transportation without insisting that those who convert public property to private use for car storage pay for the privilege.

Northern Illinois University professor Chris Goodman recently compiled data for the nation’s 30 largest cities on the price cities charge for on-street parking permits compared to the price of a transit pass.  In every single city, the price of a transit pass exceeds the price of parking by a factor of ten to twenty or more.  For the median city in Goodman’s sample, the monthly cost of a parking pass was $2.25, compared to a cost of $77.00 for a monthly transit pass. (Our calculation of the median price of a parking permit includes only those cities that charge for on-street parking permits; ten of the top 30 cities don’t). Even when cities charge for on-street parking, the monthly cost is usually less than a single bus ticket.
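To make that methodological footnote concrete, here is a minimal sketch of how a median like Goodman’s can be computed while excluding the cities that don’t charge at all. The city names and prices below are invented for illustration; they are not Goodman’s actual figures.

```python
from statistics import median

# Illustrative figures only; these are NOT Goodman's actual city data.
# parking=None marks a city that issues no priced on-street permits,
# so it is excluded from the parking median, as described in the text.
cities = [
    {"name": "City A", "parking": None,  "transit": 70.00},
    {"name": "City B", "parking": 1.00,  "transit": 75.00},
    {"name": "City C", "parking": 2.25,  "transit": 77.00},
    {"name": "City D", "parking": 35.00, "transit": 90.00},
]

# Median parking price is computed only over cities that charge at all.
charging = [c["parking"] for c in cities if c["parking"] is not None]
median_parking = median(charging)
median_transit = median(c["transit"] for c in cities)

print(f"median parking permit: ${median_parking:.2f}/month")
print(f"median transit pass:   ${median_transit:.2f}/month")
```

Using a `None` sentinel (rather than a price of zero) keeps the "doesn't charge" cities out of the median, which is the choice described in the calculation above; including them at $0 would pull the parking median down even further.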

We’ve reproduced Goodman’s tabulations in graphic form here.  Cities are ranked according to the amount charged for transit passes, from lowest to highest.

It’s worth noting that the prices of parking permits apply only in those areas where cities require permits; on most streets, in most cities, including, bizarrely, New York City, street parking is completely unpriced.  In effect, the prices shown for parking in Goodman’s sample overstate what cities actually charge for parking:  it’s mostly zero.

The disparity between what people pay to park their cars on the public street (nothing or very little) and what they have to pay to use transit speaks volumes about privilege and equity in transportation.  To take advantage of free or low cost on street parking, you have to own a car, which automatically means the poorest households receive little or no benefit; meanwhile, because car ownership is highly correlated with income, more of the benefits go to high income households.

It’s also worth noting that private car storage on the street has all the aspects of a private good:  in economic terms, it’s rivalrous and excludable.  When you park your car along a street, you deprive others of the opportunity to use that space. Others can include other car owners who might like to park there, as well as other road users who might want to walk, cycle, or, say, in the era of Covid, set up tables for a bar or restaurant.

In contrast, transit has many of the characteristics of a public good.  Except at peak hours (in pre-Covid times, that is), buses and trains almost always have excess capacity, so your use of a train or bus seldom deprives others of its utility. In addition, transit has positive externalities:  it results in less traffic congestion and pollution, and lower energy use than car travel, and it supports walkable urban destinations.  Finally, it’s worth noting that the only places where transit really works well in the United States are the areas where cities charge for parking.  When street parking is free, people own cars and drive, depriving transit systems of customers and revenue, and skewing transit ridership toward the dispossessed and powerless.

From an economic and an equity standpoint, it would make vastly more sense to make on-street parking expensive (to reflect its real costs) and to make transit cheap or free.

The way we price transit, and don’t price private car storage in the public realm, is evidence of “Asphalt Socialism“–subsidies for cars and driving, and high prices and penalties for those who take transit.  As Dr. King once observed, we have socialism for the rich and rugged free enterprise discipline of the market for the poor. In an era when so many urbanists and transportation advocates profess great concern for equity, the subsidies to parking are one of the most inequitable aspects of the urban realm.

City Beat: Another sketchy claim of Covid-driven urban flight

Again:  It’s anecdotes, not data, that are fueling claims of an urban exodus due to Covid-19

The virus is now deadlier in the nation’s rural areas than it is in cities, undercutting the basis for the urban flight theory

Since the early days of the Coronavirus, the media has regularly trumpeted anti-city screeds, a kind of 21st Century echo of the “teeming tenements” indictment of the alleged unhealthiness of urban living in the 19th Century.  In the Spring, the worst outbreaks were in the New York City metro area, which automatically led many to equate size and density with pandemic risk. As we’ve noted, the canonical story is generally the product of a reporter quoting some suburban real estate agent about a sale they’ve just made to someone who moved from the city. But as we’ve noted time and again, and yet again, the data don’t support the urban exodus theory.

Last week we saw another provocative headline about surging interest in suburban living, this time from usually reliable analysts at John Burns Real Estate Consulting (JBREC).  They claimed that

We have all heard the tales of the current renter migration to the suburbs and even to the exurbs in some parts of the country. Until now, the migration story captivating the housing sector has been mostly anecdotal. But we now have proof! Our recent national survey of single-family rental (SFR) operators provides hard data confirming the migration movement that has been amplified by the pandemic.

. . .

59% of new SFR tenants are relocating from urban locations, with 41% of new tenants moving from already suburban locations.

We took a close look at the report.  Its single key data point is this:  in their survey, 59 percent of those renting single-family homes from survey participants were moving from urban areas.

We don’t have any reason to doubt the accuracy of that statistic, but without a little more context, it’s impossible to know what it means.  Specifically, we don’t know whether 59 percent of new residents coming from urban locations is higher than prior to the pandemic, or lower, or whether it’s influenced by seasonality or other factors.  We followed up with JBREC’s Devyn Bachman, who confirmed that they don’t have data for the prior year, and told us that the surge is based on anecdotal reports.  The critical part of the story here is not the fraction of people moving from urban locations to suburbs, but whether that trend has shifted noticeably from prior years.  And here, again, what we have is not data on such a shift, but simply anecdotes.

It’s also worth keeping in mind that suburban single-family rental housing is a relatively small segment of the market (most rentals are still multi-family).  And Burns’ survey is a sample of larger-scale, institutionally owned single-family landlords, which are an important and growing segment of the market, but a decided minority of single-family landlords.

The pandemic is now far worse in rural areas

The foundation of the “urban flight” hypothesis is the notion that cities are vastly more risky than suburbs or rural areas:  by fleeing, you can reduce your risk of getting Covid. The JBREC story pushing the urban flight hypothesis now seems a bit dated in light of the recent data on the spread of Covid-19.  While it was true that cases and deaths per capita were higher in cities in the Spring, that’s no longer the case.  In fact, the relationship between city size and Covid-19 mortality is now just the reverse, with the highest rates of deaths per capita in the nation’s most rural and least dense communities.  Based solely on location factors, those who fled to the countryside earlier in the year are now statistically at much greater risk of being diagnosed with and dying from the disease than their urban counterparts.  Our friends at the Daily Yonder have chronicled the grim shift in death rates:

The nuance here seems to be that cities and large metro areas are more closely connected to the rest of the world, so they were exposed to the Coronavirus first, at a time when knowledge of the virus’s danger and of preventative measures was limited; but there was nothing about the urban environment itself that made its residents more susceptible to Covid. As the pandemic spread, more sparsely populated areas, which were insulated from the virus primarily because of less frequent and robust connections to other places, ceased to be refuges.

“Urban flight” as collective journalistic hysteria

NPR’s On the Media took a close look at these stories and concluded that the “urban flight” meme is both widespread and utterly false.  In an incisive article at real estate website Curbed, Jeff Andrews attributes the popularity of these stories to biases of reporters:

Given that the media industry is concentrated in Manhattan — with another good chunk in San Francisco — journalists seem to be confusing the minor outbound migration from two ridiculously expensive areas with the double dose of demand happening across the country.

More insidiously, some members of the media are willing to peddle stories about nonexistent carnage in the streets, extrapolating that the cities — all cities, but especially the diverse, Democratic–led ones — are headed for inevitable collapse. And it’s hard not to separate that dark fantasy from a Republican talking point.

But, according to the data, it’s just not happening.

The idea that the pandemic has upended real estate markets and is triggering a flood of migrants to suburbs and rural areas has enormous appeal for reporters and their editors. Anecdotes to the contrary notwithstanding, there’s virtually no data to show this is happening.

City Beat is City Observatory’s occasional feature pushing back on stories in the popular media that we think are mistakenly beating up on cities.

City Beat: No flight to Portland’s suburbs

Another anecdote-fueled, data-starved article repeats the “suburban flight” meme, this time for Portland.

Actual market data show the central city’s market remains strong

Janet Eastman, writing in the Portland Oregonian, offers up yet another example of a popular journalistic trope:  the claim that the Coronavirus is triggering a flight to the suburbs.

Never mind that the pandemic is just as bad, and in many cases worse, in the nation’s suburbs as it is in cities; the mythical belief that density aggravates the Coronavirus is continually repeated by journalists.  We’ve pushed back against stories making this claim from The Wall Street Journal, National Public Radio, and The New York Times.

The evidence for the story consists entirely of anecdotes about two households who have recently moved from the city of Portland to one of its nearby suburbs, peppered with quotes from a handful of real estate agents about panicky buyers and bidding wars.  While this may strike some observers as unusual, it isn’t.  People are always moving between the city and its suburbs; the movements go in both directions.  So finding a handful of households moving to the suburbs can be done at any time. But it doesn’t signify a trend, or even that the movement is more in one direction than another.

For that, we need data. Some of the best real-time data on housing market trends comes from the web-based real estate search sites that track where people are looking for housing, and how rents and home prices are changing in response to shifting demand.

One of the biggest real estate web-sites, Zillow, just completed an extensive analysis of exactly this question:  whether suburbs were seeing their market share increase at the expense of cities.  They looked at data from across the nation, and their conclusion was unambiguous:

Are people fleeing the cities for greener suburban pastures? Some faint signals may have emerged in certain places, but by and large, the data show that suburban housing markets have not strengthened at a disproportionately rapid pace compared to urban markets. Both region types appear to be hot sellers’ markets right now – while many suburban areas have seen strong improvement in housing activity in recent months, so, too, have many urban areas.

Zillow’s Economic Research team analyzed a variety of Zillow data points in order to illustrate this trend. Data related to for-sale listings are generally the best indicator of real-time housing market activity, and in all but a few cases, suburban markets and urban markets have seen similar changes in activity in recent months: about the same share of homes selling above their list price, similar  changes in the typical time homes spend on the market before an offer is accepted, and recent improvements in newly pending sales have been about the same across each region type.

[emphasis in original]

Of course, that’s the national pattern.  What about Portland, specifically?  Again, Zillow’s analysis shows that in fact, in Portland, the opposite is the case.  Zillow’s data show suburban rents have decelerated by about 1.9 percent since February, but urban rents have decelerated by only about 1.1 percent, a little over half as much.  Zillow’s data on home searches also shows that buyers remain even more interested in city homes relative to suburban ones than they were prior to the pandemic. And Portland followed this pattern as well, with urban searches increasing their market share relative to suburban searches in Portland at the height of the pandemic. Oregon state economist Josh Lehner points out that there’s been no shift in the city of Portland’s share of regional home sales during the pandemic.

City Beat is City Observatory’s occasional feature pushing back on stories in the popular media that we think are mistakenly beating up on cities.

Lived segregation in US cities

We’re much less segregated during the day, and when we’re away from home

Commercial and public spaces are important venues for interaction with people from other racial/ethnic groups

Patterns of experienced segregation tend to mirror residential segregation across metro areas.

In the US, we measure racial and ethnic segregation using census data that report where we live. For example, the rankings of white/non-white segregation in US metro areas that we reported last month are based on residential data from the American Community Survey. But looking just at where we live (or sleep) doesn’t tell us everything about segregation.  Some worry that while we may live near each other, we don’t spend a lot of time with people from other racial/ethnic groups. Indirect measures—like the extent of interracial marriage—confirm that less segregated places do have more inter-group interaction, but where and when that kind of interaction happens outside of residential neighborhoods has been a bit of a mystery—until now.

A new paper from economists Susan Athey, Billy Ferguson, Matthew Gentzkow and Tobias Schmidt uses anonymized cell-phone data to look at patterns of racial segregation in US metro areas over the course of the day and in different locations.  We may live in different neighborhoods, but do our daily activities, like working, shopping, dining and entertaining, cause us to be less segregated? Their answer is a definite yes.

Athey, et al, use an “isolation index” to measure the extent to which white and non-white persons interact with one another over the course of the day. Because their underlying data are anonymized, they can’t directly observe the race/ethnicity of individuals, and so rely on census data on the race and ethnicity of neighborhoods (actually, census blocks) to classify observations.  Consequently, their results are best thought of as showing the extent to which people who live in predominantly white neighborhoods interact with people who live in predominantly non-white neighborhoods in different places and at different times of the day.
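As a rough illustration of the underlying idea (a simplification, not a reproduction of the exact measure Athey, et al, construct), a standard isolation index can be computed from counts of visits by each group at each venue. The function and the sample visit counts below are hypothetical.

```python
def isolation_index(venues):
    """Non-white isolation index over a set of venues.

    venues: list of (white_visits, nonwhite_visits) counts, one per venue.
    Weights each venue by its share of all non-white visits, times the
    non-white share of that venue's visitors.  Equals the overall
    non-white share under perfect mixing, and 1.0 under complete
    separation; higher values indicate more segregation.
    """
    total_nonwhite = sum(nw for _, nw in venues)
    index = 0.0
    for white, nonwhite in venues:
        visitors = white + nonwhite
        if visitors == 0:
            continue
        index += (nonwhite / total_nonwhite) * (nonwhite / visitors)
    return index

# Perfectly mixed venues: the index equals the overall non-white share (0.5).
print(isolation_index([(50, 50), (50, 50)]))   # 0.5
# Completely separated venues: the index is 1.0.
print(isolation_index([(100, 0), (0, 100)]))   # 1.0
```

In this framing, the paper’s finding that midday activity is less segregated than home life corresponds to midday venues (restaurants, retail, workplaces) having visit mixes closer to the first example than to the second.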

The paper’s key insight is a new and broader computation of the lived, or in their words “experienced” level of segregation in US cities over time and space.  As a rule, we’re less segregated over the course of our daily routines than we are when we’re at home.  As with conventional measures of residential segregation, there are wide variations across cities, with some cities much more segregated than others.

Segregation by Time of Day and Place

The cell-phone data allow the authors to compute the isolation index by time of day, and it turns out we’re most isolated from one another (by race and ethnicity) when we’re in our homes.  Over the course of the day, though, we’re much more likely to interact with people from different racial/ethnic groups.  The following chart plots the hourly variation in the experienced isolation index relative to the residential isolation index for large cities.  Higher levels indicate more segregation. The data show a consistent pattern across cities, with isolation being highest overnight, dropping during the morning.  Mixing peaks at about Noon, and declines thereafter.  At midday, people are about 40 percent less segregated than they are overnight.

The cellphone data also enable the authors to compute isolation indexes for different kinds of locations.  As noted, home locations tend to have the highest levels of isolation, but many commercial and social settings produce much higher levels of mixing (lower levels of isolation).

The cell phone data allow the authors to look at a variety of destinations. In general, all of the location types (schools, parks, retail, restaurants, bars, entertainment) show broadly similar patterns.  Accommodations and entertainment seem to have the greatest levels of intergroup mixing.

Which metros have the lowest levels of segregation?

Athey, et al look at the patterns of isolation across metropolitan areas and compare them to common measures of residential segregation.   The following table shows their estimates of experienced segregation for metro areas with a population of 1 million or more.  Metro areas are ranked according to the isolation index, with the least segregated (lowest scores) first, and most segregated (highest scores) last.

Similar to a ranking of segregation in the central urban counties of the nation’s large metropolitan areas that we published last month at City Observatory, Portland has the lowest level of both residential and experienced segregation of any large US metro area, followed by Seattle, Minneapolis, Raleigh and San Francisco.  The cities with the highest levels of experienced segregation include Milwaukee, Detroit and St. Louis.

Experienced segregation largely mirrors residential segregation.

In general, cities with high levels of residential segregation also tend to have higher levels of experienced segregation. The following chart shows the relationship between residential segregation (on the vertical axis) and experienced segregation (on the horizontal axis).  There’s a strong positive correlation between the two types of segregation, with experienced segregation being considerably lower than residential segregation in most metropolitan areas.

The Athey et al. research has important implications for thinking about how we can more quickly foster more integrated communities. Because we move to different housing relatively infrequently, and because housing markets change even more slowly, changing patterns of residential segregation take time. But the fact that we already have higher levels of intergroup exposure in a range of public and commercial spaces suggests that these may offer easier and more fruitful opportunities to promote integration.  As the authors explain:

People spend substantial time away from their home neighborhoods, and when they do they are much more likely to encounter diverse others than they would at home. Commercial places like restaurants and retail shops are a particularly strong force pulling against segregation, while local amenities such as churches and schools tend to remain more segregated. One implication is that public goods that are tied to residential boundaries should be a particular focus of efforts to combat segregation.

Susan Athey, Billy A. Ferguson, Matthew Gentzkow, and Tobias Schmidt, “Experienced Segregation,” NBER Working Paper 27572, http://www.nber.org/papers/w27572

Why this Portland transit veteran is voting no on Metro’s bond

Editor’s Note: City Observatory is pleased to present this guest commentary from GB Arrington, longtime veteran of Portland’s transit and land use planning systems, explaining why he’s against the $5 billion transportation bond measure proposed by Metro that will be voted on in the Portland region this November.

By GB Arrington

When I stepped down as TriMet’s Director of Strategic and Long Range Planning, the TriMet Board gave me a fancy brass clock engraved with “You may not like the message, but you know it’s true.” So here is the unvarnished truth: a vote for Metro’s $5 billion Transportation Bond Measure will hurt the region’s livability. It’s a bad idea, and that is why I’m voting no.

The measure Metro referred to the voters is an example of broken old-time transportation politics. You will be voting on a Christmas tree loaded down with sprawl-inducing road projects and a new MAX light rail line to Washington County that simply does not stand up to rigorous analysis. Metro paved the measure’s way to the ballot by giving every part of the region an earmark for a pet project. New suburban highway capacity in Clackamas County. Bridge replacement in Multnomah County. Quicker access to Port of Portland airport parking garages.

For two-plus decades my job at TriMet was to help set the future direction and weigh in on the decisions required to help assure the success of TriMet and the region. I was part of the team planning the region’s first MAX lines. I sat at the table shaping Metro’s Region 2040 land use and transportation plan. And I led the planning to create livable mixed-use communities around the east and westside MAX lines. Collectively, we worked to ensure the livability of the region and to win more riders for TriMet.

The opening of the Portland Transit Mall in 1978, when I was a young planner, was the region’s first installment in a signature strategy that would repeat itself over and over — using transit investments to help achieve broader community building objectives. In that case, the goal was to help revitalize a declining downtown. After decades of hard work, the Portland region rightly won global acclaim for how it pioneered linking new transit investments with land use to leverage its vision to “grow up, not out.”

Sadly, Metro has now chosen conventional political pork barrel over being the steward of the region’s livability. Metro’s proposed transportation measure departs from its past track record of success and innovation. The most expensive project in the package, the $2.8 billion SW Corridor light rail project, is a perfect case in point on what’s wrong. Don’t be fooled by the hyperbole: the SW Corridor has always been a very expensive politically driven project with marginal transit and land use benefits.

The region built the best MAX corridors first. So it should come as no surprise that corridor number eight does not stack up well compared to corridors one, two, or three. That may help explain why ridership on recent MAX lines has failed to meet projections. The newest line, to Milwaukie, was forecast to carry 17,000 average daily riders by 2016. Before COVID, it averaged fewer than 11,000 daily riders – 35 percent less than promised.

Some inherent challenges unavoidably came with the territory when TriMet and Metro chose a politically expedient low-density auto-dominated corridor for MAX line number eight. At the end of the day, compared to other MAX projects, the SW Corridor has too few riders, costs too much, and does too little to continue the region’s legacy of using transit to create livable vibrant equitable transit-oriented places.

TriMet and Metro are counting on 50 percent of the SW Corridor’s funding to come from the Federal government. I helped write the federal rules to evaluate which projects qualify for funding. Demand nationally for federal transit funding has grown by about 70% over the past few years.

With limited discretionary federal funds, the national competition is fierce. Apart from the politics, the equation for who gets funded is pretty simple. The capital costs of each project are divided by its benefits (ridership and transit friendly land use). Projects that meet the criteria set by Congress are then eligible for funding.  The critical numbers the federal government will be looking at to evaluate the SW Corridor are moving in the wrong direction. Ridership forecasts have gone down more than 12 percent this year, while capital costs to build the project have climbed and climbed. Fewer benefits and more costs. That may add up to a fatal flaw in the region’s financial plan for getting federal funding.  The SW Corridor also fails to connect to important destinations in this part of the region.
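The direction of that arithmetic is easy to see. In the sketch below, the $2.8 billion project cost and the 12 percent ridership decline come from the figures above; the baseline daily ridership and the 10 percent cost increase are hypothetical placeholders, used only to show how the cost-effectiveness ratio moves when costs rise and riders fall.

```python
# Stylized federal cost-effectiveness arithmetic: capital cost divided
# by ridership, so a lower ratio is better. The $2.8B cost and 12%
# ridership drop are cited in the text; the 25,000 baseline daily
# riders and 10% cost growth are hypothetical assumptions.
def cost_per_rider(capital_cost, daily_riders):
    return capital_cost / daily_riders

baseline = cost_per_rider(2.8e9, 25_000)
revised = cost_per_rider(2.8e9 * 1.10, 25_000 * 0.88)

# Costs up 10% and riders down 12% worsen the ratio by a factor of
# 1.10 / 0.88 = 1.25, i.e. 25 percent, regardless of the baseline.
print(revised / baseline)
```

Because the ratio scales multiplicatively, the deterioration is the same whatever the true baseline ridership turns out to be.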

A big unknown about the project remains unanswered: how to serve “Pill Hill” (Oregon Health and Science University), the biggest ridership generator on the corridor. TriMet is more than a year late with a multi-million dollar decision on the Marquam Hill Connector to move some 10,000 daily riders up nearly 120 vertical feet from the Gibbs Street station and then disperse them in different directions, another 100–200 feet higher on the hill. TriMet can’t tell you the cost, the technology, or how it gets paid for.  The light rail line also doesn’t serve Portland Community College’s Sylvania Campus.

Don’t just take my word on the SW Corridor project’s flaws. Former Metro President David Bragdon, now executive director of the TransitCenter, a New York City think tank, summed it up well for Willamette Week in March:

“The current low-density land uses and auto-dominant street design along Barbur [Boulevard] are big challenges. For a line to succeed, it needs to have a whole lot more housing and job density, and much more pedestrian-oriented streetscapes and station areas, than that corridor currently does.”

It saddens me to vote no. I love the Portland region. And I’m privileged to have been a pioneer in helping set the direction for transit and land use in Portland over nearly a quarter of a century.

I then spent 20 years as a consultant working in more than 25 states and eight foreign countries, applying the lessons I learned in Portland. So I’m very surprised TriMet and Metro lost sight of their job to deliver livability to the region’s citizens. Going along with political backslapping and road building is not what made the region stand out as a global model of livability. Pursuing political business as usual is a sure-fire recipe for making Portland more like sprawling auto-oriented Houston than the Portland we have long worked toward and aspired to be. I ask you to join me in voting no and committing to work with TriMet and Metro to get the region back on the right track.

GB Arrington was TriMet’s Director of Strategic and Long Range Planning in a career that spanned from 1975 to 1999. He left to join Parsons Brinckerhoff (now WSP) where he led a global planning and design practice from 1999 to 2012. Time Magazine highlighted GB’s plan for Tysons Corner, VA as one of “10 Ideas for Changing the World Right Now.” GB is a former board member and a founder of Rail~Volution. Since 2012 he has been the principal of GB Place Making, LLC.

More performative pedestrian infrastructure

Houston’s “Energy Corridor” gets a pedestrian makeover, but just one thing seems to be missing.

Bollards and better landscaping can’t offset the increased danger from wider, faster slip lanes.

Most “pedestrian” infrastructure projects are remedial and performative; their real purpose is to serve faster car traffic.

Houston’s “Energy Corridor” is a commercial district west of Downtown Houston that’s home to a number of energy companies like BP and ConocoPhillips.  Unsurprisingly, it’s a heavily auto-dominated area.  We read with great interest last week a news report describing a new pedestrian infrastructure project at the intersection of two main arterials, Eldridge Parkway and Memorial Drive.

The Houston Chronicle hailed it in an article titled “The Energy Corridor District unveils west Houston’s first protected intersection.”  Here’s an aerial view of the project.

See all the pedestrian infrastructure? (This is actually the “after” picture.)

To be sure, there are wide sidewalks, clearly marked crosswalks, attractive plantings, and new signal lights.  But essentially the only “protection” for pedestrians and cyclists is a series of bollards.  If you step back and consider the setting of the project, it’s apparent that it remains an auto-dominated and pedestrian hostile environment.

For starters, the inescapable fact is that you have two busy multi-lane arterials–the kind of roadway that’s been consistently shown to be the most deadly to pedestrians. Nearly 60,000 cars a day go through this intersection.  Second, a key feature of the project is two right turn “slip lanes” that slice through the corners of the intersection.  Slip lanes like these allow (and encourage) cars to make faster turns, and also increase the crossing distance for pedestrians.  The slip lanes have marked crosswalks, but they appear to be governed only by “yield” signs, not traffic lights, and Houston drivers are notorious for not yielding even when the law requires it. (We’ve got more detail on these slip lanes, and their problems, below).

Let’s zoom in to street level.

Sidewalks, crosswalks, even bollards, but no actual pedestrians.

These pedestrian safety problems are apparent when you look at the promotional photographs provided by the project’s sponsors, the Energy Corridor District.  The illustrations show a nice new intersection, but you’ll notice one element conspicuous by its absence:  pedestrians.

Of course the project’s design aimed to be very pedestrian oriented.  You can tell that from the artist’s pre-construction concept.  Like so many such illustrations, it shows roughly as many pedestrians and cyclists as cars (we counted 38 cars and 41 pedestrians and bikes).  The reality of course is closer to all cars and zero bikes and pedestrians.

The big underlying problem though is that the Energy Corridor is a place laid out for cars and car travel. The reason no one walks in Houston, or in its Energy Corridor, as in so many such places in the US, is that there’s very little nearby to walk to.  (Pro-tip:  any area that describes itself as a “corridor” is almost always an auto-dominated, pedestrian-hostile space, a place people travel through, rather than being in).  The Energy Corridor is just a short distance from Houston’s mammoth Katy Freeway, the nation’s widest. A quick glance at Google Maps shows that within a block or two of the intersection you have a single bank, a convenience store, a CVS drug store and a lone Chinese restaurant—and almost no other retail or service businesses.

With 60,000 cars zooming by, with slip lanes that encourage drivers to take fast right turns, and with nothing nearby to walk to, it really doesn’t matter how wide the sidewalks are, how beautiful the plantings, or how numerous the bollards.  While this has the veneer and some of the trappings of walkability, it’s just not a walkable area. There’s a lot of loose talk about “retrofitting suburbs” and “walkable suburbanism” but examples like this show just how hollow and meaningless those terms can be. And while we’re picking on Houston here, you can find similar examples of performative pedestrian infrastructure in almost every US city.

As we’ve said, much of what is labeled pedestrian infrastructure is in reality car infrastructure.  In a place populated entirely by pedestrians and bicycles, for example, there’s no need for wide rights of way, grade separations or traffic signals. In even the most crowded cities, people simply walk or ride around one another. If it’s just people walking, there aren’t even lane markings.  Humans have long had the ability to avoid collisions, using subtle visual cues. Pedestrian friendly places don’t need elaborate infrastructure.

When we build a sidewalk along a busy arterial, or put in a traffic signal or some bollards, we may call it “pedestrian” infrastructure, but the only reason it’s actually needed is because of the presence  and primacy of cars.  It is at best remedial, and its purpose is primarily to benefit cars, speeding car travel by freeing drivers from the need to pay attention to or yield to pedestrians (or to only have to do so under strictly limited conditions).

Last year, we highlighted this example from the Orlando suburb of Lake Mary, where the city has constructed two pedestrian bridges over the highway, with a 153-foot span.

Italian-inspired walkability: Passeggiata, anyone?  (DRMP Engineering)

These elaborate and expensive pedestrian bridges are at best a remedial effort to minimize the danger this environment poses to anyone who isn’t in a car. They don’t really make the area any more desirable for walking. The real problem is not the infrastructure, or lack thereof, but a built environment that’s inhospitable to walking and cycling.

Much of what purports to be “pedestrian” infrastructure is really car infrastructure, and is only necessary in a world that’s dominated by car travel, in places that are laid out to privilege cars.

Real pedestrian infrastructure is a dense, mixed use area that shuns or at least slows private automobiles. A place with a mix of housing types (apartments, duplexes or triplexes and single family homes), local-serving businesses, and a grid of streets, rather than the rigid, hierarchical arterial/collector/cul-de-sac model of most post WWII US suburbs.  It’s about neighborhoods where people don’t have to cross multi-lane arterials to shop, attend school or visit a public park. Walkability and pedestrian safety are really about building great places, not piecemeal and largely decorative so-called infrastructure.

More on slip lanes

Transportation for America’s Stephen Davis explains that slip lanes are inherently dangerous because they encourage cars to speed through intersections:

Slip lanes are dangerous because they prioritize vehicle speed over the safety of everyone who needs to use the road.  Slip lanes increase the distance that people have to cover to cross a street, put people into spots that are often the hardest for drivers to see, and encourage drivers not to slow down when approaching an intersection and a crosswalk—the precise moment they should be the most careful.

While advertised as improving pedestrian safety, this project actually widens and lengthens the existing slip lanes.  It also increases the slip lane’s radius of curvature, enabling cars to make the turn even faster than would be possible in the narrower, sharper slip lane it replaced.  Both the longer crossing distance of the new slip lane and the faster speeds it tends to encourage make the intersection more dangerous for pedestrians than before.  Here are two Google Streetview images of the slip lane from westbound Memorial Drive onto northbound Eldridge Parkway.

Slip Lane– BEFORE (2018)

Slip Lane– AFTER (2020)

The myth of pedestrian infrastructure in a world of cars

Big money “pedestrian” projects are often remedial and performative; their real purpose is to serve faster car traffic.

One of the biggest lies in transportation planning is calling something “multi-modal.”  When somebody tells you a project is “multi-modal,” you can safely bet that it’s really for cars and trucks with some decorative frills appended for bikes and pedestrians.  A four- or six-lane arterial, posted for 45 miles per hour, with crossings every half mile or more, isn’t pedestrian friendly no matter how wide the sidewalks are on either side of the road.

Much of what is labeled pedestrian infrastructure is in reality car infrastructure.  In a place populated entirely by pedestrians and bicycles, for example, there’s no need for wide rights of way, grade separations or traffic signals. In even the most crowded cities, people simply walk or ride around one another. If it’s just people walking, there aren’t even lane markings.  Humans have long had the ability to avoid collisions, using subtle visual cues. Pedestrian friendly places don’t need elaborate infrastructure.

When we build a sidewalk along a busy arterial, or put in a traffic signal or build a pedestrian overpass, we may call it “pedestrian” infrastructure, but the only reason it’s actually needed is the presence and primacy of cars.  And its purpose is primarily to benefit cars, speeding car travel by freeing drivers from the need to pay attention to or yield to pedestrians (or to do so only under strictly limited conditions).  If a pedestrian crosses outside a crosswalk, or against a light, the law routinely exempts vehicle drivers from any penalties for hitting or killing them.

Most elaborate “pedestrian” infrastructure is really car infrastructure. As an example, let’s have a look at Lake Mary, Florida, a suburb of Orlando. Like much of suburban Florida, Lake Mary is a grid of multi-lane arterials. One of the city’s highest crash locations, according to its transportation plan, is the intersection of Lake Mary Boulevard and Country Club Road.  Lake Mary Boulevard is seven lanes wide, with turn lanes and through traffic lanes, and is a daunting obstacle for pedestrians, so the city has constructed two pedestrian bridges over the highway, with a 153-foot span.

Italian-inspired walkability: Passeggiata, anyone?  (DRMP Engineering)

The engineering firm that built the crossing describes it as having a “Mediterranean/Italian style” and touts its “highly decorative safety enclosure and decorative cladding walls.”  Anyone who has ever walked for five minutes in an Italian city will be hard pressed to find any substantive resemblance.  The ramps needed to reach the elevated structure roughly triple the crossing distance for pedestrians, which probably explains why people still use the grade-level crosswalk.

(Google Streetview)

These elaborate and expensive pedestrian bridges are at best a remedial effort to minimize the danger this environment poses to anyone who isn’t in a car. They don’t really make the area any more desirable for walking. The real problem is not the infrastructure, or lack thereof, but a built environment that’s inhospitable to walking and cycling.  Even the densest parts of Lake Mary get a Walk Score of 49 (“car dependent”), and most housing has Walk Scores of 20 or less, meaning that people need a car for almost all of their basic travel.  Here’s a heat map: the yellow and red areas have Walk Scores of less than 50 (car-dependent); the gray areas are below 10.

Here’s another example, from Port Wentworth, a suburb of Savannah, Georgia.  Here, the Georgia Department of Transportation has built a $4 million pedestrian overpass over a four-lane highway, Augusta Road (GA-21).  The bridge’s 178-foot span connects a new residential subdivision on one side of the highway with other subdivisions and a local school on the other. The overpass features lengthy serpentine switchbacks on both sides, more than quadrupling the distance one has to walk compared to using the highway’s crosswalk.

US 21 pedestrian crossing, Port Wentworth, GA (ICE)

The total population of Port Wentworth (in 2018) was about 8,500 persons, so the $4.1 million cost of the overpass works out to about $500 per capita.  Few if any cities in the US spend so much per capita on “walkability.”
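That per-capita figure is simple division, using the cost and population cited above:

```python
# Back-of-the-envelope check of the per-capita cost cited above.
overpass_cost = 4.1e6   # dollars, Port Wentworth pedestrian overpass
population = 8_500      # Port Wentworth residents, 2018

print(overpass_cost / population)  # ≈ 482 dollars per resident, roughly $500
```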

But Port Wentworth is anything but walkable.  Redfin calculates the city’s overall Walk Score as 20.  Of the apartments for rent on either side of Augusta Road, a handful have scores in the low teens; most are under ten, and several have a Walk Score of “1,” the lowest possible score. Even with a prodigious investment in “pedestrian” infrastructure, this is not a place for people walking or biking.

The irony of course is that Port Wentworth is a suburb of one of America’s most delightfully walkable cities, Savannah.  Its 18th century town plan, with regular squares, tree-shaded streets and a mix of housing, was laid out for walking, with no $4 million highway over-crossings to be seen. Hundreds of thousands of tourists come to Savannah each year, mostly just to walk around in a way that’s impossible nearly anywhere else in a North American city, in neighborhoods that would be illegal to build in almost every municipality.

“Pedestrian infrastructure” is an oxymoron.  In a place that’s hospitable to people and walking, pedestrians don’t need separate “infrastructure”—they can use the streets as a place to walk, just as humans have done for the several thousand years in which there were cities but no cars.

Much of what purports to be “pedestrian” infrastructure is really car infrastructure, and is only necessary in a world that’s dominated by car travel, in places that are laid out to privilege cars. It’s telling that the “level of service” provided to pedestrians (nominally for their safety) would never be tolerated in any freshly built or “improved” highway project:  The ramps to reach overpasses double, triple, or quadruple the distance a pedestrian must travel to cross a roadway, and require them to ascend and descend a substantial grade.  No highway engineer would build a bypass that doubled or tripled travel times for cars, but they regularly do this for people on foot.

A somewhat better form of pedestrian “infrastructure,” if we actually could create such a thing, might look more like raised crosswalks. Sandy James has a nice definition. Raised crosswalks, she writes:

. . . are walkable speed humps that are at the same grade as the sidewalk on either side of the street. The raised crosswalk serve to elevate the pedestrian, and slow vehicular traffic . . .

Raised Crosswalk in Sydney (David Levinson)

Raised crosswalks make a space more comfortable for pedestrians, and marginally slower for cars. But perhaps most importantly the raised crosswalks redefine the “ownership” of space; they signal to drivers and pedestrians alike that the walkers have priority:  that cars are driving across a sidewalk, rather than pedestrians walking across a road.  Curbs (and curb cuts) signify to everyone that pedestrians are stepping out of “their” space, and into the space “owned” by the driver. But in the US, raised crosswalks are extremely rare. Again, though, raised crosswalks are really only necessary because of cars, so they are actually car infrastructure, not pedestrian infrastructure.

The takeaway here is that real pedestrian infrastructure is about a dense, mixed use area that shuns or at least slows private automobiles. A place with a mix of housing types (apartments, duplexes or triplexes and single family homes), local-serving businesses, and a grid of streets, rather than the rigid, hierarchical arterial/collector/cul-de-sac model of most post WWII US suburbs.  It’s about neighborhoods like old Savannah, where people don’t have to cross multi-lane arterials to shop, attend school or visit a public park. Walkability and pedestrian safety are really about building great places, not more infrastructure.

And that, in a nutshell, is one of the big problems with Portland Metro’s proposed $5 billion transportation ballot measure.  It proposes spending lots of money on “pedestrian” improvements to a series of highway corridors, the multi-lane, car-dominated arterials that slice through the region. They are undoubtedly a safety menace. But money for wider sidewalks, better illuminated crosswalks, and even grade separated crossings like those shown above won’t do much, if anything, to make these areas or the region more walkable, because in the end, these corridors are still dedicated to moving lots of cars as fast as possible. If the region really wants to promote walkability, it needs to focus on building places, especially town centers and main streets, where car traffic is shunted aside or shunned, and people on foot or on bikes are the dominant and prioritized use.  It’s about place, not “infrastructure.”

Is there anything “smart” about smart cities?

Big data and new technology make bold promises about solving urban problems, but they not only fall well short of solutions, they can actually end up making things worse.

Why we’re skeptical of the “smart city” movement.

You can’t be an urbanist or care about cities without hearing—a lot—from the folks in the “Smart Cities” movement. The idea is that there’s nothing wrong with cities that a healthy dose of information (especially “big data”) or technology can’t solve. The result has been a seemingly unending series of claims that we can fix problems as challenging as housing, traffic and inequality simply by building more elaborate models based on big data, or deploying new forms of technology.  Color us skeptical:  It’s hard to see how we’ll make better decisions with even bigger data when policy makers seem to routinely ignore the small and obvious data that’s already well in hand.

As an example, we start with a fairly simple problem. Lots of Americans would like to be able to walk to more common destinations.  They pay a substantial premium for living in housing and neighborhoods with high levels of walkability. Yet we seem to be consistently building more cities and neighborhoods that are wildly car-dependent. Surely, if smart cities and technology are the solution, they ought to be able to grapple with this basic problem. But they aren’t.

We’ve frequently picked on Houston, a city which is notoriously hostile to people who want to walk (although all American cities need to improve in this regard).  The typical technocratic/smart city approach gathers copious data about current travel patterns (which themselves reflect a car-dominated world), contains almost no information about walking and biking, and, even more importantly, never asks whether people would prefer places that made it easier, more convenient and safer to walk, compared to ones optimized for vehicle movement. The way the “smart city” and technology folks approach it, “fixing” cities and transportation is all about vehicles, as their simulations illustrate.

MIT’s “Drivewave”

In the “smart city” world, this kind of thinking leads to claims that we can eliminate all traffic lights at intersections, turning vehicle control over to centralized computers (essentially forgetting about bikes and pedestrians), or that autonomous taxis could eliminate congestion, claims which ignore fundamental concepts like induced demand. Researchers at MIT and the University of Texas came up with these ideas, simply assuming pedestrians and cyclists don’t exist. As Eric Jaffe explained, to these auto-oriented engineers:

It’s natural to model intersections as if cars were the only mode that mattered—especially when computer drivers make every move predictable. The driverless intersection we presented a few years ago, based on work from computer scientists Peter Stone and Kurt Dresner of the University of Texas at Austin, made the same assumptions: lots of cars, no people or bikes.

Even really smart people, armed with loads of data and “design-thinking” tend to ask questions that are fundamentally too narrowly drawn, and that emphasize movement for movement’s sake, rather than harnessed to any greater sense of well-being or quality of life.
Transportation planners and self-proclaimed “smart city” technologists focus on optimizing cities for vehicle movement, and place little or no value on optimizing the quality of place for people, whether they’re walking, biking, or just choosing to be in a particular place. The “Big Data/Smart City” viewpoint imagines that making cities work is only about getting from “A” to “B”, when in reality the urban challenge is creating places that people want to “be.” That’s a big, big problem.

The real challenge for cities is being more ambitious and aspirational in building the  livable and inclusive places we want.  That’s less about tweaking the performance of systems like transportation, and more about building strong community engagement around the vision we have for the future.  Consider housing, which is an economic, affordability and equity challenge.  Again, there are abundant technological solutions, like 3-D printed houses, that are provocative, but don’t deal with the fundamental institutional problem that we simply make it illegal to build affordable housing in lots of places in the US. The YIMBY-led effort to build a coalition around the idea of allowing “missing middle” housing is really key to all these objectives (and environmental ones, as well). And, as Portland’s recent experience in legalizing duplexes and creating greater affordability with its Residential Infill Project shows, this isn’t just or even primarily about wonky modeling, it’s about political engagement.  Technology, modeling and data can play a supporting role, but the challenge is organizational, political and communicative. Promises of an easy technical fix, moreover, create a kind of Gresham’s law in which the prospect of impractical but technologically exciting vaporware drives out fundamental institutional reform. See, for example, Elon Musk’s magical hyperloop, currently serving as an excuse not to fix mass transit in many cities, as Alissa Walker trenchantly observes:

And each time city leaders promote one of his [Elon Musk’s] fantastical ideas—tiny tunnels! autonomous vehicles! platooning!—it does serious damage to the real-life solutions being proposed by experts that will actually make life better for their residents.

Elon Musk’s Boring Company: Habitrails for Teslas?

In theory, big data and smart city models should help us make better decisions; in practice, they’re slaves to broken and biased policies

Almost a decade and two mayors ago, IBM’s Smart City team came to Portland to show off some of its elaborate models of urban systems. There’s precious little evidence that the IBM smart city modeling has had any staying power in the city.  Take for example climate change (which is something the IBM model was meant to address).  As it turns out, Portland didn’t use this modeling for the 2015 Climate Action Plan; and since then, the city has pretty much walked away from a rigorous look at how reducing driving is the key to achieving our stated (and now more ambitious) climate goals.

The bigger question is how models and their results get used in the political/institutional setting in which we live. Models, especially big complex ones, are generally wielded as weapons by institutions to avoid or deflect scrutiny of their big decisions.  Highway builders like the Oregon Department of Transportation routinely cook the books in their transportation modeling to justify giant projects like CRC and Rose Quarter.  The public lacks the energy and resources to contest this technical work, and it becomes a huge barrier to change and fair consideration of alternatives.

Frequently, the state highway agencies and regional planning bodies that create such models construct them in ways that systematically rule out important factors.  Portland’s Metro has gone to great lengths to deny that there’s any such thing as the price elasticity of demand (perhaps the most fundamental concept in economics), arguing that driving will increase regardless of the price of gasoline.  That leads Metro, in turn, to ignore the most powerful policy levers it could deploy to reduce pollution and greenhouse gas emissions, and to help bolster transit ridership. Its chosen financing mechanism actually subsidizes driving, which undercuts all of its stated goals of reducing emissions and discouraging sprawl.

And even when the models show that plans aren’t achieving our goals, the powers that be simply ignore them:  Witness Metro’s $5 billion transportation package which will, according to their estimates, reduce transportation greenhouse gases in the Portland area by 5/100ths of one percent—essentially nothing. In the face of growing evidence that their climate plans are feeble and failing, we get the repetition of discredited myths (we’ll reduce greenhouse gases by reducing idling in traffic by widening roads).  And the consensus of state DOTs calls for climate arson in the form of spending hundreds of billions of dollars on new highway capacity.

Much of the “smart cities” work is engaged in and tolerated only to the extent that it supports, or at least doesn’t get in the way of, the status quo megaprojects that these organizations want.  Bottom line:  the “smart cities” effort focuses mostly on developing algorithms for the optimal arrangement of deck chairs on the Titanic, and hasn’t convinced anyone to change course to avoid icebergs.

Big Data, Bright Lights, Blind Spots

Shining the light of big data changes our perception:
But our vision changes,
our pupils constrict, our focus narrows.
Many things that were shades of gray are now fully illuminated.

But outside the glare of the big data spotlight,
everything else is plunged into darkness.
This is the profound bias of big data,
it illuminates some things, but darkens others.
And it does so in ways that mean we make bad decisions.

We are drawn to the light.
Focused on solving the problems we see
and dismissing—out of ignorance—the ones we can’t.
When we measure movement,
we inherently advantage those who are moving through
over those who are in or of a place, stationary.
When we measure vehicles,
we inherently advantage vehicles, and penalize those who do not have vehicles

As a result, we end up spending resources and building places
for the people who do not want to be there,
in the process, making them worse for the people who do.
Little surprise that people with choices move away

The case against Metro’s $5 billion transportation bond

Metro’s proposed $5 billion transportation measure makes no sense for the region, for transportation, for our economy, for our kids and for our planet.

Portland’s regional government, Metro, will be asking voters in November to approve a $5 billion transportation bond measure. There’s a strong case to be made that this is a badly flawed approach to the region’s future. Today, we lay out the arguments against this measure.

  • The plan is founded on a  highway-oriented concept of corridors, rather than a more sensible approach emphasizing walkable centers and main streets.
  • The $5 billion measure does nothing to lower greenhouse gases.
  • Its wage tax is unrelated to transportation, effectively taxing those who use the system least, and subsidizing those who drive and pollute the most.
  • The measure amounts to a 30 cent per gallon gasoline subsidy, encouraging more driving and increasing greenhouse gas emissions 50 times more than the amount saved by all its investments.
  • The measure cannibalizes the principal source of funding for transit operations just as Tri-Met is experiencing an operating funds financial crisis.
  • The plan is mired in the past, making no allowance for the changes we’ve already experienced in driving during the Covid-19 pandemic.
  • Metro’s plan plunges the region into debt for the next two decades, using up funds that will be badly needed to combat climate change.
  • This measure makes Portland taxpayers pay for problems created by the Oregon DOT: unsafe and transit hostile state highways.
  • A better approach would be to have road users pay directly for the services they get, reducing carbon pollution and creating a more equitable transportation system.

After a year-long process Metro has sketched out a $5 billion plan, which includes investments in a number of “corridors”—we’ll come back to that term in a moment—and which would be paid for by imposing a 0.75 percent payroll tax on firms with 25 or more employees, roughly for the next two decades.  (For the past fifty years, the region’s transit agency, Tri-Met, has subsidized its operations with a similar payroll tax). The plan earmarks funds for a number of projects, most notably another leg of the region’s light rail system, this one angling toward the southwest suburbs. While there are a few set-asides for measures to promote access by low income populations (subsidized transit fares for students), the bulk of the money is allocated to capital construction.

Every part of the region gets an earmark for pet projects and/or pet corridors.  Suburban Clackamas County gets money to build expanded highway capacity. Multnomah County gets a contribution to the cost of a replacement of one of its Willamette River Bridges. The Port of Portland gets a subsidy for a big overpass to speed cars to its lucrative airport parking garages. The entire process was crafted like an overloaded Christmas tree, as a political log-rolling plan to provide something for everyone, and has consequently engineered political support from local governments throughout the region and key community groups.

But while it may make sense politically, the measure is at odds with the region’s stated values and vision, and sensible transportation and environmental policy. As we’ve noted already at City Observatory, in spite of the fact that Metro claims to care about climate change (and even though transportation is already the region’s single largest source of greenhouse gas emissions, and is increasing rapidly), the plan does essentially nothing to reduce carbon emissions.  But there are plenty more reasons why this is a bad plan.

Corridors versus centers; cars versus people

The central organizing principle of the measure is investing in “corridors.” While it might have some superficial appeal, the notion of prioritizing investments around corridors is fundamentally at odds with the region’s stated planning objectives. Corridors are virtually by definition highways or major, multi-lane arterial streets that move large numbers of cars. While some of what is proposed in the plan is remedial measures to make these highways less hostile to pedestrians and somewhat more conducive to transit, the underlying planning objective is increasing throughput of people and vehicles. It’s a plan whose goal is simply moving things around, rather than improving the region’s livability.

It’s an odd choice because for decades the Metro land use plans have called for investments in “centers.”  The 2040 Regional Plan, adopted in 1995, called for an emphasis on regional centers, town centers and main streets that would create nodes of density and diverse commercial, economic and civic activity, serving as destinations and anchors for walkable, bikeable, transit-served neighborhoods.  In addition, if we need to reduce vehicle miles traveled (VMT) in order to reduce greenhouse gas emissions, facilitating more volume on corridors is actually counterproductive.

This measure makes virtually no investment in centers, and instead invests all the region’s capital in bolstering transportation infrastructure in corridors. If this were a house, it would be all about hallways, and nothing about rooms.

Corridors are dead ends for community and climate

It’s a fair point that many of these corridors provide a dangerous environment for pedestrians and cyclists, and slow service and unpleasant surroundings for transit riders.

But nearly all of the major corridors included in the Metro bond measure are state highways, built, owned and managed by the Oregon Department of Transportation, an agency that has systematically prioritized car movement over all other human activity. The key corridors in the project are McLoughlin Boulevard (State Highway 99E), Powell Boulevard (State Highway 26), 82nd Avenue (Oregon State Highway 213), and the Tualatin Valley Highway (Oregon State Highway 8). As City Observatory friend and planning professor Ethan Seltzer observes:

The bulk of the money goes to putting infrastructure in places that will never be great places and that, frankly, ODOT and the State should be paying for, not Metro taxpayers. 82nd Ave, HWY 217, McLoughlin Blvd. …. these are ODOT facilities, made inhumane and inhospitable by ODOT.  Now Metro is expecting the region to pay to fix a problem created by the state on state facilities.  Meanwhile, ODOT gets to shift its resources to building up the system outside the Metro area, and also at metro area expense as the biggest group of gas consumers and taxpayers are in the metro region.

As Metro’s own studies have shown, these ODOT-owned roadways are responsible for a disproportionate number of the region’s roadway deaths and injuries.  Why should regional taxpayers be asked to pay to fix problems that were created by a state agency, especially one which has at least $800 million for a project that most Portland residents oppose?

Pouring hundreds of millions of dollars into these corridors is likely to produce trivial improvements in transportation speed and safety, but worse, will do almost nothing to advance the region’s vision of building more robust town centers and main streets.  Consider 82nd Avenue, which has been developed as mile upon miserable mile of strip commercial development featuring used car dealerships, drive-through restaurants, strip malls and occasional big box stores. With between 20,000 and 30,000 cars driving up and down 82nd Avenue daily, it’s not an environment pedestrians want to linger in.  To be sure, there are a few nodes of activity (a Portland Community College campus at 82nd and SE Division, the Fubonn Asian Mall at SE Woodward Street, and the Montavilla business district at SE Stark Street), but every one of these places turns its back on 82nd Avenue and the maelstrom of cars it serves. Regardless of the amount invested, unless car traffic were radically reduced and calmed, 82nd will never be a street that attracts and serves cyclists and pedestrians; it will always be what it is now, a barrier between people living on opposite sides of the avenue, separated by a flood of cars.

We know how “corridor” investments like these turn out:  They’re merely a highway engineer’s view of what a freeway for bikes or pedestrians might look like.  Consider as an example, the “multi-use path” that was incorporated into the Interstate 205 bridge crossing the Columbia River.  The path is about 10 feet wide, bordered by concrete walls, and stretching for more than two uninterrupted miles in the median of the freeway with five lanes of deafening car and truck traffic on either side.

Here’s what a “corridor” for bikes and pedestrians tends to look like, in practice.

To be sure, it is a “corridor” that enables people on bikes and on foot to cross the Columbia River, but there’s nothing along the path (or even at either end) that is a plausible or enticing destination. Exclusive, grade-separated and protected bikeways in the middle of a hostile environment and not connecting safe and interesting destinations don’t build a sense of place or create livable neighborhoods.

The focus on moving traffic and corridors undercuts community building. As Ethan Seltzer notes:

The premise of improving corridors itself is bankrupt as a corridor focus is about moving through, not about making better communities. All we’re doing is making a prettier version of a failed system.  Transportation is best seen as a means, not an end.  Metro’s approach only glorifies transportation rather than putting it more clearly in service to more central community aims.

As we’ve observed many times at City Observatory, what Portland and other cities lack is not so much transportation facilities, but great walkable places. People pay a premium for homes located in places with lots of common destinations within easy walking distance. Bolstering centers and main streets for people, not increasing throughput on corridors, should be the region’s investment priority.

Cannibalizing transit’s funding source

In Portland, for the past half-century, a regional payroll tax has been the revenue source for subsidizing transit operations. This proposal would more than double the payroll tax, and thereby fiscally and politically foreclose Tri-Met’s ability to raise the tax to pay for increased operations in future years.  That’s already a serious issue.

The agency is already devastated by the Coronavirus, and lacks the funding to increase transit service to achieve the long-term ridership goals laid out in the Regional Transportation Plan. (The RTP assumes that Tri-Met will be able to increase its ridership from about 280,000 persons per day in 2015 to 480,000 by 2027, but that will require a massive increase in subsidies).

It’s very likely that passage of the Metro bond measure will lead to capital spending for expanded light rail and bus rapid transit that the agency has limited financial resources to operate. We’ll build the rails, and maybe electrify some buses, but won’t be able to pay for drivers.

Tri-Met’s revenue and operation issues are even more stark in light of the Covid-19 pandemic and recession. The agency’s operations have been on life-support in the form of $196 million from the Federal CARES Act, and that money will soon run out.  Ridership and fare revenue in June were down about 60 percent from year-ago levels, and Tri-Met expects to lose $61 million in its 2021 fiscal year.  How long it will take to rebuild ridership, how much it will cost, and how it will be paid for are still very unsettled questions.

Doing nothing to reduce carbon emissions

At City Observatory, we’ve already written about the Metro measure’s astonishingly feeble effect on greenhouse gas emissions.  By the agency’s own calculations, the measure will reduce regional greenhouse gases by five-one-hundredths of one percent, even though greenhouse gases from transportation are the largest source of the region’s carbon emissions, and have grown by 1,000 pounds per person in just the past five years.


The agency says it cares about climate change, but its investment of $5 billion on something that does effectively nothing to reduce greenhouse gases shows the emptiness of its climate promises. This package is probably the only serious opportunity to rework the transportation system in the next decade, and it will leave the region too broke and indebted for an actual “plan B.”

Subsidizing driving, sprawl and carbon pollution

Metro’s use of a payroll tax insulates motorists from the true cost of their transportation choices. While it’s largely a myth, many still believe that we have a “user pays” system for the roads.  The Metro bond measure would require the equivalent of roughly a 30 cent a gallon tax if it were paid for as a user fee, rather than by being charged as a payroll tax. By not insisting that car operators pay for these roadway improvements (and the vast majority of these expenditures are in the right-of-way and eligible for gas tax funding, even in the most narrow interpretation of the Oregon Constitution), we’re subsidizing additional driving.  People who don’t drive at all will end up paying for this measure through payroll taxes, while those who drive a lot will get a huge subsidy.  Despite its problems, the gas tax is crudely proportional to the use of the transportation system and to air pollution. Wages paid are not. The measure further insulates car drivers from the costs their decisions impose on others, and encourages additional driving. And a 30 cent a gallon subsidy to driving leads to more driving, and therefore more crashes and carbon emissions, offsetting the supposed safety and environmental benefits of the package.

Essentially what this measure does is tax work to subsidize the price of gasoline.  It thereby makes car travel cheaper, and encourages more vehicle use (and pollution) than would otherwise be the case.  Using standard estimates of the price elasticity of demand, we can calculate how much this subsidy to car travel implied by the payroll tax would stimulate vehicle miles of travel and additional greenhouse gas emissions.  The long run elasticity of vehicle miles traveled with respect to gas prices is about -0.3; a 10 percent increase in fuel prices leads to a 3 percent reduction in miles driven.  At current fuel prices of about $2.70 per gallon, the 30 cent reduction in gas prices enabled by financing this package from payroll taxes rather than gas taxes works out to about a 10 percent reduction in gas prices.  This suggests that the measure would lead to about 3 percent more driving than would be the case if users paid directly at the pump.  Metro Portland’s current transportation system annually generates about 8.4 million tons of greenhouse gases; a three percent increase represents about 250,000 additional tons of greenhouse gases per year due to the subsidy.  For reference, this is roughly 50 times larger than the estimated 5,000 ton reduction in annual greenhouse gases from the $5 billion spending package, according to Metro.  In addition, more driving would also result in more congestion, lower transit ridership, and more crashes.
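The back-of-the-envelope arithmetic above can be sketched in a few lines. This is a rough illustration using only the approximate figures cited in the text; note the unrounded result comes out a bit above the text’s rounded 250,000 tons, because the text rounds an 11 percent price change down to 10 percent.

```python
# Rough sketch of the elasticity arithmetic, using the figures cited above.
gas_price = 2.70     # dollars per gallon, approximate current fuel price
subsidy = 0.30       # implied per-gallon subsidy from payroll-tax financing
elasticity = -0.3    # long-run elasticity of VMT with respect to gas price

price_change = -subsidy / gas_price      # about an 11 percent price cut
vmt_change = elasticity * price_change   # about 3 percent more driving

base_ghg = 8.4e6                         # tons of GHG/year, metro transportation
added_ghg = base_ghg * vmt_change        # on the order of the ~250,000 tons cited

package_savings = 5_000                  # tons/year saved by the $5B package
print(f"Added emissions: {added_ghg:,.0f} tons/year, "
      f"about {added_ghg / package_savings:.0f}x the package's savings")
```

The exact multiple depends on how the intermediate percentages are rounded, but under any reasonable rounding the induced emissions swamp the package’s estimated savings by more than an order of magnitude.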

For decades, the fiction of the “trust fund” and the state constitutional dedication of gas taxes to road improvements has created the illusion that the road system is paid for by user fees.  That’s never been true.  Roads and road use at every level of government are deeply subsidized.  The federal government has transferred more than $140 billion in general funds to bail out the federal highway trust fund, and is expected to need to chip in a further $176 billion this decade. Oregon shields car owners from nearly $1 billion per biennium in taxes they’d otherwise pay by exempting cars from property taxes.  Cars pay nothing for the cost of cleaning up the toxic runoff from tires, brakes, leaking fuel tanks and precipitated air pollution; according to City of Portland estimates, half of the cost of the city’s “Big Dig” to separate storm and sanitary sewers was attributable to dealing with road runoff. As Portland City Commissioner Chloe Eudaly observed on June 30, 2020, “roads are the only utility we don’t charge based on usage.” The Metro bond measure is one more colossal subsidy to car driving, one that will increase sprawl and financially penalize those who choose more environmentally friendly travel and living options.

Asking Metro taxpayers to fix problems created by ODOT

Ostensibly, one of the major motivations for the package is to increase safety.  According to Metro, a large fraction of the projects are reputed to be “safety” projects.  But almost all of them are about doing something to reduce the dangers that cars pose to other more vulnerable road users.  Safety is a laudable reason for spending money, but users of the road system should pay for the costs of making it safer, not the general taxpayer.

Nearly all of the Metro measure’s big-ticket items are spending on improvements to the right-of-way of state highways, including the Tualatin Valley Highway, McLoughlin Boulevard, 82nd Avenue, Powell Boulevard and Highway 212. Why are Portland area residents being asked to tax themselves to pay for the fixes to these state highways?  In large part, the answer reflects the fact that ODOT would rather use state money to widen I-5, a project whose cost has already ballooned to $800 million, and which could easily exceed a billion dollars, and further billions on a revived Columbia River Crossing. While the Rose Quarter freeway widening is marketed as a “safety” project, virtually all of the other ODOT highways in the Portland area are vastly more lethal to travelers.  The region should insist that rather than building an unneeded, ineffective and environmentally destructive Rose Quarter project, ODOT ought to first fix the deadly roads it runs in the region. Portlanders are already paying their gas taxes to do just that.  (Moreover: ODOT has also prioritized new construction over simply maintaining and operating existing roads, and it is already asking the legislature for more money on that basis, a classic budgetary bait and switch).

In addition, many of the transit portions of the project are really costs incurred to subsidize automobile travel.  The budget for the Southwest Corridor Light Rail includes more than a hundred million dollars for the construction of garages so people can drive to take the light rail train. The cost of the Southwest Corridor is also inflated by $200 million by the decision not to reallocate some of the road right-of-way on Barbur Boulevard for transit use.  Similarly, the cost of bus rapid transit on other corridors reflects a conscious decision not to use publicly owned right of way in a way that maximizes the number of people who can travel in the corridor.

Sticking the cost on our kids: A ruined climate and a load of debt

The measure’s key feature is spending the money up-front in the next few years, but passing the cost on to taxpayers over the next two decades by bonding the revenue from higher payroll taxes. It’s understandable politically that Metro would want to get the benefits now, and push the costs off to the future.  But faced with the prospect of irreversible climate change before the bonds are even half paid off, it’s worth asking whether it’s in our interests, and especially whether it’s in the interest of our kids, to commit to spending additional billions for the next two decades to support a car-dependent transportation system and car-dependent living.  Make no mistake—the billions spent on this package won’t be available to reduce greenhouse gas emissions; we’ll be stuck repaying these bonds even as the planet heats up and our state burns.

Transit:  Expensive construction, nothing for operations

The biggest single project in the Metro bond package is the local share of funding for a new light rail extension through Southwest Portland to the suburbs.  In theory, SW Light Rail sounds like just the sort of thing we should be doing, investing in electrified transit to reduce greenhouse gas emissions. But sadly the project fails to actually deliver benefits. Its ridership is expected to increase only modestly above the numbers that would be carried by buses even if no light rail system were built, and earlier this year Tri-Met lowered those estimates by a further 12 percent. And the agency’s track record on forecasting has been bad, consistently overestimating ridership for new light rail lines.  Moreover, the transit agency doesn’t have either the funds or a plan to provide the level of bus service that’s needed to carry the additional 200,000 daily transit riders called for in the Regional Transportation Plan.

Crumbs for equity

Metro has tossed in a few crumbs for expenditures that would be more just, including funding of reduced fare or free transit service for low income households.  But these subsidies are a tiny part of the overall program:  free transit passes for metro area students would cost only about $9 million per year.  But lower fares will be of little or no value unless there’s a good transit system, and this measure actually does nothing to assure buses will keep running.

A key equity consideration is that Tri-Met doesn’t have enough money to expand operations to carry all the people that Metro says will travel by transit in the Regional Transportation Plan.  And Metro is draining the well that Tri-Met has depended upon for paying operating costs:  the payroll tax.  This measure more than doubles the payroll tax, and takes all that increase for the next twenty years (or more) and puts it into paying off bonds.  None of that money will be available to pay for the operating costs associated with increasing actual transit service.  TriMet will have a new LRT line down the middle of Barbur to the Bridgeport Village lifestyle center, but it could easily have no money to pay to run your local buses.

A plan for the past, not for a post-Covid future

Covid-19 has been a major disruption to the way we get around.  Many more of us are working from home and shopping online, which may permanently change our transportation system.  Automobile industry experts expect permanent and significant reductions in both automobile commuting and shopping trips—enough to reduce the numbers of cars on the road by millions.  Now, mired in a recession and still fighting a pandemic, we can’t know what the long-term effects of these changes will be.  Rather than borrow and spend all this money now, foreclosing its better use once we know how things shake out, we’d be well-advised to wait a couple of years, and develop a plan that works in a post-Covid world which could be very different from the one built into Metro’s models and projections.

The lesson of Covid-19 is that we can do much more to re-purpose street space for non-automobile uses, and do so quickly and cheaply, in ways that make our communities both healthier and more livable.  Already, Portland has implemented a “Slow Streets-Safe Streets” plan covering 100 miles of city streets.  Globally, leaders like Paris are using the pandemic to re-purpose entire boulevards for bike travel, and focus on “15-minute neighborhoods” that reduce car dependence and promote greater livability.  The pandemic is an opportunity to rethink our communities in ways that make our lives better.

But that’s not what Metro is proposing. As our colleague Ethan Seltzer points out:

. . . there is little, if anything, in this measure, hatched pre-COVID, that applies in any creative way to a post-COVID world.  How have transportation patterns changed with COVID?  What if a significant percentage of folks continue to work at home, even with a vaccine?  What kind of transportation infrastructure will we need?  Is municipal broadband the transportation and equity investment most needed in the future?  None of these questions have been dealt with, and won’t be dealt with until after Metro spends the money on your father’s transportation system.

If we’re going to spend $5 billion on our transportation system for the next two decades, it would be better if we waited just a short while to see how this shakes out, so we make decisions for the world we’re actually living in, rather than one that no longer exists.

A better fix:  Pricing congestion and carbon pollution

What this plan overlooks—and effectively undercuts—are systematically better and more direct ways of tackling our transportation and climate problems. If people are concerned about traffic congestion, we have a globally proven means for eliminating it:  congestion pricing.  The Oregon Legislature has already authorized pricing on Portland area freeways, and the region and City of Portland have said they’re willing to implement it.  The experience of the pandemic and of other cities shows that transportation demand management (reducing the number of trips, and especially changing when they’re taken) allows us to dramatically reduce congestion without building more roads and creating more pollution. Likewise, carbon pricing, whether through cap-and-trade or a tax on carbon pollution, is the most efficient solution for reducing greenhouse gases and quickly facilitating the more widespread adoption of electric vehicles.  Essentially all of the increase in greenhouse gases in the Portland area after 2014 was due to increased driving because of the decline in gasoline prices.  The direct and indirect effects of these pricing measures also help achieve greater equity:  as we’ve noted, road pricing reduces traffic congestion, which enables buses to move faster, benefiting existing transit users, and making transit more attractive for others.  And the proceeds of congestion fees and carbon taxes could be used to directly underwrite transportation allowances for low income households and students.

In sum: A bad transportation package

  • Metro’s proposed $5 billion measure takes Portland in the wrong direction.  It’s founded on a flawed, highway-oriented concept of corridors, rather than a more sensible, human-centered idea of building walkable, sustainable centers and Main Streets, that reduce the need for car travel.
  • In spite of the fact that transportation is the largest, and fastest growing climate threat in the region, the plan does nothing to lower greenhouse gases.
  • It is funded from a source totally unrelated to transportation use, which has the effect of taxing those who use the system least, and subsidizing those who drive and pollute the most.
  • By shifting the cost away from road users, the measure effectively subsidizes gasoline by about 30 cents a gallon, encouraging more driving and increasing greenhouse gas emissions 50 times more than the amount saved by all its investments.
  • The measure cannibalizes the principal source of funding for transit operations at a time when Tri-Met is experiencing tens of millions in losses, and an uncertain future of restoring existing ridership.
  • The plan makes no allowance for the changes we’ve already experienced in driving during the Covid-19 pandemic–changes that even auto industry experts expect to persist well into the future.
  • And the plan essentially takes all the money that will be generated by this tax for the next two decades or more and plows it into a set of short-term highway-corridor oriented projects that our children will be paying for as the climate grows steadily worse; money that will be badly needed for efforts that actually reduce greenhouse gases, and enable us to adapt to climate change.
  • This measure also asks Portland taxpayers to pay for problems created by the Oregon DOT: unsafe and transit hostile state highways.  Nearly all of the “corridors” slated for investment are state highways, and the bulk of all the improvements could be paid for by the Oregon Department of Transportation out of gas tax revenues.
  • A plan for the 21st Century would ask road users to pay directly for the services they get, and create strong economic incentives to reduce carbon pollution, and in the process pay for a more equitable transportation system.

America’s least (and most) segregated metro areas: 2020

The latest Census data show that Black/White segregation is decreasing in large metro areas.

Racial segregation still prevails in most American cities, but varies widely across the nation.

Portland is one of America’s least segregated metros

One pervasive and lingering hallmark of American geography is racial residential segregation:  our metropolitan areas have literally been divided by race, and as numerous studies have shown, this has undercut opportunity, perpetuated poverty, limited economic mobility and eroded Black wealth. The data from Census 2020 give us the latest readout on the patterns and trends of segregation in large US metro areas.

Measuring Black-White Segregation

One of the most common measures of racial segregation is the dissimilarity index, which measures the extent to which different groups of people live in different neighborhoods in a city or metro area.  The index ranges from zero (perfectly integrated, where the composition of each neighborhood matches the composition of the larger region) to one (completely segregated, where each neighborhood consists entirely of persons of a single racial or ethnic group).  The dissimilarity index expresses the percentage of the population that would need to move to a different neighborhood in order for each neighborhood's racial/ethnic composition to match that of the larger area.
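Formally, the dissimilarity index is half the sum, across all neighborhoods, of the absolute difference between each neighborhood's share of one group and its share of the other. A minimal sketch in Python (the `dissimilarity` helper and the tract counts below are hypothetical, purely for illustration):

```python
def dissimilarity(group_a, group_b):
    """Return the dissimilarity index (scaled 0-100) for two lists of
    per-tract population counts: D = 0.5 * sum(|a_i/A - b_i/B|)."""
    total_a, total_b = sum(group_a), sum(group_b)
    d = 0.5 * sum(abs(a / total_a - b / total_b)
                  for a, b in zip(group_a, group_b))
    return 100 * d

# Four hypothetical tracts, heavily segregated: most of one group
# lives in tracts 1-2, most of the other in tracts 3-4.
black = [900, 800, 100, 200]
white = [100, 200, 900, 800]
print(round(dissimilarity(black, white), 1))   # prints 70.0
```

A value of 70 means 70 percent of either group would need to change tracts for every tract to mirror the area-wide composition; a perfectly even distribution yields zero.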

Brown University’s Diversity and Disparities project has used data from Census 2020 to compute the Black-White Dissimilarity Index for all of the nation’s larger metropolitan areas.  (Its website also has data making the same computation for earlier decades, going back to 1980.)  The Black-White dissimilarity index measures the spatial separation of Black and white residents within metropolitan areas, and as described above, represents the fraction of the Black or white population that would have to move to a different neighborhood (in this case, a different Census tract) in order for each tract to have the same racial composition as the overall metro area.

As usual at City Observatory, we’re focused on the 53 US metro areas with populations of 1 million or more.  Overall, the Brown University tabulations show that Black-white segregation in the US is continuing to decline.  The median Black-White dissimilarity index for these large metro areas was 52.8 in 2020, down from 58.2 in 2010, and from 71.2 in 1980.

The median level of segregation, according to this measure, has been declining for the past half century or more.  Still, a dissimilarity index of 50 or more is quite high, and many metro areas continue to have even higher levels of segregation.

America’s Least (and Most) Segregated Metro Areas

This chart ranks cities from least segregated to most segregated using the Black/white dissimilarity index for each metropolitan area.  As noted, in 2020, the median large metro area had a dissimilarity index of 52.8, meaning that about 53 percent of a city’s population would have to move to balance the composition of individual neighborhoods to the region’s overall demographic composition.  About half of all large metros have dissimilarity indices between about 46 and 60.

The metros with the highest levels of segregation according to this measure are Milwaukee, Detroit, New York and Chicago.  Each of these metros has a Black-white dissimilarity index exceeding 70.

The metros with the lowest levels of segregation are Tucson, Salt Lake City, Portland, and San Jose.  Each of these metros has a dissimilarity index of 45 or less.

Portland’s big decline in segregation

Portland has not always been a highly integrated place.  If we look at the historical data on the black-white segregation index for Portland for the period 1970 through 2020, we see that Portland went from being one of the most racially segregated metro areas to one of the least.  Data compiled by Sophie Litschwartz at the Urban Institute shows that in 1970, Portland was actually more segregated than the typical large metro area, with a Black-White dissimilarity score of 79.7, compared to a national metro median of about 76.5.  Portland’s segregation has declined sharply since then: its black-white segregation measure fell by more than half (almost 45 points) in 50 years, while the median for large metro areas fell by about 23 points.  Our earlier analysis showed that Multnomah County (at the center of the region, and encompassing the City of Portland) has the lowest white/non-white dissimilarity index of any large, central urban county in the nation’s largest metro areas.


America’s least (and most) segregated cities.

Racial segregation still prevails in most American cities, but varies widely across the nation.

Portland is the nation’s least segregated large city.

The murder of George Floyd by police has reignited national interest in making more progress toward racial justice. It’s prompted a new round of introspection about the racism that’s deeply embedded in many American policies and institutions. One pervasive and lingering hallmark is racial residential segregation:  our cities have literally been divided by race, and as numerous studies have shown, this has undercut opportunity, perpetuated poverty, limited economic mobility and eroded Black wealth. Today we take a closer look at racial segregation in the nation’s largest cities.

Which cities are the most (and least) segregated?

The most common index of racial segregation is the dissimilarity index, which measures the extent to which different groups of people live in different neighborhoods in a city or metro area.  The index ranges from zero (perfectly integrated, where the composition of each neighborhood matches the composition of the larger region) to one (completely segregated, where each neighborhood consists entirely of persons of a single racial or ethnic group).  The dissimilarity index expresses the percentage of the population that would need to move to a different neighborhood in order for each neighborhood’s racial/ethnic composition to match that of the larger area.

The Census Bureau’s American Community Survey annually collects data on the race and ethnicity of Americans by Census Tract (a geography that corresponds roughly to neighborhoods).  The St. Louis Federal Reserve Bank has used this data to compute the white-non-white dissimilarity index for each of the nation’s counties.  Data cover the years 2009 through 2018, and are based on rolling five-year ACS counts (i.e. the 2018 data are drawn from the years 2014-2018).  As the Federal Reserve Bank explains:

The Racial Dissimilarity Index measures the percentage of the non-Hispanic white population in a county which would have to change Census tracts to equalize the racial distribution between white and non-white population groups across all tracts in the county.

We’ve assembled these data for the central county in each of the nation’s largest metropolitan areas, and then ranked them from least segregated to most segregated.  Our tabulation includes only central counties with populations of 100,000 or more. (As we’ve noted, counties are less than perfect units for making these comparisons; metro area data are more indicative, but the Federal Reserve’s tabulations only address counties).

America’s Least (and Most) Segregated Urban Counties

This chart ranks cities from least segregated to most segregated using the white/non-white dissimilarity index for the largest county in each city’s metropolitan area.  The median large metro area has a dissimilarity index of 45, meaning that about 45 percent of a city’s population would have to move to balance the composition of individual neighborhoods to the region’s overall demographic composition.  About half of all large cities have dissimilarity indices between about 38 and 54.

The cities with the highest levels of segregation according to this measure are Detroit, New Orleans, Philadelphia, Buffalo and Milwaukee.  Each of these cities has a dissimilarity index exceeding 60.

The cities with the lowest levels of segregation are Portland, Virginia Beach, Boston, Seattle and Las Vegas.  Each of these cities has a dissimilarity index of 35 or less. Portland, Oregon (defined in this case as its largest urban county, Multnomah County) has, by a wide margin, the lowest level of white/non-white segregation of any large urban county in the United States.  Its index value is 27.3, about half the median value of the typical large metro in the US.

Census data also enable us to plot the pattern of segregation over time.  Small year-over-year variations most likely reflect sampling variability from the American Community Survey, so it’s best to look at multi-year trends.  For most large US metro areas the trend in segregation is downward:  dissimilarity indices are declining over time. Here’s a chart showing Portland (Multnomah County) white/non-white dissimilarity from 2009 through 2018:

Over this decade, Portland’s white/non-white segregation index declined from 31.6 to 27.3. Segregation in Portland has been declining recently, but it’s actually a trend that’s been in place for many decades.

Portland’s big decline in segregation


Portland has not always been a highly integrated place.  If we look at the historical data on the black-white segregation index for Portland for the period 1970 through 2010, we see that Portland went from being one of the most racially segregated metro areas to one of the least.  (We use the black-white measure because the Census Bureau’s race and ethnic definitions were not comparable in earlier years.)  Data compiled by Sophie Litschwartz at the Urban Institute shows that in 1970, Portland was more segregated than the typical large metro area, but that segregation has declined sharply since then. Portland’s black-white segregation measure fell by half (40 points) in 40 years, while the median rate for large metro areas fell by about 15 points.


It seems odd that Portland would be a leader in integration, given that it is by many media accounts and popular wisdom “the whitest city in America.” While it’s true that the city has relatively few Black residents compared to most large American cities, as many demographers point out, that particular definition of “white” effectively treats Hispanic, Asian, Native American and mixed race people as white.  When you look at the share of the population that is “Non-Hispanic White,” Metro Portland is more diverse than Cincinnati or Pittsburgh, for example.

A city’s racial and ethnic mix is a product of history and geography.  The geography of Latinos, Blacks and Asians in the United States each have their own geographic contours based on historical patterns of migration. Blacks were brought to the Southern United States as slaves, landed primarily at places like Charleston, Tidewater Virginia and the Gulf Coast, and to this day, are disproportionately concentrated in the South. Most Latinos have migrated from Latin America, and are heavily concentrated in the Southwest.  Asians have a disproportionate concentration on the US West Coast. The patterns that were in place 100 years ago are still reflected today in the regional concentrations by race and ethnicity.

What’s more malleable to change in the short run (over the course of a few decades) is where within metropolitan areas people live.  In general, throughout the United States, we’ve seen a marked decline in residential racial segregation. Localized patterns of segregation can change more quickly than the overall racial and ethnic diversity of a metro area.  Given its more limited overall racial/ethnic diversity, Portland achieves a higher level of integration than nearly all US metro areas, something we explored in our report “America’s Most Diverse, Mixed Income Neighborhoods.”

Greater diversity is already baked into Oregon’s demographic cake.  Demographer Charles Rynerson points out that:

2019 Census estimates found that people of color make up just 10% of Oregonians 65 or older. But they are 37% of those under the age of 15.

This means that the state will become progressively more diverse with each passing year.

U.S. Census Bureau, White to Non-White Racial Dissimilarity Index [RACEDISPARITY], retrieved from FRED, Federal Reserve Bank of St. Louis; https://fred.stlouisfed.org/series/RACEDISPARITY, August 10, 2020.

 

A world of fewer cars and less driving

Auto industry consultants KPMG see fewer cars and less driving in our future

That may be bad for the car business, but good for the environment and cities

One clear implication: hold off building new road capacity

There’s little question that the pandemic has altered the way we live in the present; the big unresolved question is how this experience will change the way we live in the future.

One of the most powerful effects of staying-at-home during the pandemic has been a dramatic drop in car travel, with vehicle miles traveled in typical metropolitan areas declining by more than half in the first two months after the domestic onset of the pandemic.

We’ve adapted in a variety of ways; many professional and administrative workers have figured out how to work remotely from home, which has reduced commute trips.  At the same time, we’ve turned to internet shopping and delivery services for a larger number and wider array of goods and services that would ordinarily have required trips (mostly car trips) from home.  Our new familiarity and growing facility with Zoom, Amazon, Instacart, Doordash and their ilk could translate into permanent changes in the amount of car travel. Even though these are imperfect substitutes for high-touch and face-to-face experiences, they could definitely reduce travel by some amount.

That question is of keen interest to many, and especially to the world’s car-makers.  KPMG, the international accounting and management consulting firm has a practice devoted entirely to the automotive industry (car-makers and their suppliers) and has dived deeply into the likely impacts of pandemic-induced travel behavior changes on the demand for driving (and cars). Their report, Automotive’s new reality: Fewer trips, fewer miles, fewer cars?, is aimed at the auto industry, but has important implications for cities and transportation.  If, as KPMG predicts, we drive less and buy fewer cars, that should substantially reduce urban traffic congestion and pollution, and eliminate the purported reason to expand road capacity.

KPMG’s headline finding is that changes in commuting and shopping travel will sharply cut into car travel in the US:

. . . we estimate that total U.S. VMT could drop by 140 billion to 270 billion miles per year. The first-order effect would be a reduced need to own a vehicle and lower demand for new and used cars. We estimate that car ownership could fall from 1.97 to as little as 1.87 vehicles per household. That may not sound like much, but it could translate into 7 million to 14 million fewer vehicles on U.S. roads.

These changes would be driven by a permanent shift to more “work-at-home” and more online shopping.
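KPMG's fleet-size figures are easy to sanity-check with back-of-envelope arithmetic. The sketch below assumes roughly 128 million US households (an assumption, not a figure from the report); the per-household vehicle counts come from the KPMG quote above:

```python
# Back-of-envelope check on KPMG's fleet-size estimate.
HOUSEHOLDS = 128_000_000   # assumed number of US households
cars_before = 1.97         # vehicles per household (KPMG)
cars_after = 1.87          # KPMG's lower-bound scenario

fewer_vehicles = (cars_before - cars_after) * HOUSEHOLDS
print(f"{fewer_vehicles / 1e6:.1f} million fewer vehicles")  # prints 12.8 million fewer vehicles
```

A 0.10-vehicle drop per household works out to about 12.8 million fewer vehicles, squarely within the 7-to-14-million range KPMG reports.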

Retail:  A dramatic reduction in shopping trips.

As we’ve stressed before at City Observatory, on-line shopping reduces vehicle miles of travel (VMT), because the reduction in car travel to and from stores dwarfs the increase in vehicle miles traveled to make deliveries. MIT Economist Will Wheaton estimates that on-line shopping produces about 30 times fewer VMT than car travel to brick and mortar stores.  KPMG estimates that the growth of on-line shopping and delivery could reduce shopping trips by 10 to 30 percent, reducing driving by as much as 130 billion miles per year, equal to up to 5 percent of all personal VMT.

Fewer trips, and less driving means fewer cars

Fewer commute trips and fewer shopping trips have an obvious implication for the automobile industry:  it’s likely that consumers will buy fewer cars.  Overall, the KPMG study estimates that there would be a 5 percent reduction in car ownership per household, and that this in turn would mean 7 million to 14 million fewer vehicles on the road.

While it might be bad if you’re in the car business, fewer cars and less driving would produce huge benefits for many of us, and especially in cities. Already in the pandemic, cities have experienced the cleanest air in decades as auto-pollution has declined from less driving. Less driving also generally means fewer car crashes and car deaths, a point which KPMG acknowledges, with a decidedly pro-industry spin:

Falling VMT would also affect used-car sales and aftermarket parts and service: less driving also means less wear and tear on vehicles, as well as a decline in traffic accidents, cutting into the lucrative collision parts business.

And less driving should mean less road building

And here’s the big public policy implication:  fewer cars on the road means less demand for car infrastructure. Keep in mind that the usual rationale for road-building and road widening is that we’re driving more and more every year.  If car driving and car ownership are going down, then logically, we should need to build fewer roads than we would otherwise.

Reducing vehicle miles of travel is the surest, most economical, and most environmentally beneficial way to reduce traffic congestion and emissions.  As the experience of the Coronavirus has demonstrated, reduced levels of travel have translated into more efficient highways, in some cases actually moving more cars at the peak hour faster than when we allow highways to become saturated by excess demand.

Already, state and local highway agencies should have gotten a pretty strong signal about falling demand for roadway capacity in the form of declining gasoline tax receipts. Just as KPMG is warning the automotive industry to plan for a world with fewer cars and less driving, state Departments of Transportation should be heeding the same call.

Some will doubt that KPMG is clairvoyant on these matters, but regardless of whether one believes their projections or not, there’s a very strong argument to be made that cities and states should hold off on committing to major capital construction projects that are predicated on an ever-growing amount of vehicular travel.  In a year or two, once we’ve (hopefully) worked our way past the Covid-19 pandemic and the recession it induced, we’ll have a better idea of how the behavioral changes play out in terms of travel demand. Now is not the time to be widening highways.

The amazing disappearing urban exodus

The greatest urban myth of the Covid-19 pandemic is that fear of density has triggered an exodus from cities.

US Post Office data show that the supposed urban exodus was just a trickle, and Americans moved even less in the last quarter than they did a year ago.

At City Observatory, we’ve regularly challenged two widely repeated myths about the Coronavirus.  The first is that urban density is a cause of Covid-19, and the second, and closely related, is a claim that fear of density in a pandemic era is sending people streaming to the suburbs and beyond.

Are we moving more?

If there were any truth to the urban exodus stories, you’d expect it to show up clearly in increased migration. But are more people moving now than pre-Covid-19?  One of the clearest ways to get a handle on this is through the US Postal Service.  When people move from one home to another, they fill out a change-of-address form.  The USPS tabulates this data monthly, and helpfully distinguishes between temporary and permanent moves.

MyMove.com, a moving services website, has prepared a new report summarizing these data.  The most compelling chart shows monthly data on permanent changes of address for 2019 and 2020.

If we believe the “urban exodus” theory, we’d expect to see a large and sustained increase in changes-of-address in the months after the advent of the virus, compared to the same month in the previous year.  The data show nothing of the kind.  While there was a jump in moves in the first months of the virus—total moves up by about 21 percent compared to the year earlier in March and 10 percent in April—that surge didn’t persist.  (The month-to-month variation in 2019 and 2020 is pretty similar, suggesting that the variation in movement is normal, rather than unusual.)  Moreover, total changes-of-address declined in May and June compared to the same months in 2019—precisely the opposite of what one would expect if the Coronavirus had prompted people to migrate.  Perhaps the most charitable explanation one can attach to the data is that the pandemic accelerated some moves: moves that would have happened in the summer happened earlier, which would explain the jump in March and April and the subsequent slide in May and June.

In the last three months for which there are data—that is, combining data for May, June and July—changes of address are down 2.5 percent compared to 2019. Bottom line:  if the pandemic had permanently changed expectations, we should be seeing a steady increase in moving compared to a year ago.  We don’t.  The “more moving” myth is busted.

Spinning the moving data

The lack of any increase in permanent moves should completely puncture the “urban exodus” claims, but that’s such a durable meme that those who write about the data can’t seem to bring themselves to concede it’s wrong.  In releasing its report, MyMove breathlessly plays up the urban flight story.  Its release is titled, “Coronavirus Moving Study: People Left Big Cities, Temporary Moves Spiked In First 6 Months of COVID-19 Pandemic.” The body of the release claims (with no data) that:

Now that people can continue with their life remotely, they can do so from anywhere. And so people are leaving big, densely populated areas and spreading out to suburbs or smaller communities across the country — at least for now.

The report’s subhead plays up the statistic that:

Over 15.9 million people have moved during the coronavirus, according to USPS data.

But lots of Americans are always moving. You have to dig a bit deeper to see that 15.4 million people moved in the same months of the preceding year, which means that there’s been only about a 4 percent increase in total moves.  In addition, most of that increase is due to temporary moves, not permanent ones; permanent moves are up just 2 percent.

Once again we have a report that implies that there’s an urban exodus, but then presents data that shows there’s been virtually no change in the level or pattern of moves compared to pre-Covid days.

The MyMove report also repeats the false claim about a connection between urban density and Covid:

The impact of urban density on coronavirus moving trends:

. . .  it’s only logical that large, densely populated cities and crowded spaces present a higher risk of spreading and contracting COVID-19, and that people would relocate to areas with fewer people, where the risk of infection could be lower.

True, denser cities were hit first, but we and others have presented detailed statistical evidence discrediting the “density=covid” claim.  Moreover, in the past two months, the character of the pandemic has completely reversed:  Covid-19 is now a rural and red state plague. New cases are far more prevalent in rural America and small metro areas, while the nation’s densest urban counties now have the lowest number of new cases per capita.

The anti-urban bias that conceals the continued strength of cities

While the copywriters don’t seem to have figured this out yet, the underlying data show that there hasn’t been a surge in permanent migration, and that cities aren’t hemorrhaging residents. The data, unlike the copy, puncture the myth of an urban exodus.

Our recent report, Youth Movement, confirmed the depth and breadth of the long-term trend of well-educated adults moving in large numbers to close-in urban neighborhoods. And the real-time data from real estate market search activity confirmed that cities were still highly attractive, gaining market share in total search activity from suburbs and more rural areas, according to data gathered in April by Zillow and Apartment List.com.

As we pointed out this summer, real-time data on apartment search activity showed that interest in cities increased, rather than decreasing.  Data compiled by  Apartment List.com economists Rob Warnock and Chris Salviati for nation’s 50 largest metro areas between the first and second quarters of 2020 showed interest in cities actually increased in the second quarter compared to the first, relative to other locations, including suburbs, other less dense cities, and rural areas. The findings are the exact opposite of what one would expect if the headlines about an urban exodus were correct. Rather than looking to less dense suburbs, or exploring other states, apartment search activity is focusing more on dense city locations.

In a fact-based world, that would put an end to these “fleeing the cities” stories. What the claims of urban exodus reflect are a persistent anti-urban bias embedded in many of these accounts.  The “teeming tenements” view of cities underlies many of the misleading anecdote fueled stories about people leaving cities. Sadly, the idea-virus that is the urban exodus myth seems just as persistent as the Coronavirus itself. But at City Observatory, we’ll keep working on the vaccine.

The Exodus that never happened

The greatest urban myth of the Covid-19 pandemic is that fear of density has triggered an exodus from cities.

The latest data show an increase in interest in dense urban locations.

At City Observatory, we’ve regularly challenged two widely repeated myths about the Coronavirus.  The first is that urban density is a cause of Covid-19, and the second, and closely related, is a claim that fear of density in a pandemic era is sending people streaming to the suburbs and beyond.

Our recent report, Youth Movement, confirmed the depth and breadth of the long-term trend of well-educated adults moving in large numbers to close-in urban neighborhoods. And the real-time data from real estate market search activity confirmed that cities were still highly attractive, gaining market share in total search activity from suburbs and more rural areas, according to data gathered in April by Zillow and Apartment List.com.

Now, with even more data in hand, the picture remains very much the same.  Apartment List.com economists Rob Warnock and Chris Salviati have done a thorough analysis comparing the pattern of apartment search activity in the nation’s 50 largest metro areas between the first and second quarters of 2020; basically the period just before the pandemic struck with full force, and then the three months during which much of America was reeling from the virus and stuck in lock-down, with lots of time to consider possible new living locations.

Bottom line:  As revealed by apartment search activity, interest in cities actually increased in the second quarter compared to the first, relative to other locations, including suburbs, other less dense cities, and rural areas.

Overall, apartment search activity is up in the second quarter:  More people are looking to move (or perhaps sheltering in place has given us all a lot more time to spend on the computer, and therefore gives more people more time to look at new places).  Either way, the ApartmentList data don’t show any flight from density.  Warnock and Salviati looked to see whether people were looking to move to a higher density city, a lower density city, or a city with about the same density. Nationally, they found that the share of searches for apartments in cities with higher density than one’s current residence grew about four percent, while the share of searches in similar or lower density cities actually declined.

Overall, the data undercut the idea that density considerations are reshaping the demand for apartments.  They conclude:

But even if renters are a bit more likely to search in new locations, they are not eschewing density. If anything, we actually see a slightly greater appetite for density than we did prior to the pandemic. Among users searching beyond their current city in Q2, just over 35 percent are looking for homes in a city with higher population density than where they currently live, a 4 percent increase quarter-over-quarter. Despite density’s bad reputation recently, searches to lower-density cities have actually ticked down 3 percent quarter-over-quarter.

Helpfully, Apartment List.com provides detailed data for the 50 largest markets, so we can see how this plays out in different parts of the country.  It’s especially interesting to look at the New York City metro area.  For the first months of the pandemic, it was the epicenter, with rates of new reported cases far higher than in the rest of the country. (The fact that Covid-19 hit metro NYC so hard and so early is undoubtedly one reason that some leaped to equate density with viral risk). ApartmentList.com’s data show that searches by those living in New York City for apartments in the city increased, relative to searches elsewhere; meanwhile, for those looking at the New York City metro area, the share of searches for apartments in the five boroughs actually increased relative to the rest of the metro area. Between the first and second quarters of 2020, the share of searches by NYC residents within the five boroughs was up 12 percent, while the shares of searches outside the city—whether to an outlying suburb, a different metro or state, or a lower density metro—all declined.


The findings are the exact opposite of what one would expect if the headlines about an urban exodus were correct. Rather than looking to less dense suburbs, or exploring other states, apartment search activity is focusing more on dense city locations:

While stories of Americans abandoning cities proliferate, our search data present a far more nuanced account of COVID-19’s impact on housing choice. Since the start of the pandemic, users have, on the whole, become slightly more likely to search in cities with higher population density than where they currently live. Similarly, searches from suburbs to core cities have become more common, rather than the other way around.

While the New York pattern holds for most metro areas, there are a handful of cities that are seeing a decline in central city search activity, but as Warnock and Salviati point out, there are local factors in each case that likely explain these trends. Boston has seen a decline in central city search share relative to suburbs, as have San Francisco and Chicago. In San Francisco, high housing costs coupled with increased remote working opportunities may have blunted demand, but that has mostly shifted to nearby markets like Oakland and Berkeley. In Boston’s case, the softening of housing markets may be related to declining demand from college students, who make up a disproportionate share of renters and who are facing extreme uncertainty about school schedules.

As we’ve pointed out, irrational fears about the negative health effects of urban living have a long history in the US.  The “teeming tenements” view of cities underlies many of the misleading anecdote fueled stories about people leaving cities. But the hard data show that suburbs and sprawling sunbelt cities are just as vulnerable to the Coronavirus, and while poverty and housing overcrowding are risk factors, there’s nothing about urban density itself that intensifies the spread of the disease.

The toxic flood of cars, not just the freeway, crushed Albina

Restorative Justice & A Viable Neighborhood

What destroyed the Albina community?  What will it take to restore it?

It wasn’t just the freeway but the onslaught of cars that transformed Albina into a bleak and barren car-dominated landscape.

In the 1950s, Portland’s segregation forced nearly all its African-American residents to live in or near the lower Albina neighborhood.  During that decade, Albina was one of the city’s densest neighborhoods, full of houses, apartments and local businesses.  In the late fifties and early sixties, the construction of Memorial Coliseum, urban renewal and, finally, the I-5 freeway all triggered the neighborhood’s decline. Everyone now acknowledges that the construction of the I-5 Freeway through Albina was a classic example of the Oregon Department of Transportation’s institutional racism.

 

The construction of ODOT’s freeways triggered a wave of demolition and abandonment that caused the neighborhood’s population to fall by two-thirds.  To be sure, ODOT demolished hundreds of homes.  But the coming of the freeway and a flood tide of car traffic swept away the remaining population of the neighborhood.

The big question going forward is how to right that wrong. If we’re seeking restorative justice, what will it take to make a viable neighborhood?

It was the cars, not just the freeway, that destroyed Albina

It wasn’t so much the freeway that undermined the health of the neighborhood as all the cars the freeway brought. A better place for cars means a worse place for people.

Construction of the freeway was only part of the problem. Yes–the freeway wiped out more than 300 houses as it sliced through North and Northeast neighborhoods. But the real damage was done after construction was complete:  it was the flood of cars, chiefly suburban commuters and shoppers, that transformed the area from a viable neighborhood to a car-choked collection of parking lots, gas stations, and auto-oriented businesses. Most of the decline in population in Albina happened years after the freeway was built. The flood of cars undercut neighborhood livability, and population steadily declined in the 60s, 70s and 80s. As people moved away, neighborhood businesses that served local residents, many owned by African-Americans, died.  More cars, fewer people, fewer businesses, and a shrunken, impoverished neighborhood.

ODOT’s proposal to spend as much as $1.45 billion is supposed to revitalize the neighborhood. But the flaw in the proposed freeway widening is that it makes both these problems–the freeway and local car traffic–worse. It widens the freeway, pushing it even closer to other uses like the Harriet Tubman Middle School, and funnels even more cars not just onto the freeway but onto local streets, like Vancouver, Williams, Broadway and Weidler. Plus, ODOT would cut back the corners of major intersections to enable cars to drive even faster on these streets, and its relocated freeway ramps would add more than a million miles of car travel to local streets.

Capping a small segment of the freeway does not create a neighborhood

We’re told that “caps”—actually just slightly wider freeway overpasses—will somehow magically heal the neighborhood.   But again, the problem is not just one or two blocks; it is the surrender of the whole neighborhood to transient uses (like the Moda Center and Convention Center) and the demolition of housing and neighborhood-scale streets. The construction of I-5, combined with the Memorial Coliseum, obliterated the dense street grid that underpinned Albina.

A few blocks of buildable land, sitting directly over the noise and pollution of the widened I-5 freeway and surrounded by fast-moving traffic on arterials and freeway on-ramps, is a pedestrian-hostile environment where no one will want to linger and no business will thrive.  The one example that freeway advocates point to, a cap over I-80 in Reno, houses a Walgreens drugstore and its parking lot; a transient, car-oriented use, not the hub of a revitalized neighborhood.

Development on the I-80 Freeway Cap in Reno

A freeway cap is no place for parks, sidewalk cafes, restaurants or grocery stores.  It’s just another desolate off-ramp, designed for the needs of outsiders and travelers, not local residents.  The proposed highway cap would be bisected by the northbound freeway on-ramp to I-5, as well as four major arterials carrying traffic to and from the freeway (Vancouver, Williams, Broadway and Weidler).  The project’s caps can only hold 3- to 5-story “lightweight” buildings, and ODOT is providing no funding to build anything on the site.  It doesn’t even know what it will put atop its bare concrete covers.  This is only a restorative space if you’re a highway engineer.

A neighborhood for the people who live there, or people driving through?

In cities across the country, freeways and the car-dependent development patterns and urban depopulation they’ve engendered have divided and decimated historically Black urban neighborhoods.  As the Los Angeles Times editorialized:  the real monuments to institutional racism are the nation’s urban freeways.

Freeways, and the high-speed on-ramps and arterials that feed them, privilege people in cars driving through a neighborhood over the people on foot, biking, breathing and just living in the neighborhood.  ODOT’s proposed I-5 freeway widening privileges the interests of peak hour car commuters from Clark County (75 percent white and with average incomes of $82,500) over the kids who attend Tubman School (two-thirds kids of color; half on free and reduced price lunches), and others who live in the neighborhood (where fewer than half of workers drive alone to work: most walk, bike or take transit, according to census data).

If we want to make amends to those whose neighborhood was devastated by the I-5 freeway and its car-dependent development pattern, the last thing we should do is widen the freeway that caused this problem. And freeway covers, no matter how elaborate or expensive, won’t replace the housing lost or the damage done to the fabric of the entire Lower Albina area.

 

“Let them drive Teslas” is not a climate or a justice plan

Portland’s climate emergency efforts are tarnished by an inability to plainly speak the facts about climate change

But the tragic fact is that the city is utterly failing to meet even its own previous goals, and more alarmingly, isn’t owning up to the failure of its 2015 plan to reduce emissions.

Instead, the Bureau of Planning and Sustainability is promoting trivial and incorrect stories about greenhouse gas emissions.

On June 30, the City of Portland adopted a climate emergency declaration. It steps up the city’s 2030 goal to a 50 percent reduction in emissions from 1990 levels (the previous plan set a 40 percent goal). The resolution declaring the emergency contains some sensible findings:  for example, explicitly acknowledging that “expanding roadways does not solve congestion but leads to additional vehicle miles and carbon emissions.” The resolution is an expression of concern, and a promise to do more, but rather than specify concrete steps, it simply says that the city will, later this year, “co-convene” a process to talk about what we might do to reduce greenhouse gases while promoting climate justice.

You’d hardly know it from reading the climate emergency resolution, but as we wrote earlier, Portland has essentially failed to make any progress in reducing its greenhouse gas emissions under the Climate Action Plan it adopted five years ago.

On July 10th, Andrea Durbin, Director of the City’s Planning and Sustainability Bureau, appeared on Oregon Public Broadcasting’s “Think Out Loud” to discuss the city’s climate emergency declaration.  She was interviewed by host Dave Miller, who quickly cut to the chase:

Dave Miller:
As you know, emissions reductions countywide have stalled in recent years and if I understand the numbers correctly, it’s largely because of an increase, over the last five years, in emissions from the transportation sector.  In other words, even though the city has this climate action plan, compared to five years ago, emissions from people driving around are going up—it’s the opposite direction.  What’s not working?
Andrea Durbin:
Well, that’s correct.  More people are driving.

The only way to attain the city’s stated goals is to reduce vehicle miles traveled, as the technical analysis done for the 2015 plan made very clear:  It said that the city would need to reduce vehicle miles traveled (VMT) by more than half by 2050 in order to achieve the planned greenhouse gas reductions. Instead VMT (pre-pandemic, at least) increased sharply after gas prices declined in 2014, and greenhouse gas emissions went up as well—by 1,000 pounds per person per year in the Portland metro area.

The new climate plan will fail unless it, too, reduces driving. The Portland City Council laudably adopted Commissioner Chloe Eudaly’s proposed amendment to require demand management, such as congestion pricing, before allowing any new freeway capacity to be built in Portland.  While that’s a good step, Planning and Sustainability Director Andrea Durbin could only point to two things as concrete examples of how to reduce transportation emissions:  reducing freight-related emissions by electrifying delivery vehicles, and mandating electric vehicle charging stations in new apartment buildings.

Climate justice: Let them drive Teslas?

The city’s new climate direction puts considerable emphasis on helping “front-line communities,” which it defines to include low income Portlanders and people of color. It’s absolutely correct that climate change affects low income people more than others, which is why fighting climate change is intrinsically just. But Durbin’s example of how the plan will help low income Portlanders strains credulity:

For instance, in transportation, we will be doing work, again with multi-family housing, rental housing, to require electric vehicle hookups in those buildings as they’re being built, because we know there’s really an opportunity and a need to provide electric vehicles for communities of color, low income residents, because it is less costly to operate and so if we can provide those opportunities, and build out the infrastructure so that people have more choices, to pay less for their energy bills, or pay less for their gas costs because they’re driving an electric vehicle to work, those are the kinds of solutions that we’ll be looking for and trying to enable.

It’s hard to overstate how small and potentially counterproductive a step requiring electric vehicle charging facilities is for fighting climate change or aiding the poor. Many poor households can’t afford cars at all; none can afford Teslas (MSRP:  $37,990) or other new electric vehicles.  Many can’t drive, due to age or infirmity. And aside from a few hundred newly built affordable housing units, few will be able to afford to live in newly constructed apartments with vehicle charging hookups.

Let them drive Teslas?

In addition, requiring parking (and attendant charging infrastructure) will drive up the price and likely reduce the supply of new apartments, which could collectively be worse for the economic prospects of low income households. Eliminating parking and parking requirements (including ones with built-in charging) would do more to reduce greenhouse gases and promote affordability; parking requirements add an estimated $200 a month to apartment rents. Subsidizing electric cars is one of the most expensive ways to reduce carbon emissions and chiefly benefits higher income households.  And the plan doesn’t even acknowledge the vast subsidies the city provides to motorists, with free and under-priced parking on public streets.  It’s also paradoxical that the city charges a per-mile fee for the use of electric scooters—the one type of electric vehicle that is arguably within the economic reach of low income populations—that is about ten times higher than the comparable fees charged to gas-powered cars.

Freight follies: On-line shopping reduces greenhouse gases

Pushed by Think Out Loud Host Dave Miller to elaborate on how the city would grapple with rising transportation emissions, Durbin singled out freight and specifically increased e-commerce deliveries as a source of rising emissions.

We’re also seeing an increase in emissions from the freight sector as more and more people are ordering their goods and products on line, on Amazon, and having it delivered. And we need to make sure when they’re having it delivered, they’re being delivered in clean freight vehicles. And that also has direct benefits for communities of color, and low income communities, because it’s these communities that are most directly impacted by the air pollution.

While much talked about, there’s actually no data to support the claim that Portland’s emissions growth is attributable to either freight generally, or e-commerce deliveries in particular.  The city’s most recent climate inventory, produced by Durbin’s office and released late last year, contains no greenhouse gas estimates for freight or deliveries, and in fact doesn’t contain the words “freight,” “trucks,” or “e-commerce.” A single reference to “delivery” discusses the operation of the electric grid. As we’ve demonstrated at City Observatory, rising transportation emissions are almost entirely due to an increase in driving following the decline in gas prices in 2014. Also, the available evidence on e-commerce is that it reduces greenhouse gas emissions, by reducing the number, frequency and distance of private car shopping trips. Each UPS, FedEx or USPS delivery vehicle takes dozens of Suburbans and Subarus off the road.  MIT economist Will Wheaton estimates that each dollar spent on e-commerce generates about 30 times fewer vehicle miles of travel than conventional brick-and-mortar shopping.


Delivering packages and reducing urban traffic congestion and greenhouse gas emissions! Credit: Jason Lawrence, Flickr

Focusing on electric vehicle charging stations in new apartments and e-commerce deliveries, two categories that have almost nothing to do with the city’s failure to meet its greenhouse gas reduction goals, signals that the Bureau of Planning and Sustainability isn’t taking climate change seriously. This fashionable trivia diverts attention from the much larger and more difficult steps we’ll need to take to reduce greenhouse gases. The city’s 2015 plan made it plain that achieving a more modest goal would require a huge reduction in driving; the new declaration and the BPS director are virtually silent on this question, despite an even more aggressive goal. Soaring rhetoric about front-line communities is no excuse for a climate plan that simply won’t own up to past failures and lay out policies and strategies that will actually work at scale.

Editor’s Note:  Revised August 18 to include additional references on electrification.

 

CityBeat: NPR’s suburban flight story

Yet another entry in the trumped-up pandemic-fueled suburban flight narrative

Anecdotes aside, there’s no data that people are fleeing cities to avoid the Coronavirus

The data show young, well-educated adults moving to urban centers everywhere, and no decline in interest in urban markets during the pandemic

As we’ve chronicled at City Observatory, there’s a welter of press accounts claiming that people are fleeing cities to escape the Coronavirus.

The latest installment in this genre is Uri Berliner’s piece for National Public Radio.  Titled “New Yorkers Look To Suburbs And Beyond. Other City Dwellers May Be Next” it has the typical lede:  The pandemic is driving people from cities in droves:

Trends often start in New York. The latest: quitting the city and moving to the suburbs. If not quite an exodus, the pandemic has sent enough New Yorkers to the exits to shake up the area’s housing market.

Like the other stories of its ilk, this one is organized around the vignette of a wealthy professional couple living in a city apartment who have decided to chuck it all and move to a leafy, bucolic suburb. In this case, it’s Miriam Kanter and Steven Kanaplue, who lived on Manhattan’s Upper West Side, but were exhausted and appalled by the toll of the Coronavirus. They recently moved to a single family home in Montclair, New Jersey.  We’re also treated to some self-promotional market hype from a local real estate agent in New Jersey; the “hypercompetitive bidding” for Montclair homes, we’re told, is a “blood sport.”

Smiling, but their move to this suburb increased their risk of contracting Covid-19 by about one-third (NPR).

But there are a number of key facts that spoil this just-so story.

First, we learn, several paragraphs down into the story, that Kanter and Kanaplue were already looking to move out of the city before the virus struck. There’s no question that as people age, they tend to suburbanize; there’s no news here.

Second, there’s a deep flaw in the “flight to the suburbs” theory—the virus is hitting New York suburbs even harder than it hit the city. The couple will no doubt be disappointed to learn that their choice of residence, suburban Essex County, New Jersey, has an even higher rate of Covid-19 cases per capita than the county they left (New York County, which corresponds to Manhattan).  According to the USA Facts tabulation, Essex County has about 2,368 cases per 100,000 population compared to 1,776 cases per 100,000 in Manhattan. Overall, based on the population data, the odds of getting Covid-19 are about one-third greater in their new suburban location than in the urban location they left. But the couple—and NPR—never bother to verify that the fundamental premise behind the story (that Covid-19 is somehow worse in cities) is actually true—because it’s not.
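The “one-third greater” figure follows directly from the two per-capita rates quoted above. A minimal sketch of the arithmetic, using only the USA Facts numbers cited in this paragraph:

```python
# Covid-19 cases per 100,000 population, as quoted above (USA Facts).
essex_rate = 2368      # Essex County, NJ (the suburban destination)
manhattan_rate = 1776  # New York County (Manhattan, the urban origin)

# Ratio of the suburban rate to the urban rate.
relative_rate = essex_rate / manhattan_rate
print(f"Suburban case rate is {relative_rate:.2f}x the urban rate, "
      f"about {(relative_rate - 1) * 100:.0f} percent higher")
# → Suburban case rate is 1.33x the urban rate, about 33 percent higher
```

This treats county-level case rates as a rough proxy for individual infection risk; it ignores within-county variation, but it is the comparison the story itself invites.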

Third, there’s a very good question about whether any anecdote about one couple moving signals a shift in the market. While it may make for less charming radio, some actual data would be useful.  The NPR story is leavened with quotes from Redfin’s CEO Glen Kelman about the increased volume of searches for single family homes.  But as we’ve documented at City Observatory, the overwhelming trend, especially among well-educated young adults, is a movement toward cities.  The phenomenon is both widespread (every single large metro area chalked up gains in its number of 25-34 year-old college-educated adults since 2010) and accelerating nearly everywhere.  Moreover, real estate search data monitored at the height of the pandemic by Zillow and Apartment List report no decline in interest in urban locations like New York; in fact, Zillow’s data show searches for urban properties outpacing those of suburban properties, compared to the pre-pandemic era.

The media, even usually reliable sources like National Public Radio, seem eager to embrace the pandemic-powered urban exodus myth.  If they’d bother to look closely at the data, they’d be reporting that moving to the suburbs does nothing to insulate one from the virus (especially in New York), and that if anything, the trend toward city living seems resilient in the face of the virus.

 

What about reparations for people?

ODOT proudly spends road funds on mitigating the impact of its highways:  if you’re an invertebrate.

The highway department mitigates noise pollution, rebuilds jails, and even compensates neighborhoods

But if it repeatedly pushed highways through your neighborhood, all you’ll get is condolences, wider overpasses, and pictures of housing for which there’s no money

The Oregon Department of Transportation regularly spends millions of dollars to mitigate and offset the damage its highway projects do to plants and animals, to wetlands, and to the quiet of residential neighborhoods.  They’ve made a practice of creating mitigation banks that offset past and future damage from highway projects. And they’ve even set up permanent funds to provide for the maintenance of these areas over decades.

ODOT talks a good game about its interest in offsetting the damage freeway construction did to the historically Black neighborhoods of North-Northeast Portland, but the damage they really did was in wiping out housing, and destroying the critical mass of population in the neighborhood, turning it into a car-dominated landscape of parking lots, drive-throughs and gas stations.  At City Observatory, we’ve shown that ODOT highways were principally responsible for decimating the Albina neighborhood. ODOT’s Highway 99W (Interstate Avenue) cut the neighborhood off from the river and demolished houses in 1951; the I-5 Freeway cut the neighborhood off from the rest of North and Northeast Portland in 1962, and the unfinished Prescott Freeway flattened another portion of Albina in the early 1970s. Together these freeway projects and the changes they enabled caused the neighborhood’s population to decline by two-thirds. (The red bounded areas below were largely destroyed by ODOT highway construction).

The most tangible way to achieve “restorative justice” in Albina is to do what the agency routinely does around the state:  restore the habitat it destroyed.  In an urban setting, that habitat is, most critically, housing.  ODOT demolished hundreds of housing units for the I-5 Freeway, and the traffic its roads generated destabilized the neighborhood and wiped out most of the rest.  Building housing, lots of it, in Albina, a neighborhood that’s poised for a comeback, would be a fitting way to mitigate the harm it has done.

Instead, the agency seems bent mostly on a superficial effort to sell a wider overpass. If it’s serious about reparations for the damage it did to Albina, ODOT should be making a major contribution to restoring housing in the neighborhood.

ODOT routinely pays to mitigate the damage its highways do

It’s hardly far-fetched to suggest that ODOT fix the damage that its projects do to the surrounding environment.   ODOT has established mitigation banks to restore disturbed wetlands, fish spawning areas and seasonal ponds.  It pays for extensive sound walls to block the noise generated by highways that affect nearby homes and businesses.

Here’s a classic example:  In Southern Oregon, highway construction has wiped out hundreds of acres of “vernal ponds”—areas that flood in the rainy season and dry out the rest of the year.  These ponds are home to some distinct plants and animals, like the fairy shrimp and woolly meadowfoam.  Recognizing the damage that previous freeway construction had done, and in an effort to offset that (and likely future damage), ODOT has created an 80-acre conservation bank as part of its project to expand Highway 62 in Southern Oregon.  In all, ODOT has more than 200 acres of oak savannah and vernal pools in Southern Oregon. Importantly, ODOT’s own report describes the establishment of the habitat as “compensatory” for the damage done by past, present and future roadbuilding.

Perhaps even more importantly, ODOT’s investment is actually prospective:  It recognizes that its future actions will likely cause even more damage to the environment, and it has created more habitat than it has destroyed, in effect to serve as a bank to offset likely future damages. ODOT paid to acquire the land, and set up a half-million dollar endowment to assure that the habitat is maintained in perpetuity.

The Oregon Transportation Commission has already approved nearly $3 million in funding for another mitigation bank called the Columbia Bottomlands.  It’s designed to offset damage to wetlands from future projects, including the proposed effort to revive the Columbia River Crossing (Oregon Transportation Commission, December 6, 2019, Consent Agenda, Item 13).

Other major ODOT projects have similar mitigation expenditures.  For the Newberg-Dundee bypass, ODOT is creating twice as much wetland habitat as the project is destroying.  For the Pioneer Mountain-Eddyville project, ODOT paid for improving fish habitat on nearby streams.  ODOT makes all these payments to mitigate environmental damage, in part, because it is required to do so by the National Environmental Policy Act.

Sound walls to mitigate freeway noise

When freeways were first built in the 1950s and 1960s, the state highway department did nothing to offset the effects of noise on nearby homes and businesses. But since the 1970s, it has become commonplace to spend a portion of any project’s budget on “noise walls” to buffer nearby uses. An extensive set of sound walls is proposed as part of the I-5 Rose Quarter Freeway widening project.

ODOT’s guidelines suggest that it will pay up to $25,000 per residence for sound reduction.  Nationally there are more than 2,700 miles of sound walls; in just the three-year period 2008-2010, the Federal Highway Administration spent roughly half a billion dollars on sound wall construction.  Recently, ODOT spent $2.6 million on a sound wall to benefit several dozen homes in South Salem near I-5.  It’s particularly important to note that much sound wall construction was remedial and retrospective: the first noise walls were built years or decades after the freeways themselves; they were an intentional effort to correct past damage.

Highway money for jails and some neighborhoods

In 1980, the right of way for the I-205 East Portland Freeway passed through the site of the Rocky Butte Jail.  The Federal Highway Administration paid $53 million (over $160 million in today’s dollars) for the construction of the Multnomah County Justice Center in downtown Portland.  

Paid for with federal highway dollars, 1980.

There’s even precedent for compensating for the negative effects of freeways on surrounding human neighborhoods.  When ODOT built the I-405 freeway through Northwest Portland in the 1970s, it created a multi-million dollar community development fund.  The chief difference:  Northwest Portland is a largely white neighborhood; Albina was a largely Black neighborhood. The fund still exists and is now administered by the Oregon Community Foundation, but still gets its funding from highway-related revenues.  Among other things, the fund has supported the renovation of the Hillside Community Center in the Northwest hills.  For the record, when ODOT built the stub-end of the never-completed Prescott Freeway through Albina in 1973, it didn’t create a similar fund for the predominantly Black neighborhood devastated by construction.

Other governments have sought to mitigate their neighborhood impacts by building housing.  The University of California San Francisco’s medical campus has agreed to build nearly 1,300 units of housing, with one-quarter of them reserved as affordable, as part of a community benefits package associated with the approval of the expansion of its Parnassus Heights Hospital.  And this is being paid for by the University, not the city. The explicit purpose of the agreement is to offset potential damage to the neighborhood according to San Francisco Supervisor Dean Preston:

“Just as a doctor’s oath requires them to do no harm to their patients, we want to make sure that the UCSF development plans do no harm—and in fact, benefit—surrounding communities and the city,”

The need for reparations in Albina

Just as fish ladders on dams only partially mitigated the epic destruction of native salmon runs in the Pacific Northwest, building covers over a widened freeway is only the most minimal mitigation of the damage done to Portland’s historically Black Albina neighborhood.

ODOT’s map of the project area makes it clear that in 1954, prior to the construction of the freeway, there were hundreds of houses, and that the gridded, walkable fabric of Albina was very much intact.  It wasn’t simply the construction of the freeway that transformed the area, it was the flood tide of car traffic that rendered much of the area inhospitable to residential habitation.  As we’ve documented at City Observatory, these ODOT highway projects led to the loss of hundreds and hundreds of housing units and to a decline in Albina’s population from more than 14,000 in 1950 to about 4,000 in 1980.

 

ZGF description of Albina Neighborhood Urban Form, 1954

Whether we’re talking mitigation or reparations, ODOT owes this neighborhood something much better than a few hundred square feet of noisy, largely unusable or undesirable space on a freeway cap, surrounded by an increasing flood of automobile traffic.  Instead, it should be looking to mitigate the damage to the neighborhood in exactly the same fashion it did for Highway 62 in Southern Oregon:  by restoring the pre-existing habitat.  The way to do that is not through freeway covers, or a single building site atop a freeway overpass, but by subsidizing the construction of hundreds of housing units that can restore the density, urban form and walkability of Albina that existed prior to the freeway’s construction.

The Rose Quarter project’s own public outreach process highlighted the construction of permanently affordable housing as a key strategy for restoring community wealth in Albina. (Executive Steering Committee presentation, March 22, 2021).

And while ODOT’s “Independent Cover Assessment” consultants have produced illustrations showing how as many as 750 new apartments could be built on or near the Rose Quarter project, ODOT hasn’t come up with a single dime of the $160 to $260 million it would cost to build the housing depicted in its images.

To sum up:  ODOT can spend real dollars when it comes to protecting vernal ponds, restoring wetlands, improving fish habitat, or lessening sound impacts on nearby houses.  It can use highway dollars to replace jail cells. It can even contribute money for neighborhood improvement—at least in historically white neighborhoods.  But when it comes to North/Northeast Portland, where its freeways intentionally targeted and wiped out hundreds of housing units (which it never replaced), and where its traffic destabilized the neighborhood and led to decades of decline, ODOT has, at most, condescending rhetoric and feeble, misleading promises about “freeway covers” that are in fact badly fragmented, noisy and pedestrian-hostile land perched over a widened I-5 freeway.

Why doesn’t ODOT recognize the habitat of Albina as worthy of protection and recompense?  If it can restore twice as much wetland as a freeway destroyed, why can’t it restore twice as much housing as it demolished?

 

Dominos falling on Rose Quarter freeway widening

Last week, over the space of about 24 hours, the prospects for Portland’s proposed Rose Quarter freeway widening dimmed almost to extinction.

Leaders of Portland’s African-American community have concluded that the Oregon DOT had no intention of altering the project in response to community concerns, and when they withdrew, a host of local leaders pulled out as well.

The duplicity of ODOT is well-known to seasoned veterans in the region.

For several years, the Oregon Department of Transportation has been proposing to widen the I-5 freeway opposite downtown Portland near the Rose Quarter. At City Observatory, we’ve taken a close look at the project’s flawed traffic forecasts, environmental projections, phony safety claims, impact on cyclists and pedestrians, deceptive public relations, climate impacts and cost overruns. In addition to all these issues, the project doubles down on the historical insult to Portland’s African-American community; promoting more traffic and pollution in the neighborhood destroyed by the freeway’s original construction, and moving the roadway even closer to Harriet Tubman Middle School, which serves primarily children of color. While ODOT has offered up the feeble palliative of partial “covers” over the freeway (in reality glorified and traffic-choked overpasses), the project now appears to have foundered over opposition from the local African-American community, led by the Albina Vision Trust (AVT).  For the past several years, the trust has been pushing for a plan to revitalize the neighborhood to promote restorative justice for the damage wrought by the freeway and urban renewal. The decision by AVT to pull its support for the freeway widening appears, in very short order, to have tipped the political balance against the freeway widening project. Here’s a chronology that shows just how fast these events happened.

Dominoes falling on the Rose Quarter Freeway widening project

June 30, 2020 12:06 pm Item: Albina Vision Trust (AVT) renounces its support for the freeway, and withdraws from the project’s “Executive Steering Committee.” The Oregonian’s Andrew Theen tweeted:

In a June 30 letter to the Oregon Transportation Commission, Albina Vision Trust Director Winta Yohannes wrote:
Effective immediately, the Albina Vision Trust is withdrawing its participation from all engagement with the I-5 Rose Quarter Improvement Project.

The AVT team has engaged in this project for two-plus years, which has included the dedication of significant board and staff resources since the “reset” in January 2020. We believe that this project presented ODOT and its partners the opportunity to consider how a major public investment could be in service of a broader community vision for healing, the generation of short and long-term wealth creation opportunities, and caring for our children who are here today and those who live in lower Albina in the future. Despite our good faith efforts, we do not see our engagement resulting in meaningful changes to the project or its anticipated outcomes. For this reason we can no longer support the project. To be clear, we do not assign any individual claim. We are called to adhere to our organizational responsibility to ensure our actions and resources are aligned with our stated values.

June 30, 2020 12:47 pm Item: Portland Transportation Commissioner Chloe Eudaly announces she, too, is opposed to the project, and is off the Committee as well.

. . . after the first executive steering committee, it became clear to me that ODOT was determined to move forward with the project as planned, that they were resistant to congestion pricing, that the steering committee was to be treated as an advisory body with no governing authority, that ODOT did not seem to grasp the concept of restorative justice, and we were unlikely to achieve the outcomes we were seeking.

June 30, 2020 1:31 pm Item:  Portland Mayor Ted Wheeler follows suit.

June 30, 2020, 9:00 PM.  Item:  Portland City Council adopts its long awaited climate emergency resolution.  The change of heart on the freeway widening comes at an auspicious time, as OPB’s Rebecca Ellis noted:

Hours earlier, this resolution would have placed council members in the uncomfortable position of advocating for policies to mitigate climate change, while simultaneously supporting a freeway widening project that climate activists have argued will increase traffic and worsen emissions.

The Council voted unanimously to adopt an amendment proposed by Commissioner Eudaly strengthening the cities requirement that demand management measures like congestion pricing be implemented before the city will support any new freeways.

Be it resolved, that since freeway expansions disproportionately harm communities of color and increase carbon emissions, the City of Portland will require demand management,  implemented equitably and in close cooperation with BIPOC communities, before any future freeway construction or expansion project.

The change strengthens the existing provisions of the City’s adopted Central City Plan, part of its comprehensive plan,

Wheeler amplified on his earlier remarks backing away from the project, according to OPB:

“ODOT has not met our goals around community and economic development or climate,” said Wheeler. “And going forward I will look forward to collaborating with transportation Commissioner Eudaly to prioritize congestion pricing strategies for our existing freeways within Portland.”

July 1, 2020 11:00 AM Item: At her weekly press conference (focused mainly on the response to Covid-19), Governor Kate Brown is asked about the Rose Quarter Freeway.  She too indicates the project won’t go forward for lack of support from the African-American community:

. . . in terms of the Rose Quarter, I think we all need to take a look at all of these transportation projects in light of the economic disruption that we’re seeing statewide and frankly, nationally.  That said, we’re not going to proceed with this particular project—with the Rose Quarter project—without community support and engagement from the Black and African-American community. It’s my hope that this particular project can be part of righting historic wrongs and I’m committed to bringing people back to the table for that discussion.

As The Oregonian reported, Brown’s statement doesn’t say she’s backing away from the project; instead it’s clear she’ll try to get Albina Vision back to the table:

Brown did not say that she had pulled her support from the Rose Quarter project, which the Oregon Legislature authorized and agreed to fund under a mammoth 2017 transportation package. But she said, “It’s my hope that this particular project can be part of righting historic wrongs.” And she said she is “committed to bringing people back to the table” to discuss how the planned changes to the interstate and its surroundings can and should accomplish that.

Portland’s Stop Work Order

Portland’s City Council clearly illustrated their current position on Monday, July 6th by unanimously signing a letter to city staff, directing them to immediately stop work on the project.  (City staff have been tasked with working with ODOT on project planning and survey work). The July 6 letter reads, in part,

Effective immediately, we are directing all City Bureaus to suspend all operations until further notice related to the I-5 Rose Quarter Improvement Project. This includes attending meetings, providing technical support or responding to project emails.

The net effect of the local political exodus from the Rose Quarter Steering Committee, according to Oregon Public Broadcasting’s Jeff Mapes, is a “major blow” to the project. Mapes says that while ODOT is trying to say it will work with its “now estranged local partners,” it will be “hard for Wheeler and Eudaly to reverse course, and it’s likely that ODOT will have to retool the project.”

Albina Vision:  Disrespected. Will they deal?

Project backers are desperately looking for a way to get the Albina Vision Trust, in Governor Brown’s words, to “return to the table.” How likely is that to happen? In the past few days, AVT leaders, speaking in media interviews, have conveyed the roots of their disenchantment with ODOT.  They run deep.

Speaking with Willamette Week’s Mark Zusman, Yohannes made it clear that her organization’s opposition didn’t stem from narrow concerns about freeway caps or air quality near the Tubman middle school, but rather from ODOT’s intransigence about the scope of the project and the need to redress the historic wrongs done to the district.

There is not one or two issues. All of us who have signed on as partners in many letters to the OTC over the last year—that includes Albina Vision, the Mayor’s Office, Commissioner Eudaly’s office, Metro, and Multnomah County Commissioner Vega-Peterson—we have always said that the way that you can address these issues is by comprehensively looking at the project. If there is a process to evaluate the entire scope then you can actually problem-solve these issues one by one. What we never got from ODOT was a willingness to explore with us what is worth having for a half a billion to a billion dollar investment. And if what we have is a project which you described, then that’s not enough. It’s so much easier to reduce this down to an issue about caps or an issue about the school, when really it’s a comprehensive issue about the Department of Transportation once again looking at our communities as one that they can run through without considering the people and communities that exist around it today and in the future. (emphasis added)

Albina Vision board member Zari Santner, speaking on OPB’s Think Out Loud on July 7 explained:

“We have not received any indication that they are willing to change the plans that they have for this freeway work since ten years ago, and we have finally decided that they are interested in talking about change but not really actually making any change.”

We’ve seen a lot of talk about process, but no indication that the process was going to lead to an outcome that would really satisfy all the people who were sitting around the [Executive Steering Committee] table. As I said, it was talking about change, but no change. At the June 21 meeting of the Executive Steering Committee, a draft charter language of this committee was put forward by ODOT and included in it was, and I quote “move forward the House Bill 2017 defined I-5 Rose Quarter Improvement Project,” which basically means the project that was presented to the Legislature in 2017, without any modification.

No one should be surprised that ODOT’s “involvement” efforts with stakeholders are simply a sham to dodge and delay, while ODOT blunders ahead with the freeway project it always wanted. In fact, the motto of the agency’s new “Urban Mobility Office” spells this out explicitly:

From the Urban Mobility Office’s May 27, 2020 presentation to the Oregon Transportation Commission.

“Process is the project”: When the bureaucrats tell you that their job is to shine you on, you should take them at their word.

Analysis:  Why the project failed

In addition to the concerns raised by Albina Vision, there are powerful, and essentially uncontradicted, arguments against the project on traffic, environmental, safety, urban design and other grounds. More than 90 percent of the over 2,000 comments submitted on the project’s environmental assessment opposed the freeway widening. The fact that key political leaders (notably Portland Mayor Ted Wheeler and Metro President Lynn Peterson) could claim that the project produced benefits for the African-American community was the thin reed that allowed the two to ignore these copious arguments (and widespread community opposition).  This dynamic was captured by Bike Portland’s 2019 editorial cartoon on the subject:

For a time, Portland Mayor Ted Wheeler and Metro President Lynn Peterson could use Albina Vision as an excuse for a badly flawed, widely opposed freeway widening project; No more. (Concept: Jonathan Maus/ Bike Portland – Illustration & Copyright: Cloe Ashton, May 2019).

The fate of the project nearly came to a head earlier this year, when the Oregon Transportation Commission elected to bull ahead with the project without preparing a full Environmental Impact Statement (EIS), instead choosing to rely on a much thinner (and deeply flawed) Environmental Assessment. Key leaders, including Portland Mayor Ted Wheeler, Multnomah County Commissioner Jessica Vega-Peterson, Metro President Lynn Peterson and Oregon House Speaker Tina Kotek, had been asking for a full EIS to address the project’s manifold problems.  They and others relented as part of a privately negotiated “reset,” in which ODOT agreed to create a so-called “Executive Steering Committee” to guide the project. The ESC, which held its first meeting May 22, included representatives from the City, Multnomah County, Metro, Portland Public Schools, and the Albina Vision Trust.  But as Albina Vision’s statements show, they now realize that the executive steering committee is simply bureaucratic window dressing, there to create the illusion of involvement and consent; it has no power to modify the project in any significant fashion.

David Bragdon, former President of Metro, and now director of the Transit Center in New York, explained to Bike Portland why the effort failed.

“This final gambit was doomed by its core fallacy, the Governor’s apparent belief that a faux ‘engagement’ process manipulated by the State Highway Department with a predetermined outcome in mind would somehow produce the ‘right’ way to do something that is inherently wrong. The flaw in that assumption is that there IS no ‘right’ way to inflict more traffic and pollution on children of color and a waterfront park, and create more congestion – which is inevitably what this project was going to do, pretty drawings and insincere promises about caps notwithstanding. The State Highway Department tried to variously either co-opt, dupe, bully or bribe everyone, with falsified traffic forecasts, fraudulent fiscal fantasies and general incompetence and bad faith. Those standard ODOT tactics have now earned widespread, inalterable opposition from the community and a majority of local elected officials.”

Bragdon knows whereof he speaks: A decade ago he led Metro when ODOT was pushing forward with the failed Columbia River Crossing project. Just as now, ODOT promised concessions to local governments and community groups, but then ignored or reneged on those promises as the process wore on. For example, ODOT acceded, on paper, to then-Mayor Sam Adams’ insistence that the highway bridge be reduced from 12 lanes to 10, but instead of shrinking the bridge, it simply erased all the references to its actual physical width. Another example: a similar highway cap in Vancouver, Washington, promoted as an elaborate landscaped connector between downtown Vancouver and historic Fort Vancouver, was reduced to “bare bones” when the project went through “value engineering.”

This dog ain’t dead yet

This rapid turn of events certainly doesn’t bode well for the future of the Rose Quarter freeway widening project, but it’s not enough to kill it—yet. For reference, the Washington and Oregon legislatures pulled the plug on the $3 billion Columbia River Crossing project (just a few miles north on I-5 from the Rose Quarter) in 2013.  But the project has lingered on, zombie-like, as a footnote buried deep in project plans. And sure enough, within the past year, the two states have ponied up tens of millions of dollars to try to re-animate the corpse.

Similar vestiges of the Rose Quarter project are buried in state law, Metro’s regional transportation plan and the City’s Central City plan, and will need to be excised before the project is completely dead.

If the region’s leaders are serious about terminating this project, or at least triggering a real conversation about how to rethink its scope to meet their stated goal of “restorative justice,” a good place to start would be rescinding the $129 million that Metro approved for project engineering and site acquisition just three months ago.

As Portland City Planning and Sustainability Commission member (and No More Freeways leader) Chris Smith puts it, the future of the Rose Quarter has to be decided in the context of how the region treats freeways as part of its stated goals of reducing greenhouse gas emissions.  ODOT—which is in denial about how more road capacity leads to more traffic and emissions—is still seeking to build a gigantic new freeway bridge across the Columbia River, which in large part is driving its case for the Rose Quarter project.

“I don’t think this will be really over until after the Columbia River Crossing is settled. The CRC discussion should be about the whole I-5 corridor from Battle Ground to Wilsonville (or Salem) and ideally would include the whole freeway network in the region. We need to establish the role of the freeway network in our climate change plans. We can’t do that effectively project-by-project.”

In some senses, the project may be more in danger of succumbing to fiscal reality than a local veto. The decline in driving prompted by the pandemic (and the induced recession) is sharply cutting state gas tax revenues (which are needed to pay for the project).  Meanwhile, the cost of the Rose Quarter freeway widening has ballooned from $450 million (promised to the 2017 Legislature) to as much as $795 million—and that’s before meeting any of the demands by the Albina Vision Trust for repairing the damage done by the freeway’s original construction. ODOT may simply not have the resources to build this project. Governor Kate Brown prefaced her comments about the Rose Quarter project by alluding to the need to re-evaluate the state’s entire investment plan in light of the new fiscal situation, saying:

. . . we all need to take a look at all of these transportation projects in light of the economic disruption that we’re seeing statewide and frankly, nationally.

Deja vu all over again

For long-time transportation advocates, like Bragdon and Mayor Sam Adams, ODOT’s duplicity on the Rose Quarter isn’t a surprise. It seems that with every decade’s megaproject, community groups must learn once again that the agency isn’t to be trusted. In 2009, the Bicycle Transportation Alliance (since morphed into the Street Trust) similarly dropped its participation in the Columbia River Crossing after being abused and misled by the Oregon Department of Transportation.  Here’s Elly Blue’s 2009 Bike Portland article, describing Michelle Poyourow’s resignation from that project’s steering committee:

“The BTA can no longer justify pouring our members’ precious resources into a project that is bad for the health and vitality of this region and now has a lousy bike and pedestrian facility to boot.”

Poyourow told BikePortland on Friday that she believes the BTA can effectively influence the project without sitting on — and thereby tacitly endorsing — the committee.

“The problem with the CRC is it’s just been deaf to community input,” she said, adding that bicycle advocates are not the only group to have had concerns about the bridge brushed aside. “They’re not listening. They’re determined to do what they’re going to do.”

It’s a decade later, with a different governor, different members of the Oregon Transportation Commission and a different set of local officials and community leaders, but ODOT’s modus operandi is just the same. Process steps like the Executive Steering Committee are just cynical and insincere efforts to divert, delay and obscure criticism and pave the way for a wider road. And this isn’t just a decade-old problem, it’s a half-century-long tradition. The original construction of I-5 through this neighborhood was riddled with exactly the same kind of tactics.

When the freeway was being planned, local officials objected to its impacts on neighborhoods and schools. The freeway bisected the attendance area for the Ockley Green Elementary School, meaning many students could no longer easily walk to school. Several local neighborhood streets were transformed into busy, high speed off ramps. In the planning process, local officials raised these concerns with the state highway department, and were offered assurances that “every effort” would be made to solve these problems. University of Oregon historian Henry Fackler describes the 1961 meeting convened by the city to address the effects of street closures:

At the meeting’s conclusion, state engineer Edwards assured those in attendance that “every attempt will be made to solve these problems.” The freeway opened to traffic in December 1963.  No changes were made to the route.

Six decades later, ODOT is still running the same scam. For the moment at least, the Albina Vision Trust isn’t falling for it.

Portland awards itself a participation trophy for climate

Portland is utterly failing to reduce greenhouse gas emissions from transportation, but not to worry: it’s ticking lots of boxes on its bureaucratic checklist.

The city walks away from its 2015 Climate Action Plan after an increase in greenhouse gases, but promises to do better (and more equitably) in the future.

Portland’s greenhouse gas emissions increased by 440,000 tons per year, instead of decreasing as called for in its 2015 plan.

Increased driving due to cheap gas has wiped out all the city’s climate progress in other sectors in the past five years.

We’re frequently told that when it comes to dealing with climate change, if our national government doesn’t step up (and it hasn’t under the current administration), not to worry, because the nation’s cities, and the mayors who lead them, are as green as can be.

To be sure, mayors have loudly proclaimed their commitments to (future) greenhouse gas reductions, and fealty to the Paris Climate Accords, but rhetoric and pledges are one thing, and lower rates of carbon emissions are another. While plans are nice, we really need to be focusing on the results that the plans are producing.

When it comes to Portland, one of the self-proclaimed leaders of North American climate change cities, the results are disappointing, and the explanations are, at best, disingenuous.

Portland was one of the first cities in the US to adopt an explicit greenhouse gas reduction goal, in 1993.  The city’s website boasts:

Portland is tackling climate change head on. We were the first US city to adopt a carbon reduction strategy in 1993, and our cutting-edge Climate Action Plan put us on a path to reducing emissions by 80% in 2050.

Noble and far sighted, to be sure, but a quarter of a century later, how is the city doing in actually, you know, reducing greenhouse gases?

The answer to that question is supposed to be spelled out in a progress report on the city’s adopted 2015 Climate Action Plan. The city’s Bureau of Planning and Sustainability last month published a “final” report card on the city’s efforts. But rather than being an honest report card, the document amounts to the bureaucratic equivalent of a third-grade participation trophy. The city congratulates itself for its efforts, but the true test of progress, a reduction measured in tons of carbon emissions, shows the plan has been a failure.

The city’s “Final Progress Report” almost completely glosses over the failure to cut emissions even modestly over the past five years, and the City has quickly moved on to a much more ambitious interim goal in its new Climate Emergency Declaration: cutting greenhouse gas emissions by 50 percent from 1990 levels (up from 40 percent).

Let’s get to the heart of the matter: The key element of the plan is reducing emissions.  Here’s the report’s summary of our progress:

The 2018 data also shows that carbon reductions have started to plateau and that current emissions trends are not sufficient to meet the needed reduction targets that need to be achieved. To achieve the goal of a 50% reduction in carbon emissions by 2030 as identified by climate science, local emissions must be reduced by an additional 31% in the next 10 years. This is a daunting task.

Actually, the report doesn’t present the actual emissions data; instead, it links to another report (the September 2019 report we wrote about here) that has the data, and that report includes data only through 2017.

A checklist isn’t a climate strategy

The bulk of the city’s self-congratulatory report card consists of a laundry list of 247 actions that were mentioned in the previous climate action plan, briefly rating each as either complete, on track or “facing obstacles.”  The actions include sweeping and important policies that would make a big difference (like Item 1H, carbon pricing, which is “facing obstacles”) and administrivia, like planning for actions with minimal benefits (Item 6B, “explore options for intelligent transportation systems,” which is complete).  Nothing in the report calculates or categorizes the impact of any of these individual actions on the region’s greenhouse gas reduction progress (or lack thereof).

Wow! 77% of 247 Actions are on track: But the one indicator that matters—carbon emissions—is going in the wrong direction. (Portland Climate Action Plan Report, 2020).

Put another way:  If you successfully implemented all or most of your checklist actions, and you’re not making progress on reducing GHG, then something is fundamentally wrong with your plan.

Plans have to be accountable not just for endless checklists of busy-work tasks, but for actually achieving measurable results. Ironically, the plan itself calls for more measurability, but as noted above, it simply failed to report the annual data showing that, by the CAP’s own metrics, it’s failing.

It’s the equivalent of a grade-school participation trophy:  The City and County laud themselves for implementing about three-quarters of their 250 checklist items, but gloss over the fact that greenhouse gases, particularly from transportation, are rising.
In 2013, the base year for statistics used in preparing the 2015 Climate Action Plan, Multnomah County’s total emissions were estimated at 7,260,000 tons.  According to the latest climate inventory (linked to, but not actually quoted in the progress report), the level of emissions in 2017 (the latest year for which data is available) was 7,702,000 tons. (We dig into the detail of these estimates below). Thus, Portland’s total GHG emissions were higher in 2017 than in 2013 according to the city’s own numbers; i.e., since Portland adopted the 2015 plan we’ve made zero (actually negative) progress.  The report spins this as “plateauing,” but we all know that unless we make steady progress in reducing GHG, the task becomes exponentially more difficult in the years ahead.  Nobody would describe an 8th grader still reading at a 3rd grade level five years later as “plateauing.”

Mission unaccomplished

It’s clear from the tone of the report card that the City is simply walking away from its 2015 plan–even though its substantive goals haven’t been accomplished.  The title of the report is “Final Progress Report.”  All of the references to the plan are in the past tense, for example:

The Climate Action Plan was an important roadmap over the last five years to help ensure the City and County continued their progress toward carbon reduction goals. The 2015 Climate Action Plan broke important new ground by including several important elements:

(Report, page 65).

City officials promise accountability, but if they simply walk away from plans after five years, without seriously acknowledging their failure or analyzing its causes, and begin writing a new plan de novo, as the city now proposes, there is no accountability. Moreover, the city’s “final report” disappears the few bits of serious number-crunching that were done in the 2015 plan, showing that we’d need to dramatically reduce vehicle miles of car travel in order to achieve our carbon goals.

The 2015 plan was explicit about what would be needed.  It laid out a carbon budget that did the math on what we would have to do to reach our goals. Specifically:

For example, by 2030 emissions from the building energy and transportation sector must be approximately 40 percent below 1990 levels (see Table 1). In 2050, residents must be able to meet all of their needs while using 62 percent less electricity and driving 64 percent fewer miles than they do today (see Table 2). (This also assumes a shift to cleaner electricity sources and more efficient vehicles.)

Climate Action Plan, 2015, page 19

In place of tangible, measurable indicators of progress toward our stated goal, the City’s climate emergency declaration offers vague exhortations about future process.

As they walk away from the 2015 plan, there’s as yet no plan to take its place.  There are vague statements about process going forward, descriptions of how the city might do something else, but few if any actual policies.  The City’s Climate Emergency Resolution directs BPS, by the fall of this year, to “co-convene” a process to “identify and implement strategies that will advance a shared vision.”

BE IT FURTHER RESOLVED, that no later than Fall 2020, the Bureau of Planning and Sustainability is directed to work closely with other City bureaus, Multnomah County, frontline communities, and youth-led organizations to establish and co-convene a new and ongoing climate justice initiative that will provide a framework for government and community to work together as equal partners to identify and implement strategies that will advance a shared vision for climate justice and action;

But this new process, the plan it produces, and its specifics are all in the future.  For now the city has simply closed the book on the 2015 plan, and awarded itself a participation trophy for having done so.

Portland’s 2015 Climate Action Plan has been an abject failure

It’s important to look in detail at the data on carbon emissions in Portland for the past decade. They show that the city’s climate change efforts, so far, have failed.  In the three years prior to the adoption of the Climate Action Plan (2010 through 2013), the city managed to reduce greenhouse gas emissions by 250,000 tons per year; in the four years between the plan’s baseline and the latest available data, emissions have increased by 110,000 tons per year. In 2013, Portland’s GHGs were 7.26 million tons; in 2017, they were 7.7 million tons, an annual increase of 1.5 percent per year, at a time the Climate Action Plan called for an annual reduction of 1.4 percent per year.
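The arithmetic behind these figures is straightforward. Here’s a minimal sketch of the calculation (variable names and rounding are ours; the tonnage figures come from the city’s own inventory, as quoted above):

```python
# Check the article's arithmetic from the city's inventory figures
# (tons of CO2-equivalent per year).
emissions_2013 = 7_260_000  # Climate Action Plan baseline year
emissions_2017 = 7_702_000  # latest year reported in the inventory

years = 2017 - 2013

# Average absolute change per year (the "110,000 tons per year" figure)
annual_change = (emissions_2017 - emissions_2013) / years
print(f"Average annual change: {annual_change:+,.0f} tons/year")

# Compound annual growth rate (the "1.5 percent per year" figure)
cagr = (emissions_2017 / emissions_2013) ** (1 / years) - 1
print(f"Annual growth rate: {cagr:+.1%}")
```

This works out to roughly +110,500 tons and +1.5 percent per year, matching the rounded figures in the text.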

In the most basic sense, more greenhouse gas emissions mean your plan isn’t working. The plan characterizes this whopping failure as a “plateauing.”

It also conceals the failure by constantly referring to a 1990 baseline, rather than looking at recent trends (i.e., the past two, five or ten years).  In essence, the plan takes credit for emission reductions that happened in the two decades before the 2015 plan was adopted (i.e., 1990 to 2010), and simply ignores the fact that Portland’s GHGs are now going in the wrong direction.

Plateauing is a tacit admission of failure for a plan that depends on large and consistent reductions in emissions.  But, to be clear, emissions haven’t plateaued:  Portland’s total greenhouse gas emissions as calculated by the city, have risen by 440,000 tons between the CAP baseline year (2013) and the latest year for which data are available (2017).

It’s telling that the report includes no chart showing the needed path of emission reductions between now and 2030 or 2050.  Such charts are a staple of climate plans, including the 2015 CAP, which laid out this roadmap for GHG reductions:

The path laid out in Portland’s 2015 Climate Action Plan

If they’d replicated this chart, with data showing actual progress from 2013 to 2017, and showing their new, much more aggressive goal, it would look like this.

The 2015 Climate Action Plan, the 2020 Climate Emergency Declaration, and Reality

On this chart, the blue line shows actual emissions (as reported by the city), the orange line the 2015 plan, and the green line the 2020 Climate Emergency Goal.  By 2017, in order to be on a path to achieving its 2030 goal, the City needed to reduce greenhouse gases to 25 percent below their 1990 levels; instead, as we’ve noted, greenhouse gases rose, and were only 15 percent below 1990 levels.  The fact that the blue line is above the orange line shows the city isn’t meeting its previous goal.

And it’s worth noting just how much more ambitious the new goal of cutting emissions to 50 percent of 1990 levels by 2030 (and to zero net emissions by 2050) is.  The much steeper slope of the green line (the new climate emergency goal) implies a vastly bigger lift than the previous (2015) plan, the orange line. Meeting the 2030 goal will require more than twice as much annual reduction (220,000 tons rather than 110,000 tons), each year, from now through 2030. The 2015 plan required that the City reduce its emissions by about 1.7 percent per year over 17 years to reach its goal of a 40 percent reduction; its new climate emergency declaration requires a 4.1 percent annual emissions reduction over the next decade to reach its higher 50 percent objective.
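The rate comparison can be checked with a bit of compound-rate arithmetic. This is a sketch under our own assumptions: the starting levels (2013 at roughly 80 percent of 1990 emissions, 2017 at roughly 85 percent) are inferred from the figures above, and the article’s 4.1 percent may reflect a slightly different start year or averaging convention:

```python
# Required annual emission cuts implied by each goal, expressed
# relative to 1990 levels. Starting levels are our inference from
# the article's figures, not official city numbers.

def required_annual_cut(start_level, target_level, years):
    """Compound annual reduction rate needed to move from start to target."""
    return 1 - (target_level / start_level) ** (1 / years)

# 2015 plan: from 2013 (~80% of 1990) to 40% below 1990 (i.e., 60%) by 2030
rate_2015 = required_annual_cut(0.80, 0.60, 2030 - 2013)
print(f"2015 plan: {rate_2015:.1%} per year")   # ~1.7%

# 2020 emergency goal: from 2017 (~85% of 1990) to 50% below 1990 by 2030
rate_2020 = required_annual_cut(0.85, 0.50, 2030 - 2017)
print(f"2020 goal: {rate_2020:.1%} per year")   # ~4.0%
```

Under these assumptions the old plan implied about 1.7 percent per year and the new goal roughly 4 percent per year, in line with the "more than twice as steep" comparison in the text.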

The city has raised the bar at exactly the time it’s shown that its current efforts simply aren’t working. And unlike the 2015 plan, there’s no detailed calculation of how we’ll achieve this vastly greater level of emissions reduction. Recall that the 2015 plan said we’d need to cut driving by nearly two-thirds to achieve a more modest goal over a longer period of time. If the city is serious about achieving this goal, as opposed to just posturing, it’s essential that it show how the goal can be reached. It hasn’t.

What’s needed:  A laser-like focus on reducing GHG from driving

The startling omission from the report is the fact that it’s been the increase in driving over the past five years that’s undercut our progress toward our stated climate change goals. The report neatly glosses over the fact that emissions, especially from transportation, are rising.  It presents one chart showing GHG in 2000 and in 2018 (the year of the latest GHG inventory) and omits data for individual years.

The City’s report card makes it look as if very little has happened: transportation emissions have gone up, but only a little. Leaving out the annual data conceals a much bleaker reality: in the past five years, Portland has recorded huge increases in greenhouse gas emissions. Here are the annual data from the independent, nationally normed estimates prepared as part of the DARTE GHG inventory, showing the Portland area’s greenhouse gas emissions from transportation. (These data are for the entire metropolitan area.)

As we’ve noted before at City Observatory, Portland made good progress until 2013, when increased driving due to cheaper fuel produced a surge in vehicle miles traveled and carbon emissions. Portland’s carbon emissions increased by 1,000 pounds per person annually between 2013 and 2018, more than wiping out all the progress made in reducing greenhouse gases in other sectors.

Portland won’t make progress in reducing greenhouse gases until it finds a way to reduce vehicle miles traveled.  And it will need to reduce them substantially. As noted above, when it wrote the 2015 plan, the city did the math to figure out how big a reduction in driving would be needed.  Then, when transportation emissions were lower (and the city’s climate goals less aggressive), the city’s calculations showed we’d need a 62 percent decline in VMT. Our backsliding, combined with a tougher goal, means we’ll need to reduce driving even more to achieve the objective laid out in the Climate Emergency Declaration.

Climate change is a serious existential threat to humanity. It’s good that the city is willing to acknowledge this, and to recognize that it has an important role to play in reducing greenhouse gases. This will be a challenging task, and it is not made easier by presenting reports that conceal fundamental failures to move forward, that hide key data and analysis telling us where we really are, and that avoid accountability for failing to make meaningful progress toward our stated goals.

 

Whitewashing the freeway widening

A so-called “peer review” panel was kept in the dark about critiques of the highway department’s flawed projections

The panel’s findings are the product of a hand-picked, spoon-fed group, asked by ODOT to address only a narrow and largely subsidiary set of questions and told to ignore fundamental issues.

As we’ve noted at City Observatory, the proposal to spend upwards of $800 million to widen a little over a mile and a half of urban freeway in Portland, Oregon is based on sketchy and misleading traffic projections, and on related air pollution and greenhouse gas analyses. The I-5 Rose Quarter project has been subject to a withering barrage of technical criticism, which the state transportation department has simply ignored.

In an attempt to buttress its environmental claims, the Oregon Department of Transportation hired six out-of-state consultants to form a “peer review” panel (PRP).  The panel was asked a very narrow set of questions, was guided in its work by a former Director of ODOT, and predictably produced a whitewash of the project’s environmental analysis.

ODOT’s “peer review” panel, hard at work.

In theory, the PRP undertook an environmental review, looking at air pollution, greenhouse gases and noise pollution. But because all these impacts depend on the volume of traffic and whether the project increases or decreases traffic, they are all subsidiary to the accuracy of the traffic modeling. And the panel apparently did absolutely nothing to validate the accuracy of these traffic projections.

What No More Freeways and other critics have shown is that ODOT’s traffic modeling was deeply flawed, and biased in favor of dramatically understating traffic impacts, and therefore understating environmental impacts. The conclusions in the project Environmental Assessment (EA) that the project will not increase pollution are the direct output of these flawed traffic projections.  For example, the EA traffic projections compare the build project to a fictitious No-Build scenario in which the 12-lane Columbia River Crossing was built in 2015 (it wasn’t).  Because the EA overstates the level of traffic in the baseline, it understates the increase in traffic and pollution due to the construction of the project.  The PRP wasn’t aware of, and didn’t address, this key issue.

Overlooking the project’s failure to model induced demand

In addition, the project’s increase in freeway capacity will result in induced demand.  In passing, the panel confirms the reality of induced demand, but then allows itself to be persuaded to ignore the issue by an undocumented assertion that it was somehow addressed in the traffic modeling.

One panelist suggested that reduced congestion could lead to shorter commute times, thereby encouraging people to move further from the city. An indirect effect could be induced growth. ODOT responded that the traffic analysis did look at the larger transportation network and found that these vehicle trips were redistributed across the Portland Metro area since there were similar volumes in the network, and therefore, analysts concluded that no substantive change in the volume of vehicles entering the network from outside the region would result from the Project. (PRP, Meeting Notes, p. 10-11)

The panel’s meetings were never opened to the public, nor did critics or the public have a chance to testify, or even present written materials to the peer reviewers.  It’s also apparent from reading the panel’s report that the scope of their analysis was so constrained as to prevent them from asking fundamental questions about the traffic modeling that directly drives estimates of air pollution and greenhouse gas emissions, and shapes estimates of noise pollution.

Question 2:  To what extent are the correct baseline conditions, model assumptions, input data, analysis, and conclusions reasonable and adequately documented?

Without a careful review and analysis of the comments made by No More Freeways, the peer review panel could not accurately answer this question. In its comments on the project’s Environmental Assessment, No More Freeways identified several flaws in the project’s assumptions and data. For example, the baseline conditions (2015 traffic modeling) assumed that the Columbia River Crossing had already been built. In fact, the CRC was never built, so that “baseline condition” is incorrect. As a result, on a fundamental issue of fact, the panel failed to identify or address a deficiency.
NMF also pointed out that the baseline projections for the I-5 Rose Quarter were inconsistent with other ODOT prepared I-5 traffic projections for the same area, including the projections prepared as a part of the agency’s value pricing work, and also the Columbia River Crossing.

A failure to consider tolling as NEPA requires

Another key issue is how tolling will affect travel. Oregon law mandates tolls, and ODOT claims it will implement them on this stretch of freeway in the time period modeled in the EA. The panel acknowledges in its report that it failed to address the potential impacts of tolling on traffic.

While the Panel understands that tolling/congestion pricing would affect the traffic, it is not within the purview of the Panel to question alternate traffic scenarios that were not included in the EA. This discussion should instead be brought directly to the OTC.

Nothing in the panel’s charge or written materials, or meeting summary explains why “it is not within the panel’s purview.” In fact, because tolling is already mandated by state law, it constitutes a “reasonably foreseeable future condition” which NEPA mandates be addressed in the environmental assessment.  ODOT’s own technical work shows that tolling would virtually eliminate congestion in the Rose Quarter without building additional capacity. The omission of this alternate scenario, as well as the fictitious baseline that assumed 2015 construction of the CRC, constitute material errors that directly bias the computation of air pollution, greenhouse gases and noise pollution.

No More Freeways assembled its own independent panel of transportation modelers and experts to review the traffic projections developed for the Environmental Assessment in 2019. There’s no evidence that ODOT’s peer review panel members received copies of the technical critique of the EA provided by No More Freeways, or of similar analyses provided by other commenters. (The group prepared a 12-page technical memorandum, with extensive references and supporting materials, which was submitted to ODOT on 1 April 2019—a PDF copy of the report is presented below.)
In fact, the peer review report is deficient for failing to provide a bibliography listing the materials that were provided to or reviewed by the panel.  Without knowing what information the panel reviewed, it’s impossible to rely on its findings.

The panel’s report includes only conclusory statements which are not supported by facts.  For example, the panel chooses to laud ODOT for exceeding federal requirements by addressing greenhouse gases, but fails to acknowledge that NEPA and US DOT NEPA guidance require that the EA show compliance with state and local regulations. NEPA requires that the environmental review demonstrate consistency with adopted State and local statutes and plans (40 C.F.R. § 1506.2(d)). Oregon Revised Statutes 468A.205(1) sets goals of reducing greenhouse gas emissions by 10 percent from 1990 levels by 2010, and by 75 percent from 1990 levels by 2050. Since Oregon’s legally adopted greenhouse gas emissions reductions are such a state regulation, the EA must address this issue. Nothing in the PRP report indicates any awareness by the panel of either the NEPA requirement or Oregon state law.

A sham process that buries the facts

The so-called peer review is simply a cynical effort to whitewash a deeply flawed environmental analysis. The panel’s meetings were held behind closed doors. The critics of the Rose Quarter project were not notified of the meetings of the peer review panel or provided any opportunity to brief the panel or present materials. (ODOT chose to invite people from one community group and from Multnomah County, but pointedly did not invite No More Freeways, which actually submitted detailed testimony, exhibits and data germane to the panel’s deliberations.)

The way ODOT conducted this process shows that the agency is dismissive of public comment on the project, a violation of the spirit, and perhaps the legal requirements, of NEPA.  It invited the public to comment on the project, and comments were offered in considerable technical detail, but it chose not to allow critics to attend or present information to the panel. The panel’s report was also not written by panel members, but by ODOT’s paid consultant, a former director of the agency. And ODOT has, by its own admission, lied about the carbon impact of its projects in the past.

One can only surmise that ODOT realized that if the panel were provided these facts, it would reach a different conclusion.

An open, honest review process would have chosen experts who could look in detail at the reasonableness of the project’s traffic modeling and core assumptions, and would have allowed project critics to present their technical reports, just as the panel was briefed by ODOT staff paid to support the project.

No one should have any illusions about how this report will be used.  ODOT will claim that its project has been independently reviewed and approved by these “experts.”  But the process is a sham.  The panel was asked the wrong questions, presented with a one-sided case, denied the opportunity to hear from independent critics, and had its report written by someone else.

No More Freeways, Technical Traffic Report, April 1, 2019

NMF_Tech_Memo

Memo to the Governor: Recovering from Covid-19

Some advice on economic policy for states looking to rebound from the pandemic

City Observatory’s Joe Cortright has served as Chair of the Oregon Governor’s Council of Economic Advisers under three Governors.  The Council met (virtually) with Oregon Governor Kate Brown on May 29, to discuss how the state’s economy could recover from the effects of the Covid-19 pandemic.

Governor Kate Brown (Center) and legislative leaders meet with Oregon Governor’s Council of Economic Advisers, 29 May 2020.

While Oregon’s situation likely differs from that of other states in some details, the same issues are likely to arise elsewhere. Here’s a synopsis of his advice to the Governor:

  • The state has a limited ability to influence the overall trajectory of the Oregon economy.  The most decisive policies will be those of the federal government, through fiscal and monetary policy. The state’s limited borrowing capacity means it can’t spend enough to offset an inadequate federal response, and much of any state-funded stimulus would flow out of Oregon. Oregon will mostly have to roll with the punches, but it can help soften the blow for the hardest hit and most vulnerable.
  • The state’s focus should initially be on the public services that support managing the pandemic.  The state’s most important role for the next six to twelve months is making sure that Covid-19 rates continue to decline, and that any flare-ups are dealt with aggressively.  The economic damage from too fast a re-opening is a greater risk than the economic damage from a somewhat too cautious re-opening.
  • Opening schools and daycare may be important to allowing all sectors of Oregon’s economy to reopen; these are within state control. Nationally, data suggest that pandemic related restrictions have disproportionately affected women’s employment; because the burden of household childcare falls more on women, opening schools and daycare is both an equity and an efficiency issue.
  • The recovery will be uneven across industries.  Some industries will rebound and may require little assistance or intervention; others will not recover completely, and we will lose some businesses permanently.  For example, we can expect health care to recover as we re-open.
  • Other industries face long-term or permanent structural changes.  For example, entertainment, travel, tourism and restaurants are likely to be smaller for some time.  We will likely see lots of business closures in restaurants in particular.  It may be impossible for the state to forestall these closures, but it can help the industry adapt (for example by allowing the use of sidewalks, parking spaces and parking lots for outdoor dining) and mitigating the effects on workers with unemployment insurance.
  • Unlike past recessions, I don’t think this event calls for a classic pump-priming public works response.  There will be lots of consumer demand as we re-open the state economy  and capital construction projects have such long lead times that they are unlikely to provide a boost anytime soon.  I would prioritize keeping public services (and associated payrolls) rather than going for one-off capital construction.  I would also argue that some of our old-school public works projects (wider freeways, a bigger airport terminal) may no longer make any sense in a post-Covid world.  To the extent we have the flexibility to do so, it may make sense to postpone big capital expenditures, use the funds to keep services and operations intact, and wait until we see how the post-Covid world is different before moving ahead  with big projects that might no longer be needed or useful.
  • If we are thinking about investments, I would recommend things that reflect the learning of the pandemic:  for example, equipping and training schools to provide distance learning, making sure all kids have access to high speed internet.  Also:  We should do a retrospective with OHA to figure out what we should invest in in order to be better prepared to recognize and act quickly at the time of the next pandemic; codify the lessons learned.

The full video of the meeting is available here.

City Beat: Why Portland is not like NYC when it comes to Covid

Once again, there’s a naive and unsubstantiated association between urbanism and the pandemic

Portland and Multnomah County have some of the lowest rates of Covid-19 cases of any large metro area

The big drivers of Covid-19 susceptibility are poverty, housing over-crowding and a lack of health care.

Like many states, Oregon is starting to re-open.  Governor Kate Brown has approved re-opening of 34 of the state’s counties, but the two most populous (Multnomah and Washington) are still under stay-at-home orders.  As the Oregonian’s Ted Sickinger reports, Multnomah County, home to the City of Portland, still hasn’t even submitted an application to re-open, and it may be weeks, rather than days before it does.

In his article, Sickinger likens Multnomah County to New York City:

Indeed, Multnomah County is Oregon’s New York City, uniquely important to its economic health, and uniquely vulnerable to a fast-spreading coronavirus outbreak, experts say. It has the most people. The most density. The most long-term care facilities. The most daycares and schools. The most vulnerable populations. The most people using shared spaces like offices and public transit.
The health care sector, where employees are perhaps most susceptible to infection, is the county’s No. 1 employer. And it is home to half the hospital beds in the state.

To be sure, Portland (and Multnomah County) are the state’s biggest city and biggest county, and consequently have more people, and more of just about everything else, than other cities and counties in Oregon. But that sidesteps the question of how serious the disease actually is there.  When you look at the data on the prevalence of Covid-19 in Oregon, and in other states, it’s apparent that Multnomah County is nothing like New York City.

Portland (Not New York City, not uniquely vulnerable to Covid-19).

We’ve been following the nationwide data on Covid very closely at City Observatory (for example, here), and also looking at the county-level data in Oregon.

Scaling up contact tracing in a large and diverse county like Multnomah may be a challenge, but in many ways this story creates or amplifies misperceptions about the pandemic in Oregon, and more generally about cities.

First, Marion County (home to the capital, Salem), not Multnomah County, is the worst hit county in the state.  Marion County has more than twice as many cases of Covid-19 per capita as Multnomah County.  Multnomah is not “uniquely vulnerable”—it ranks third statewide in cases per capita (behind Marion and Umatilla), and has a rate roughly comparable to Washington County’s (126 per 100,000 versus 110 per 100,000).

Second, Portland looks nothing like New York City when it comes to Covid.  That’s true in absolute terms: the New York City metro area’s rate of 2,300 cases per 100,000 population is many times higher than Multnomah County’s.  It’s also true in relative terms: the New York City metro rate is about 4.5 times higher than the US rate (less than 500 cases per 100,000 population), while Multnomah County is not a wide outlier from the statewide average—its prevalence is less than 50 percent higher than the statewide rate (126 per 100,000, vs. about 87 per 100,000).

Third, there’s virtually no evidence to support the notion that density is a significant contributor to Covid risk.  This is true globally: the densest cities, like Tokyo, Taipei, Singapore and Hong Kong, have some of the lowest rates of infection, far lower than Portland’s.  It’s also true in North America: Vancouver, BC and San Francisco have very low rates of infection (Vancouver’s is lower than any large US metro area’s).  And it’s true within big metro areas like New York: Covid is worse in the suburbs (Westchester and Rockland Counties), and within the city limits, rates are higher in lower-density neighborhoods.

What does explain Covid is not density, but poverty, inadequate access to medical care, and housing over-crowding.  Which is why the Navajo Nation, one of the least densely settled parts of the US, has an even higher rate of infection than New York City.

The Oregonian article also largely omits the much higher prevalence of Covid among the state’s Latino population, which is a big contributor to high rates in Marion, Polk, Washington, and Umatilla counties, as well as Multnomah.  According to OHA data, Latinos account for at least 30 percent of all Covid cases in Oregon, and have a rate of infection more than double the statewide average.  The Latino population is more vulnerable to the disease because Latinos have lower incomes, are more likely to live in crowded housing, have less access to health care, and are more likely to be “essential workers” who have to work and can’t telecommute.

Multnomah is the most populous county in the state, but the claim that it is “uniquely vulnerable” and the implication that density is a contributor isn’t correct.

And, for the record, when we look at large metro areas in the US (population 1 million or more, 53 of them) Portland has the second lowest rate of cases (behind only Sacramento) and the lowest rate of increase in new cases.  If you zero-in on the largest urban county in each of these 53 metro areas, only 2 have lower rates than Multnomah County:  Bexar (San Antonio) and Sacramento.

There are good reasons to be prudent in opening up after long weeks of Stay-at-home orders.  While the logistics of implementing test and trace may be tougher in a big urban area, just because of the number of people who need to be hired and trained and the (somewhat) greater cultural and linguistic diversity of the city, that’s no reason to repeat baseless claims that the pandemic is driven by density, or that a large metro area is somehow “uniquely” susceptible to the virus.

 

Is the pandemic worse in cities or suburbs?

Using county-level data, it depends on whose classification system you use

Counties may not be the right basis for diagnosing the contributors to Covid.

One of the oft-repeated claims in the pandemic is the notion that cities and density are significant contributors to the risk of being infected with the Covid-19 virus. Some of this, we have argued, is based on a deep-seated (and wrong-headed) prejudice associating cities with communicable diseases (the “teeming tenement” meme).  But beyond base beliefs, what does the data show?

Because in the United States the public health function is administered chiefly through counties, the nationwide data on the prevalence of Covid-19 cases and deaths is reported county-by-county.  And analysts (ourselves included) have used this county-level data to plot the prevalence of the disease in different parts of the country.  Our approach has been to aggregate data to the metropolitan level for the nation’s largest metro areas, based on the understanding that county units vary widely across the nation, and that the labor, commuting and economic markets formed by metro areas are probably a more robust basis for comparing the extent of the disease nationwide.

As we noted earlier, several very good analysts, including Bill Frey at the Brookings Institution, Jed Kolko of Indeed and Bill Bishop at the Daily Yonder, have used the county-level data to look at the comparative prevalence of the disease in cities as compared to suburbs.  The style of their analyses is similar:  They look at county-level data, and classify counties as either urban, or some flavor of suburban or exurban.  They then aggregate the data for all the similarly classified counties across the nation, and compute prevalence rates (reported cases or deaths divided by population).  Jed Kolko’s analysis provides a representative example.

Translating with the Rosetta Stone

The challenge in interpreting their results comes from the fact that each of the three analysts uses a different system for classifying counties based on “urbanness,” as we explored in our earlier commentary on this subject.  We concluded that there was no single right or ideal way to decide such a classification of counties, and noted that the three methods differ substantially in both the number of places (and people) classified as “urban” and also in which counties fall into which bins.

To illustrate the practical differences between the different definitions, we created a kind of Rosetta Stone (with data graciously provided by each of the three authors).  Let’s take a look at the city/suburb split using each of the three definitions. For this exercise, we use county-level data on Covid cases as of May 12.  As we usually do at City Observatory, our focus is on the 53 metropolitan areas in the US with a population of one million or more.

Covid-19 prevalence

The following table illustrates Covid-19 prevalence aggregated by each of the three classification systems.  The table shows the total population living in each county classification, the total number of reported cases in those counties, and computes the rate of cases per 100,000 population.

Urban/Suburban Population, Covid-19 Cases, and Rate per 100,000,
May 12, 2020, (Metropolitan Areas with 1 million or more population).

Population Cases Rate
Brookings
1-Urban Core 99,667,019 780,762 783
2-Mature Suburb 61,943,681 178,381 288
3-Emerging Suburb 14,308,473 27,413 192
4-Exurb 5,253,680 11,915 227
NonMetro 118,497 259 219
Total, Large Metros 181,291,350 998,730 551
Kolko
1-Urban 75,949,521 622,889 820
2-SuburbanHigh 69,104,197 274,372 397
3-SuburbanLow 36,237,632 101,469 280
Total, Large Metros 181,291,350 998,730 551
Yonder
1-Central counties 90,665,117 439,086 484
2-Suburban Counties 86,288,760 551,678 639
3-Exurban 4,218,976 7,707 183
4-Rural Adjacent to Large MSA 118,497 259 219
Total, Large Metros 181,291,350 998,730 551
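The rates in the table are straightforward to reproduce from the population and case counts. A minimal sketch (Python is our choice for illustration, not any of the analysts' actual tooling), using the Brookings urban-core row as a check:

```python
# Reproduce the prevalence rates in the table: reported cases per 100,000 residents.
def rate_per_100k(cases: int, population: int) -> float:
    return cases / population * 100_000

# Brookings "Urban Core" row from the table above
print(round(rate_per_100k(780_762, 99_667_019)))   # 783

# Total for all large metros
print(round(rate_per_100k(998_730, 181_291_350)))  # 551
```

The same calculation, applied to each classification's row totals, generates every rate shown in the tables here.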

As you can see, one gets a very different impression of whether city or suburb rates are higher depending on which classification one uses.  Overall, the prevalence rate across categories is 551 cases per 100,000.  But the Kolko and Brookings classifications imply that the average prevalence is about twice as high in cities as in suburbs, while the Yonder classification implies the reverse, that the rate is about 30 percent higher in suburbs than in cities.

Outside of New York

The New York City metropolitan area has been the epicenter of the pandemic, and has accounted for a disproportionate share of reported cases and deaths.  Because of that concentration of cases, and the region’s large (nearly 20 million) population, it could be that this single metro area skews the totals.  In addition, there are significant differences in how the three typologies classify counties in the New York metropolitan area.  For example, the Brookings and Kolko methods classify Westchester and Nassau counties as “urban,” while Yonder classifies those counties, and also Queens County, as suburban.

To filter out the effects of New York’s direct contribution to the pandemic, and to sidestep the disagreements about how to classify counties there, we construct a second table aggregating the data for the remaining 52 large metropolitan areas.  First, it’s worth noting that the aggregate rate of reported cases per 100,000 population drops by about a third, to about 354 cases per 100,000.

Urban/Suburban Population, Covid-19 Cases, and Rate per 100,000,
May 12, 2020, (Metropolitan Areas with 1 million or more population, excluding New York metro)

Population Cases Rate
Brookings
1-Urban Core 82,235,306 376,890 458
2-Mature Suburb 60,420,138 159,745 264
3-Emerging Suburb 14,042,865 25,657 183
4-Exurb 5,197,900 11,482 221
NonMetro 118,497 259 219
Total, Large Metros 162,014,706 574,033 354
Kolko
1-Urban 61,770,791 289,436 469
2-SuburbanHigh 65,521,033 199,860 305
3-SuburbanLow 34,722,882 84,737 244
Total, Large Metros 162,014,706 574,033 354
Yonder
1-Central counties 84,227,331 308,054 366
2-Suburban Counties 73,505,682 258,446 352
3-Exurban 4,163,196 7,274 175
4-Rural Adjacent to Large MSA 118,497 259 219
Total, Large Metros 162,014,706 574,033 354

Excluding New York shifts the apparent city/suburb balance in each of the three classification systems.  Brookings reports essentially the same relative gap between city and suburban rates (with city rates about double those in mature suburbs).  In the data that exclude the New York metro, Kolko still reports a higher prevalence of Covid in cities than in suburbs, although by a smaller margin (about 50 percent higher, rather than nearly double).  The Yonder tabulation now reports a higher rate of prevalence in urban counties than suburban ones, although by a relatively small margin (366 in cities vs. 352 in suburbs).

In the end, we’d recommend that anyone interested in understanding the geography of the pandemic closely read the work of Kolko, Frey and Bishop—they’re all smart analysts who paint vivid pictures with the data.  But as this analysis makes clear, drawing clear lines between urban and suburban counties is more of an art than a science, and it’s useful to understand the different definitions in order to make sense of the conclusions. Even with the best efforts, it is hard to use highly aggregated county data to make a clear case about whether the pandemic is much worse in cities than in suburbs.  The devil is very much in the definitional details, as the range of estimates presented here illustrates.

In addition, it may help to look at the issue from different perspectives. As our own analysis comparing city and suburban rates within metropolitan areas shows, it matters a great deal more whether you are in a metro area with a high rate of infections than whether you are in a city or suburb of any given metro area.  It’s also clearly the case that within metros, city and suburban rates are highly correlated, and that when the virus was spreading rapidly, the average suburb was only about six days behind its central city in the prevalence of the virus.  Plus, when more detailed data become available, such as zip code level tabulations of cases and deaths, we may be able to more carefully discern the differences between cities and their suburbs.

 

City Beat: No evidence that people are fleeing to the suburbs

Today’s misleading and incomplete take on cities:

There isn’t any evidence that people are fleeing cities for the suburbs; plus it wouldn’t help them avoid the virus if they did.

We’ve addressed the claim that the pandemic will lead to an exodus from cities before; today we’ll tackle another iteration.

The New York Times adds another “fleeing the city to avoid the virus” story.

On May 8, the Times published this story, saying “some” New Yorkers are looking to move out of the city due to the perceived hassle and risk of coping with the pandemic. The story consists of a handful of profiles of people moving (or thinking about moving), and a bit of “anecdata” from a moving company and a realtor.

The story omits several things:

People are always moving out of (and in to) New York City. Even in the boomiest of boom times, some people are leaving. That’s always been true, and always will be true.

It’s a popular journalistic trope to find one or two such people, relate their lamentations, and then pronounce based on these stories, that city X or neighborhood Y is “over.”

What the data show for New York is that it consistently experiences net domestic out-migration.  Migration statistics are compiled by the Census Bureau, which looks at people who lived in one US location in one year, and a different US location the next year.  What these data leave out are international in-migrants, who are particularly important for places like New York, which is an international gateway into the US.

New York’s population has grown over the last decade, with very slight declines in the past couple of years.  The big factor now limiting New York’s growth (and that of many other cities) is not a decline in the perceived value of urban living, but the limited supply and high cost of urban housing in the face of growing demand. The problem is a shortage of cities, not a disenchantment with urban living.

There’s one more thing to keep in mind regarding the pandemic:  Not only is there precious little evidence, here or globally, that density is a key factor in susceptibility to the pandemic, the New York Times’ own data show that in the New York metropolitan area, the prevalence of Covid-19 has actually been higher in the suburbs than in New York City.  Suburban Rockland and Westchester counties have rates of infection that are roughly 50 percent higher than in New York City:

Source: New York Times Covid-19 Data, May 11, 2020

A careful analysis by the Furman Center shows that within the city, the problem is worst in the lowest density neighborhoods. So even if you’re fearful of this pandemic (or future ones), fleeing to the suburbs won’t be any kind of “escape.”

Like many stories, this one simply reinforces the long-time anti-urban “teeming tenements” viewpoint, while providing little actual data on either the risks of the pandemic, or the actual patterns or causes of migration.

Also from City Observatory on Cities and Covid-19:

City Beat is City Observatory’s occasional feature pushing back on stories in the popular media that we think are mistakenly beating up on cities.

Don’t make “equity” the enemy of improving cities for people

Invoking concerns about equity to block providing more street space for people is destructive 

A cautionary tale from Chicago, with some keen insight from Greg Shill.

Let’s begin by stipulating one thing:  There’s much about American cities, and our transportation system, that is deeply inequitable to low income households and people of color. Our auto-dependent system makes those who can’t drive, or choose not to drive or own cars, at best second-class citizens.  The burden of crashes and pollution falls more heavily on low income households. Subsidizing car storage (“parking”) and urban freeways systematically benefits those with means, while the infrastructure of walking, cycling and transit is chronically under-funded and under-provided.  We have a lot of work to do to make this system fairer.

That said, invoking concerns about “equity” as a reason not to move ahead with aggressively re-allocating public space to favor pedestrians, cyclists, and those who just want to enjoy urban space, rather than speed through it in a vehicle, especially in the midst of a pandemic, is a tragic mistake.  Yet that’s exactly what we see happening.  Today’s example is drawn from Chicago, but we see versions of this particular story playing out in many places.

 

Chicago Sun-Times.

A little context:  Like many cities (Oakland is a leader, New York, Portland and many others are following), Chicago is looking for ways to create more public space so that people can travel on foot and by bicycle, as well as exercise during the pandemic. The need for social distancing is bumping up against the paltry amounts of space available for cyclists and pedestrians in many urban neighborhoods. So cities are creating “Slow Streets” and reallocating roadway for walking and cycling. Streetsblog Chicago has had great reporting on the city’s public space issues, the proposed changes and public reactions.

Chicago’s leading transportation advocacy group, the Active Transportation Alliance, expressed qualms.  Let’s turn the microphone over to Iowa law professor Greg Shill—author of the powerful essay “Should Law Subsidize Driving?”—who tweeted about this last week. (We contacted Shill and asked him to allow us to share a somewhat longer version of his remarks, along with an edited version of these tweets at City Observatory; what follows is reprinted with his permission).

Nearly six weeks ago, the Active Transportation Alliance, whose mission is to “make walking, biking, and transit safe” and foster “easy options for getting around Chicagoland,” took the position that it was going to suspend any advocacy for active transportation space at the very time that it’s most needed. Chicago is now one of the only cities in the country not to expand public space amid the social distancing imperatives of COVID-19. While Oakland, a much smaller city, has begun rolling out over 70 miles of slow streets, Chicago has actually shrunk the available space for transportation and recreation, for example by closing the bike and walk path along Lake Michigan. This makes 6’ social distancing virtually impossible in many public places.

It’s a scandal, if not a secret, that the poorest areas of Chicago have the least public space. It’s frustrating to see a reform-minded organization (one I belong to, by the way) advocate for upholding the status quo, but it’s perverse that they’re doing it in the name of equity. Especially in areas starved for public space, the public right of way is not allocated fairly or safely. Yet ATA has failed to challenge the mayor’s misguided policy that freezes it in place. With demand for active transportation surging, ATA should embrace its own mission of expanding public space now more than ever, to help Chicagoans to stay safe and healthy during the pandemic.

In this case, the Active Transportation Alliance is explaining why they don’t support expanding sidewalks for essential workers and others who need to get around during COVID. In the past, they’ve acknowledged insufficient focus on equity, but in trying to fix that they made it worse.

The ATA has a good record in general, but by deploying “equity” as a rationale for opposing the expansion of even a single Chicago sidewalk during the pandemic, they’re making a mockery of it. Chicago is now behind every other city. This does not advance equity one whit.

The gist of the ATA statement is “we won’t advocate for *recreational* space while people are dying,” which of course misses the point. It’s stunning to see a reformist organization argue streets in poor neighborhoods must stay dangerous because “equity.”

Even if one believes, against the evidence, that road diets and other streets enhancements cause gentrification because people bid up surrounding real estate, the best way to break that cycle is to advocate for improvements everywhere. No scarcity, no premium.

At least, that would be the principled way to promote equity. The unprincipled way is to insist that underserved, dangerous neighborhoods with dirty air stay that way because that will keep newcomers out. Cutting off your nose to spite your face is not equity.

Shill wasn’t alone in expressing dismay that equity concerns were being used to block action.  Writing at Streetsblog Chicago, Courtney Cobbs acknowledges that ATA has dialed back its criticism of the planned improvements, but finds their support at best lukewarm. Where, she asks, is the equity analysis of not moving forward to make more space for people:

The advocacy group then states that before residents of Chicago neighborhoods or suburbs launch campaigns for open streets in their communities, they need to ask themselves a long list of equity-related questions to help make sure the program wouldn’t have any unintended consequences or do harm, which is certainly an important goal. . . .

I don’t necessarily disagree with that line of questioning. But in that lengthy list, ATA raises lots of potential equity impacts of doing open streets. I wonder, has the advocacy group interrogated the possible social justice consequences of maintaining the status quo — that is, not creating more space for safe walking and biking during the pandemic — in the same way?

Too often, we make the perfect the enemy of the better by raising minor or illusory equity concerns about a proposed positive change, while turning a blind eye to the profound inequities embedded in the status quo–inequities that will be unmitigated, if not worsened, by the failure to take the first steps, if only small ones, in the right direction. As we’ve written at City Observatory, there are many profoundly inequitable aspects of our current transportation system that simply go unquestioned, and whose negative effects are vastly greater.

As another famous Chicagoan, former Mayor Rahm Emanuel, once said, “A crisis is a terrible thing to waste.” The need to address the misallocation of public space to vehicles is long overdue. Organizations like Chicago’s ATA have labored long and hard to turn public attention to this issue. Now, with Americans getting out of their cars and onto public streets and parks in unprecedented numbers, there is a unique opportunity for everyone to contemplate a fundamentally different (and ultimately, much more equitable) way of sharing the public realm. Those who wait for the perfect moment will spend their entire careers waiting.

 

Oregon DOT: The master of three-card monte

The highway department’s claims that it doesn’t have enough money for maintenance are a long-running con

You’ve all seen the classic street con three-card monte. All you have to do to double your money is follow one of three cards that the dealer is sliding around on the surface of the little table.  No matter how closely you track the cards, when the shuffling stops and the dealer asks you to pick one, you can be sure that it’s not the one you thought it would be. It’s a sucker bet, and you always lose.

Casino.org

But there’s another street hustler out there, who thinks the guy with a cardboard box and a handful of playing cards is a penny-ante player. If you really want to see how the three-card monte con works, there’s no one more masterful than the Oregon Department of Transportation.

The game they play is “find the money to fix potholes.” Everyone agrees we need to maintain the very expensive investment we’ve made in our roads and bridges (that, ostensibly, is why we pay the “user fees” that go into the state highway fund).  But no matter how much money goes into the fund—and the 2017 Legislature passed the biggest fee and tax hike in Oregon’s transportation history—the agency just seems to come up short when it comes to money for maintenance.

ODOT has been working this hustle for a long time (we’ll provide a bit of history in a moment). When it comes to finances, the agency is very adept at shuffling the cards—and the money—so that no matter where you look, the money is elsewhere.  The latest iteration of the three-card monte was dealt up by Oregon DOT director Kris Strickler, who announced that the agency doesn’t have sustainable funding to maintain the state’s roadways—in spite of the fact that it’s been less than three years since the Legislature passed a massive funding bill. Here’s the Oregonian’s coverage:

“Many will wonder how ODOT can face a shortfall of operating funding after the recent passage of the largest transportation investment package in the state’s history,” Kris Strickler, the agency’s director, said in a Wednesday email to employees, stakeholders and other groups, citing the 2017 Legislature’s historic $5.3 billion transportation bill. “The reality is that virtually all of the funding from HB 2017 and other recent transportation investment packages was directed by law to the transportation system rather than to cover the agency’s operating costs and maintenance.”

Now keep in mind: the agency got $5 billion in new revenue, but because it is marching ahead with several large construction projects (and borrowing billions to pay for them), there won’t be enough left to pay for repairs and agency operations.

But let’s be clear: That’s no accident. ODOT made decisions that created this problem.  It understated the costs of big construction projects, and financed them in a way that automatically puts the repair dollars at risk. It told the Legislature that the I-5 Rose Quarter freeway widening would cost $450 million (its price tag has since ballooned to nearly $800 million and could, according to the agency, easily top a billion dollars).  These overruns will be paid for with money that could have been used to repair roads. ODOT is also choosing to pay for these projects by issuing debt secured by its gas tax revenues, and the covenants it makes with bondholders mean that if gas tax revenues go down (and they’re in free-fall now, due to the pandemic), bond repayments get first priority, and all of the cuts fall on operations and maintenance.

And that’s not all. Not only has the agency chosen to paint itself into this budgetary corner, it also routinely takes money that could be used for maintenance and plows it into big capital projects.

As we’ve pointed out, ODOT has a long series of cost-overruns on its major projects.  When a project goes over budget, the agency has to find the money from somewhere—and it always does. A good part of the dark arts of transportation finance consist of figuring out ways to take money in one pot and as the saying goes “change its color” so that it can be placed in a different pot. Two of its favorite tactics are “unanticipated revenue” and “savings.”

Here’s how they work.  Sometimes the agency will budget money for a project, and it will cost less than expected (or be scaled back).  Those moneys then become “savings,” free to be reallocated for other purposes.  The “unanticipated revenue” is even more obscure.  ODOT can adopt a slightly more pessimistic revenue outlook at some point (by assuming, for example, that Congress lets the federal highway trust fund go broke); when that doesn’t happen, the revenue outlook is re-adjusted upward accordingly and voila—there’s “unexpected revenue.” But notice that because the agency is responsible for estimating revenues and costs, it can easily choose to overestimate costs (to produce “savings”) or under-estimate revenues (to produce “unexpected revenue”).

ODOT is currently employing both of these strategies to magically fund the widening of I-205—a project that the Legislature did not provide funding for. Here’s a slide from ODOT’s December, 2018 briefing on the project:

Most of these funds (regional flexible funds, “reallocated savings,” “unanticipated federal revenue” and especially the “operation program funds”) could otherwise be used to pay for ODOT operations and maintenance—but instead they’re being used here to fund a capital construction project.

That’s not an isolated example: the agency uses lots of funds that can be applied to potholes and repairs to finance new construction.  In the case of the Columbia River Crossing, millions for project planning came from federal “Interstate Maintenance Discretionary (IMD)” funds that can be used for the repair, repaving and upkeep of Interstate freeways throughout the state.

Here’s the thing: nothing stops ODOT from using “unanticipated revenue” or “savings” to pay for repairs.  Even when the savings are in programs that are nominally dedicated to capital construction, the agency could use the savings to offset other capital construction costs paid from its more flexible funds, and shift those second-hand savings into repairs.  But it doesn’t do so.  Like sleight of hand in three-card monte, the budgetary legerdemain always works in the dealer’s favor.

Nothing new:  A long-running con

Anyone who has followed ODOT for any period of time knows that these tactics are dog-eared pages in its playbook.  Consider the two biggest projects the agency has pushed since 2000, the $360 million, five-mile long re-routing of US 20 between Corvallis and Newport, and the failed effort to build the $3 billion Columbia River Crossing. In both cases the agency used or proposed financial sleight-of-hand to come up with the needed money.

Pioneer Mountain-Eddyville US 20

Originally, the Pioneer Mountain-Eddyville project was supposed to cost about $100 million, but through a prolonged series of ODOT blunders, it ended up costing about $360 million.  The agency found the money to pay for the cost-overruns from a combination of “savings” and “unanticipated revenue.”  Here’s how they filled the last bit of the shortfall, according to ODOT’s own documents:

December 5, 2012 Memorandum from Matt Garrett to the Oregon Transportation Commission
(US_20_PME_12_19_2012_OTC.pdf)

Most of the needed funds came from “unanticipated MAP-21 funds,” with the balance coming from OTIA III modernization “savings”–OTIA III being a program to repair highway bridges.  So when it wants to, ODOT manages to find money which could otherwise be used for maintenance and use it to cover the costs of capital construction.

The Columbia River Crossing

Consider the agency’s last proposed megaproject. In 2013, it sought legislative approval to go ahead with the $3 billion project, even though the state of Washington had pulled out—and taken its money and responsibility for covering half of all project costs with it.  ODOT came to the Legislature asking for approval to incur debt for the project, and assured the Legislature that it had on hand all the money it needed to pay for Oregon’s share–initially $450 million, but with liability for vastly more–without the need to raise taxes.  ODOT looked into its budget and found “unanticipated revenue,” as reported by the Associated Press in its article “Bill proposes bridge debt but no funding source.”

SALEM, Ore. (AP) — A bill approving a new Interstate 5 bridge over the Columbia River would authorize $450 million in bonds to pay for Oregon’s share, but it doesn’t say how the state would pay off the debt over the coming decades.  Paying down the bridge debt would cost roughly $30 million per year.  In the short term, the Oregon Department of Transportation can use unanticipated federal transportation dollars to cover the debt, lawmakers said. But after that money runs out in two to three years, the state would have to approve a new revenue source — such as a gas tax or vehicle fees — or reduce the amount of money available for other road projects. [Emphasis added].

Legislative leaders took ODOT’s word that there was money.  House Speaker Tina Kotek’s spokesperson repeated ODOT assurances that the capital construction costs for the project could be paid out of ODOT’s existing revenue.  Here’s Willamette Week, quoting Jared Mason-Gere of the speaker’s office in 2013:

“Speaker Kotek believes the committee structure this session allowed for a full and open consideration of the I-5 Bridge Replacement Project, while still moving swiftly enough to move the project forward. The committee considered the same elements of the bill the Ways and Means Committee would have, and worked closely with the Legislative Fiscal Office. The funding already exists in an agency budget. LFO has verified that the funds are available in the ODOT budget, and that they will not impact other existing projects.” [Emphasis added].

So, when the agency wants to take on a huge mega-project, future budget considerations—even on the order of hundreds of millions of dollars which would directly reduce the agency’s ability to pay for future operations and maintenance—are no obstacle. Plus, in the case of the CRC (as with today’s Rose Quarter freeway widening and the Pioneer Mountain-Eddyville project) the revenue hit isn’t limited to the projected cost, but also includes a massive and undisclosed liability for cost overruns, which then directly impact operations and maintenance. The lack of budgetary flexibility to pay for operations and repairs today is a direct result of choices like these in the past to use or commit flexible funds to mega-projects.

Sell potholes, spend on megaprojects

If you’re a highway engineer, nothing is more boring than fixing potholes, and nothing is more glamorous than a giant new bridge or highway. While department leaders consistently swear that they’re committed to maintaining the system, whenever they get a chance, they either plan giant projects for which they have no money, or low-ball the estimates on capital construction, knowing they’ll use their fiscal magic down the road.

The real giant, unfunded liability for ODOT is big new construction projects.  The cost of the Rose Quarter project, which ODOT confidently told the Legislature would be just $450 million, has already ballooned to nearly $800 million, and could exceed a billion dollars if promised buildable covers are included.  ODOT has said nothing about how these vast overruns would be paid for.

Meanwhile, ODOT is moving full speed ahead with plans for a revived Columbia River Crossing (now re-branded as “I-5 Bridge Replacement”). In its last iteration, the price tag for this project was north of $3 billion, and at this point ODOT has no money in its budget for its share of these costs.  But it has allocated $9 million in planning funds to revive the project.

Paying lip service to maintenance, paying interest on debt

ODOT officials talk a good game when it comes to the importance of maintenance.  And while the agency now apparently blames the Legislature for telling it to spend money on capital construction rather than fixing potholes, its sales pitch consisted of telling the public how much it cared about maintenance:

Here’s the agency’s current deputy director, Travis Brouwer, speaking to OPB in April 2017, as the Legislature was considering a giant road finance bill:

Of course, patching potholes are far from the only thing ODOT has to spend money on. So how does the agency decide what to prioritize? According to ODOT assistant director Travis Brouwer, basic maintenance and preservation are a top priority.

“Oregonians have invested billions of dollars in the transportation system over generations and we need to keep that system in good working order,” he said. “Generally, we prioritize the basic fixing the system above the expansion of that system.”

Back in 2017, the Oregon Department of Transportation put out a two-page “Fact Sheet” on the new transportation legislation.  Its first paragraph stressed that most of ODOT’s money would be for maintaining the existing system:

“Generally” meaning: unless the agency decides to build shiny new projects—which it does.  Make no mistake:  When it comes to one of the agency’s pet mega-projects, there’s always money lying around, and if there isn’t, it will pretend there is and charge full speed ahead, maxing out the credit cards to generate the cash.

In 2000, the agency was essentially debt-free.  Since then, the share of State Highway Fund revenues spent on debt service has gone from 1 percent to more than 25 percent—and an increasing share of that debt burden is to pay off the costs of mega-projects. And, by definition, these debt obligations have first call on ODOT’s revenue, so the very act of debt-financing capital construction is a direct cause for the shortfall in funding for maintenance.

So, for example, look at recent decline in state gas tax revenues because of the decline in driving during the Covid-19 pandemic.  The way ODOT has chosen to structure its finances, the budget shortfall lands disproportionately on operations and maintenance.  The debt-cycle neatly provides a mechanism to implement a “bait and switch” strategy.

In three-card monte, no matter which card you pick, you lose. In ODOT’s version of the game, no matter where you look, the agency will never have enough money for maintenance, but it will always be plowing money into big construction projects, and planning for even more.

What is urban?

Shape of the urban/suburban divide:  Views differ

There’s a lot of debate about the relative merits and performance of cities and suburbs. You’ll read that the migration to cities has come to a halt, that suburbs are growing faster than cities or that cities have a higher rate of Covid-19 infections than suburbs.

All those statements hinge on being able to draw neat, clearly understandable lines between what constitutes a city and a suburb. As is so often the case, the conceptual differences may be clear, but drawing lines, in practice, is fraught with confusion and complexity. And this matters because where one draws these lines has a big impact on what kind of numerical answers one gets.

Today we’ll take a look at three widely used urban/suburban definitions, developed by three different researchers–all of whom we greatly respect and admire. What we find is that, as the saying goes, they’re all over the map. Consequently, a degree of care and caution is needed in interpreting data that make “urban vs. suburban” claims based on county data.

Geography and data availability necessarily straight-jacket any analyst looking to craft a solid picture contrasting cities and suburbs. Cities are political subdivisions (municipalities, or “places” in Census parlance). But the geographies that compose cities are defined by local law and custom, and vary widely from state to state and around the country. The principal city is nearly all of the metropolitan area in Jacksonville or Phoenix, but is just a fragment of a much larger metropolitan area in Atlanta or Miami.

While one can get a much finer look at geography by cobbling together customized and uniformly defined clusters of census tracts, these data are available only in the form of five-year pooled estimates (the latest being from the 2014-2018 versions of the American Community Survey).  For those interested in the most recent data, that’s frustratingly old. Two researchers at Harvard’s Joint Center for Housing Studies devoted an entire paper to the question, but their analysis focuses on slicing the urban/suburban divide within counties, so we don’t explore it further here.

The most convenient and timely data are the county-level estimates published by the Census Bureau.  It annually estimates the population of every county in the US, along with the components of population change (births, deaths, migration), and estimates the age structure as well.  Because the sample size is larger for counties than for census tracts, the Census Bureau also produces annual tabulations of the American Community Survey for counties. And of interest today, Covid-19 data are generally reported on a county level by public health authorities.

As a result of the convenience of data availability, and the fact that the entire nation is divided into about 3,200 counties (or county-like units), analysts routinely tap census data to describe geographic trends.

Within the past few weeks, three terrific analysts, Bill Frey of the Brookings Institution, Jed Kolko of Indeed and Bill Bishop of the Daily Yonder, have been using these county level data to look at the relative prevalence of Covid-19 across the nation’s geography–from central cities and suburbs of large metro areas, to smaller metros, to rural areas.

Conceptual differences

There are a variety of ways to characterize the “urban-ness” of a place.  One is centrality:  is a county at the center of a metropolitan region?  Another is density:  how many people per square mile live in a county?  One can also look at how developed an area is:  is a county mostly developed to some minimum level of density, or is much of it relatively lightly developed or undeveloped?  The three definitions presented here lean on different concepts:  Brookings uses a measure of urban development, Kolko looks at weighted population density and Yonder emphasizes centrality.  These different underlying concepts lead to differing categorizations of counties as urban or suburban.
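Kolko’s weighted (population-weighted) density is worth making concrete: rather than dividing total population by total land area, it averages each tract’s density weighted by the tract’s share of population, so it reflects the density the typical resident actually experiences. Here is a minimal sketch of the calculation; the tract figures are illustrative placeholders, not actual Census data:

```python
# Population-weighted density: average tract density, weighted by each
# tract's share of total population. Tract figures below are illustrative
# placeholders, not actual Census data.
tracts = [
    {"pop": 5000, "sq_miles": 0.5},   # dense urban tract: 10,000 people/sq mi
    {"pop": 3000, "sq_miles": 3.0},   # suburban tract: 1,000 people/sq mi
    {"pop": 1000, "sq_miles": 10.0},  # rural tract: 100 people/sq mi
]

total_pop = sum(t["pop"] for t in tracts)

# Simple (areal) density: total people divided by total land area
simple_density = total_pop / sum(t["sq_miles"] for t in tracts)

# Weighted density: each tract's density, weighted by its population share
weighted_density = sum(
    (t["pop"] / t["sq_miles"]) * (t["pop"] / total_pop) for t in tracts
)

print(round(simple_density))    # 667 people per square mile
print(round(weighted_density))  # 5900 -- what the typical resident experiences
```

The gap between the two numbers is the point: a county with a dense core and a big rural hinterland looks sparse on simple density but dense on weighted density, which is why the two measures can sort the same county into different categories.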

Our focus–as it usually is at City Observatory–is on the nation’s 53 largest metropolitan areas, all those with a million or more population.  Although their taxonomies cover the whole gamut of metro and rural areas, we’re most interested here in where each of these researchers has used county boundaries to partition these large metro areas into “urban” and “suburban” components.

In all, about 181 million Americans live in one of the 53 largest US metro areas.  How many of them live in urban as opposed to suburban locations?  Each of these methods proposes to answer that question, but they come up with answers that are quantitatively and compositionally quite different.  Of the 181 million people living in these large metro areas in 2018, depending on the definition one chooses, the number living in “urban” counties is about 76 million (Kolko), 91 million (Yonder) or 100 million (Brookings).

Classification of Counties in Metro Areas with Population of 1,000,000 or more.

Brookings                         Counties     Population
1-Urban Core                            96     99,667,019
2-Mature Suburb                        111     61,943,681
3-Emerging Suburb                       77     14,308,473
4-Exurb                                121      5,253,680
NonMetro                                 4        118,497
Grand Total                            409    181,291,350

Kolko                             Counties     Population
1-Urban                                 55     75,949,521
2-SuburbanHigh                          91     69,104,197
3-SuburbanLow                          263     36,237,632
Grand Total                            409    181,291,350

Yonder                            Counties     Population
1-Central Counties                      59     90,665,117
2-Suburban Counties                    238     86,288,760
3-Exurban                              108      4,218,976
4-Rural Adjacent to Large MSA            4        118,497
Grand Total                            409    181,291,350

That might not seem like such a large discrepancy, until you find that each of the definitions designates a distinctly different set of counties as urban.  Overall, the three methodologies agree that 21 counties in the 53 largest metropolitan areas are urban.  These common counties account for a little more than half of the people counted as living in “urban” areas by Kolko, and less than half of the people counted as urban by Brookings and Yonder.  So these are largely differences in kind, rather than degree.

Comparing different definitions

One way to illustrate the differences among these three definitions is via a Venn diagram showing where these definitions coincide and where they differ.  The red (upper left) circle shows Kolko; the yellow circle on the right shows Yonder, and Brookings is the pale green circle on the bottom.

Urban county definitions, Large metro areas, Total population in millions (Number of counties).

The area where the three circles overlap shows the 21 counties that all three rubrics agree are “urban;” these contain about 41.3 million people.  Conversely, if you add together all of the counties that at least one of the three methods categorizes as urban, you find that 62 counties, with a total population of 132.8 million, are “urban.”
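The overlaps depicted in the Venn diagram reduce to ordinary set operations over county identifiers. A small sketch of how such a comparison can be computed; the FIPS codes and set memberships below are illustrative stand-ins, not the actual classifications:

```python
# Each definition yields a set of county FIPS codes it labels "urban".
# These sets are illustrative stand-ins, not the actual classifications.
kolko     = {"17031", "36061", "36047", "06075"}
yonder    = {"17031", "36061", "53033", "06075"}
brookings = {"17031", "36061", "36047", "53033", "06075", "36059"}

# Counties all three methods agree are urban (center of the Venn diagram)
all_three = kolko & yonder & brookings

# Counties at least one method calls urban (union of all three circles)
any_method = kolko | yonder | brookings

# Pairwise-only agreement, e.g. Brookings and Kolko but not Yonder
brookings_kolko_only = (brookings & kolko) - yonder

print(sorted(all_three))             # ['06075', '17031', '36061']
print(len(any_method))               # 6
print(sorted(brookings_kolko_only))  # ['36047']
```

Joining population figures to each region of the diagram is then just a lookup keyed on FIPS code; the same handful of intersections and differences produces every number reported above.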

Remarkably, when classifying counties as urban or suburban in each of the 53 most populous metropolitan areas, the three methods are in complete agreement on what constitute the “urban” counties in only four metro areas:  Cleveland, Milwaukee, Pittsburgh and San Jose.

Two of the three methods imply that almost a third of large US metro areas have no counties that qualify as “urban.”  The Brookings and Kolko methods find that 16 metro areas, including Austin, Charlotte, Cincinnati, Kansas City, Memphis, Nashville, Oklahoma City, Phoenix, Raleigh and San Antonio, have no counties that are “urban”.  Kolko characterizes the most populous county in each of these metro areas as “high density suburban,” while Brookings generally classifies them as “mature suburbs.”

It’s interesting to see what two of the three sources say is “urban” that the third leaves out:

Brookings and Kolko agree on 26 counties (with 21.8 million people) that Yonder leaves out of its definition.  These are mostly populous counties in large Eastern metros, like Queens and Nassau Counties in New York.

Brookings and Yonder agree on 13 counties with 16.9 million people, that don’t meet Kolko’s criteria, chiefly because they’re not dense enough.

Yonder and Kolko agree on 7 counties with 11.3 million people.  The difference here is that the Brookings methodology says that five large MSAs in the West (Las Vegas, Portland, Sacramento, Seattle and San Diego) have no urban counties, classifying their densest, most central county as "mature suburban." In contrast, both Kolko and Yonder identify the central county in each of these metros as urban.

Each method also makes its own unique choices: counties it classifies as urban that neither of the other two sources does.  Kolko had just one such county.  Brookings had 35 counties that it alone designated as urban (mostly the second and third most populous counties in a metro area).  Yonder had 19 counties designated urban only by its methodology, reflecting its rule of designating the most populous county in each metro as "central."

A Rosetta Stone

For our own use, and with the hope that it may be of some utility to other researchers, we've crafted a kind of Rosetta Stone illustrating these three different definitions of urban and suburban counties.  We show, side by side, how each of the three methods classifies each county in the nation's 53 most populous metropolitan areas.

Rosetta_County.xlsx

This Excel file identifies the name and FIPS Code of each metro area, the name and FIPS code of each constituent county in that area, and the population of that county in 2018. In three separate columns, we show how the county is classified by Brookings, Kolko and Yonder.  We’ve also appended Kolko’s estimates of the tract-weighted population density of these counties (a key metric in his classification system).

At a minimum, if you're interested to see how your metropolitan area is parsed among these different definitions, you can use this as a reference.  We also hope it lets researchers more easily decode and compare statistics compiled under each of these definitions.

There is no “right” definition

The purpose of this comparison is not to prove any one definition is superior to the others, but rather to illustrate the complexity and ambiguity of using county-level data to make strong statements about what constitutes "urban" and "suburban."  As a practical matter, the lumpiness and varying size of county units makes them a problematic choice for drawing these boundaries. Is none of King County, Washington (which includes all of Seattle) urban?  Should single-county metropolitan areas (San Diego and Las Vegas) be classified as core or suburban?  These are questions about which reasonable people can disagree, but in the interests of transparency, we offer up our Rosetta Stone so that people can use these data with a clear understanding of the difficult choices their authors made.

References

Jed Kolko, How suburban are big US cities?, FiveThirtyEight.com, May 2015.

William Frey, Even before coronavirus, census shows US cities' growth stagnating, Brookings Institution.

Bill Bishop, Major city growth slows, but that doesn't mean a rural rebound, The Daily Yonder.

Acknowledgements:  City Observatory is grateful to Jed Kolko, Bill Frey and Bill Bishop for graciously sharing their worksheets showing their classification systems. City Observatory is responsible for any errors in this analysis.

 

 

The Covid Corridor: The pandemic is worst in the NE Corridor

The incidence of reported Covid-19 cases, and their daily growth, is higher in the metros of the NE corridor than in the rest of the country.

The Northeast Corridor has all four of the cities with the highest rate of newly reported cases.

New Cases per 100,000 population, April 17

A new metric:  New cases per 100,000 population

For the past several weeks, we’ve been tracking the cumulative number of reported Covid-19 cases per 100,000 population in each of the nation’s 53 largest metropolitan areas.  Today, we tweak that measure just a bit to focus on the number of cases reported in the past day in each of these metro areas.  This daily increase measure signals what’s happening right now.  It can be a bit noisier due to day-to-day variations in reporting lags across metro areas, but the cumulative measure is now increasingly telling us about past reported cases, rather than what’s going on right now.
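The tweaked metric is simple to compute: subtract yesterday's cumulative case count from today's, then scale per 100,000 residents. A minimal sketch, with made-up figures rather than actual New York Times data:

```python
def new_cases_per_100k(cum_today, cum_yesterday, population):
    """Daily increase in reported cases, scaled per 100,000 residents."""
    return (cum_today - cum_yesterday) / population * 100_000

# Toy figures: a metro of 2 million reporting 400 new cases in one day.
rate = new_cases_per_100k(10_400, 10_000, 2_000_000)
print(rate)  # 20.0
```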

Here’s a bar chart showing the number of new cases per 100,000 on April 17 for each metro area with a million or more population.  New York had about 50 new cases per 100,000 population, about ten times the level reported in the median large metropolitan area (5 new cases). (Northeast Corridor metros are highlighted in red).

What's striking about this chart is that the four metros with the highest number of new cases–New York, Boston, Providence and Philadelphia–are all in the Northeast Corridor.  The corridor also accounts for six of the top eight metros on this measure (adding Hartford and Washington).  Each of these metro areas is reporting new cases per capita at a rate 2-3 times higher (and in Boston's case, 6 times higher) than the median large metro area in the US.

In contrast, some cities that had experienced an earlier surge in cases have seen a significant reduction in reported new cases.  Seattle had just 3.7 new cases per 100,000 on April 17, well below the median for large metro areas.

These data signal a wide disparity among metropolitan areas in the current spread of the coronavirus.  Some metropolitan areas are seeing very low levels of growth (seven metro areas had 2 or fewer new reported cases per 100,000 on April 17).  Meanwhile, the pandemic continues to spread at a much higher rate in other parts of the country.  While the New York metropolitan area has (appropriately) drawn attention as the epicenter of the pandemic, the problem actually disproportionately affects the entire NE Corridor, from Washington to Boston.

 

Regional Pandemic Hotspots: NE Corridor and Great Lakes

Originally published April 12; Revised and Corrected April 14

The Covid-19 pandemic is hitting two regions in the US much harder than others:  The NE Corridor and the Great Lakes

Metro areas in these regions have the highest rates of reported cases per capita, and the highest levels of growth

In contrast, incidence and growth rates are subdued in the South and West.

At City Observatory, we've been tracking the spread of the Covid-19 pandemic among the nation's 53 largest metropolitan areas since the middle of March.  Our emphasis has been on identifying the incidence (reported cases per 100,000 population) and the growth (the daily growth rate averaged over the previous seven days), as a way of establishing which cities have been hardest hit, and which ones seem to be making progress in flattening the curve. Today, we step back and look at the regional geography of the incidence and growth of the pandemic:  what regional patterns do we observe in where the virus is spreading most rapidly?  As always, the usual caveats about the ambiguity of reported case data apply:  testing isn't done randomly; it is constrained by testing capacity and conditioned by medical necessity.  As a result, reported case data don't necessarily accurately reflect the actual number of cases in a city or region.
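The growth measure (the daily growth rate averaged over the previous seven days) can be sketched from a cumulative case series. The series below is invented for illustration:

```python
def avg_daily_growth(cumulative, days=7):
    """Mean day-over-day fractional growth across the last `days` intervals
    of a cumulative case series."""
    window = cumulative[-(days + 1):]
    rates = [(b - a) / a for a, b in zip(window, window[1:])]
    return sum(rates) / len(rates)

# A toy series growing roughly 10 percent per day.
series = [100, 110, 121, 133, 146, 161, 177, 195]
print(round(avg_daily_growth(series), 3))  # 0.1
```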

The national pattern

Our core measure of the incidence of Covid-19 is the number of reported cases per 100,000.  We’ve mapped them for April 13 here:

 

Shaded areas depict each of the nation’s 53 most populous metropolitan areas (all those with a million or more residents).  Darker red shading indicates metro areas with the highest rates of reported cases per capita; the numbers superimposed on the shaded areas are the number of cases per 100,000 reported through April 13, 2020, according to the county level data compiled by The New York Times.  As this map makes clear, the pandemic has affected the entire nation, but the incidence of the virus seems much higher in some places than others.

There are a couple of somewhat isolated hotspots.  Seattle was the site of the first significant outbreak, and still ranks among the top twelve in cases per capita.  New Orleans has the second most serious outbreak of the disease (with about 1,048 cases per 100,000).  Interestingly, however, nearby metros are far less affected.  Portland (less than 200 miles from Seattle) has a rate of reported cases (41) that is in the bottom quartile of large metros; similarly Houston (350 miles from New Orleans) has a rate of reported cases (75) that is slightly below the median.

The Northeast Corridor

As The New York Times has said, New York City is the epicenter of the pandemic, and a quick look at nearby metro areas suggests that proximity to New York City may be a factor explaining the spread of the virus. As this map focusing on the Northeastern United States makes clear, all of the metropolitan areas in the Northeast Corridor (i.e., between Washington and Boston) have above-median levels of reported Covid-19 cases per capita.

 

 

Six Northeast Corridor metros rank in the top eleven for cases per capita:  New York is first, Boston is fourth, Philadelphia is fifth, Hartford is seventh, Washington is ninth, and Providence is eleventh. The following chart shows the top 12 metro areas for incidence of reported Covid-19 cases per capita on April 13.

What is particularly disconcerting about the high rates of reported cases per capita in the Northeast Corridor is that several of these cities are also seeing their rate of increase in the number of reported cases continuing to be higher than the national average.  Among all large metro areas, we estimate that the daily rate of increase for the week ending April 13 averaged about 6.9 percent.  While New York has lowered its rate of increase to this level, other NE Corridor metros are growing faster:  Boston and Philadelphia (10 percent), Providence (11 percent), Washington (12 percent) and Hartford (14 percent).  The combination of faster growth in reported cases and a higher incidence per capita puts them in the upper right hand quadrant of our classification of metros according to the incidence and growth of the virus.

 

The Great Lakes Region:  Detroit, Indianapolis, Chicago

Three metro areas in the Great Lakes region have levels of reported Covid-19 cases per capita in the top ten of all large metro areas.  Detroit is third (478 per 100,000), Indianapolis is sixth (245) and Chicago is seventh (226).  Milwaukee's rate (134) is above the median as well. Other nearby metro areas have rates of reported cases at or below the median, including Cleveland (82), Grand Rapids (41), and Columbus (69).  Unlike the hard-hit cities in the Northeast Corridor, the growth rate of reported cases in Detroit (6 percent) and Indianapolis (6 percent) is slightly below the national average of 6.9 percent, while Chicago's, at 8 percent, is still higher.

 

There appear to be emerging and persistent regional differences in the spread of the Covid-19 pandemic.  In general, metro areas in the West and South have seen lower incidence of cases, and in the past several weeks, slower growth in new reported cases than the rest of the country.  Meanwhile, as described here, cities in the Northeast Corridor and several cities in the Great Lakes region have experienced the highest incidence of reported cases, and continue to experience higher than average levels of growth.  At this point, we don’t have an explanation for this regional disparity, but as our knowledge about the pandemic grows, it bears closer investigation.

Note:  This post has been revised to include data for April 13.  The original post contained an incorrect estimate of cases per capita in the New York metropolitan area.  For more information, see our daily tabulation of metro area data.

Who’s flattening the curve? Evidence from Seattle & San Jose

Seattle and San Jose had the first outbreaks of Covid-19 but now have the slowest rates of growth of any large US metro area

Their progress seems closely related to the fact that they’ve cut back on travel more than nearly every other metro area.

For the past several weeks, City Observatory has been compiling the data on reported cases of Covid-19 in the nation’s largest metro areas, and like everyone, looking for signs that we’re “flattening the curve”–reducing the explosive exponential rate of growth of the number of cases to levels that won’t overwhelm the nation’s (or any city’s) health care system.

There’s been a lot of attention focused–appropriately so–on the metro areas with the highest number of cases.  New York accounts for more cases (92,000) than any other metro area; but on a population adjusted basis, the pandemic has hit New Orleans about 60 percent harder.  Its rate of reported cases per 100,000 is 856, compared to about 523 in New York.

From the standpoint of understanding how to combat the pandemic, it may be more useful (and more hopeful) to look at metro areas that seem to have made progress in slowing the reported increase in the number of cases.  Just three weeks ago, Seattle and San Jose had the highest rates of reported Covid-19 cases of any large metros in the US.  On March 18, Seattle reported 21 cases per 100,000 residents and San Jose had 9 cases per 100,000, ranking them first and third among the nation’s large metros.

San Jose and Seattle are now slowest growing in reported cases

Today, just three weeks later, both cities have among the lowest rates of growth of the pandemic.  San Jose has the lowest rate of growth (about 5 percent on a daily basis over the past week) and Seattle the second lowest (about 6 percent daily over the past week).  San Jose has also managed to go from having the third highest rate of reported Covid-19 cases per capita, to having the 24th highest of 53 metro areas (essentially at the median).  Seattle is still well above average, but after having more cases per capita than any metro area, it now ranks seventh on this measure among large metro areas. The key has been lowering the rate of increase in the number of cases. The following chart shows Seattle (pink) and San Jose (red) compared to New Orleans, New York and Indianapolis, which have all had higher rates of increase, and have managed slower declines in daily growth.

 

It's helpful to look at the incidence (cases per 100,000) and the growth rate at the same time. Our analysis of metro area performance is distilled into this matrix, which shows the incidence of reported cases per capita (on the horizontal axis) and the rate of daily growth in reported cases over the past week (on the vertical axis).  Ideally, you want your metropolitan area to be in the lower left-hand corner of this chart (low incidence, and relatively slow growth).  San Jose arguably has the second best performance after Minneapolis on our combined measures.  Seattle, as noted, has the second lowest rate of growth.

This is evidence that Stay-at-Home is working

Why have these two cities performed so (relatively) well?  Part of the reason may be the effectiveness of the stay-at-home policies in these two metro areas.  As we examined earlier, location services company Cuebiq is using cell-phone data to measure changes in travel behavior among US counties.  We’ve compiled that data for the principal counties of US metro areas as an indicator of how much travel has declined since the advent of stay-at-home policies in March.  According to our analysis of Cuebiq’s data, Seattle (King County) and San Jose (Santa Clara County) rank number one and number two as the two counties with the biggest declines in travel compared to the typical annual volume.

Seattle and San Jose also rank near the top of the charts according to Google's parallel measure of visits to workplaces.  The two cities ranked fifth and sixth respectively, out of the 53 most populous metropolitan areas, in reducing workplace-related travel.  The strong performance of these cities probably reflects some combination of the effectiveness (and relatively early implementation) of these policies and the fact that, with strong high-tech sectors and large, well-educated workforces, it's likely that a relatively high fraction of workers were readily able to work at home.

These are some dark days in the Covid-19 pandemic.  It’s a hopeful sign that two cities that were among the first-hit by the virus have improved their relative position so much in just a few weeks. There’s still a huge amount of work to be done, but their experience suggests that limiting travel and practicing social-distancing can blunt the pandemic’s spread.

Notes

The charts and information presented here are based on published data from state health departments, aggregated by The New York Times. Please use caution in interpreting these data. It is likely that in some areas, the number of cases is under-reported due to the lack of available testing capacity, or pressing medical conditions.  There are widespread differences in the number of tests administered relative to the size of the population in each state; tests are not given randomly, and in some states may be restricted solely to persons with symptoms, likely exposure or high risk.  As a result, the ratio of reported to unreported, undiagnosed cases may vary across geography.  Moreover, changes in reported numbers of cases from day to day or week to week may reflect changes in the availability or application of testing over time, rather than the true rate of growth in the number of persons affected. The fact that some places are performing relatively better than others according to reported case data does not mean that the Covid-19 pandemic is under control, or that stay-at-home policies and social distancing are no longer needed.

 

 

Staying at home: Estimates for large metro areas

How well are “stay at home” and “shelter in place” policies working in different metro areas?

"Big data" from smartphones gives us a picture of how we're dialing back on travel in response to "stay-at-home" orders to combat the Covid-19 pandemic. We've compiled the data from Google and Cuebiq on the variations in travel behavior in the nation's largest metro areas.

  • Google reports that since mid-February, workplace visits have declined by between a third and a half in nearly all large US metro areas 
  • Cuebiq estimates that its total travel index has fallen by between 25 and 95 percent in large US metro areas, with the typical metro experiencing a decline of about 55 percent.

We’re currently analyzing these data, but have some early observations:

  • Hard hit cities (New York, New Orleans) have big travel declines.
  • Well-educated, tech-oriented cities have consistently high travel declines, possibly reflecting the ability of many workers to work remotely
  • Tourism centers (Las Vegas, Orlando) have seen steep declines in travel

Metro areas ranked by decline in travel (Cuebiq)

Cuebiq estimates how much travel has changed in each county in the US compared to the year earlier.  By its estimates, all metro areas have seen declines in its travel index.  Declines range from less than 25 percent in Virginia Beach and Jacksonville to more than 95 percent in New York, Seattle, Portland and San Francisco.  The typical large metro area has seen a decline of about 55 percent compared to the year earlier.  (We use data for the most central county in each metro area as a proxy for overall change in travel in that metropolitan area).

Metro areas ranked by decline in workplace visiting (Google)

Google estimates how much visiting to workplaces had changed between the middle of February and the end of March.  By its estimates, workplace visits have declined in all metro areas.  Declines range from as little as 33 percent in Jacksonville, Memphis and Phoenix to more than 50 percent in New York, New Orleans, and San Francisco.  The typical large metro has seen a decline of about 40 percent compared to February.  (We use data for the most central county in each metro area as a proxy for overall change in travel in that metropolitan area).

Comparing the Google and Cuebiq estimates

There are obvious differences in definitions, methodology and measures between Google and Cuebiq.  As the above summary suggests, the percentage decline in travel as measured by Cuebiq is considerably greater in magnitude than the percentage change in workplace visitation as measured by Google.  The following chart shows the estimated change in the travel index for each metro (per Cuebiq) compared to the average change in workplace visitation (per Google).

Overall, there's a modest correlation between the two measures:  cities that rank high on the Google index also tend to rank high on the Cuebiq index.  Statistically, the coefficient of determination (R²) between the two series is .19.
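For readers curious how the coefficient of determination is computed for two series like these: it's the squared Pearson correlation of the paired metro-level observations. A sketch with invented values (the actual Google and Cuebiq series yield the .19 reported above):

```python
def r_squared(xs, ys):
    """Coefficient of determination: squared Pearson correlation of xs, ys."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov ** 2 / (vx * vy)

# Illustrative (not actual) percentage changes for six metros.
cuebiq = [-95, -80, -60, -55, -40, -25]   # travel-index change
google = [-52, -45, -41, -38, -40, -33]   # workplace-visit change
print(round(r_squared(cuebiq, google), 2))  # 0.89
```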

Some Initial Findings

Hard hit cities show big declines.  New York and New Orleans show large declines on both indices.  New Orleans ranks first in cases per 100,000, and has the third largest decline in workplace visitation according to Google.  Cuebiq estimates that New York has seen a 95 percent reduction in its travel index; Google ranks it number one for reduction in workplace visitation. It seems likely that the higher level of concern in these areas, due to the prevalence of reported cases, gives people strong incentives to avoid travel.

Well-educated, tech-oriented cities seem to have high levels of travel reduction.  The tops of both the Google and Cuebiq lists are dominated by the nation's tech centers, including San Jose, San Francisco, Seattle, Portland, San Diego, Denver, and Washington.  This may reflect a high level of awareness and concern about the Covid-19 pandemic, and workers' ability and proficiency to work remotely. The lower left-hand corner of our scatter chart comparing the Google and Cuebiq estimates (which represents cities with the biggest declines on both indices) is populated by these well-educated tech centers.

The hit to tourism is apparent:  Las Vegas and Orlando are striking outliers in our analysis of the Google data, with large declines in total travel reflecting lower visitor counts directly and, indirectly, layoffs in accommodations, food service, travel, and entertainment businesses.  The industry sector with the largest declines in employment appears to be accommodations and food service; it's no surprise that metro areas heavily dependent on these industries would experience larger declines in associated travel.

These are just our initial impressions:  we'll be digging into these data in future commentaries, so stay tuned.

About the Data

This commentary draws on two sources of data:  Google's "community mobility reports" and Cuebiq's "mobility index".  Both of these reports are based on the companies' analyses of data from smartphone and other device users.  Cuebiq tracks the trips we take, and has an index of total daily travel per person (really, per device) for the nation's counties.  It's unclear whether this is a distance measure, a count of trips, or some other measure. Google has aggregated and anonymized user location data to measure (apparently) the amount of time we spend at, or the number of trips we make to, various locations.  (We say "apparently" because Google's explanation of its measures and methodology is quite vague.)  It, too, reports data for counties.

Because we focus on metropolitan areas, we used data for the central county in each large metropolitan area as an indicator for the entire metropolitan region.  Unfortunately, in our view, neither Google nor Cuebiq enables users to download their county-level data in a machine-readable format, such as CSV.  Consequently, assembling these county-level estimates requires laboriously clicking through their interfaces (Tableau for Cuebiq, PDF for Google), manually transcribing the data, and then entering it into a database for analysis.  (Our apologies if there are any transcription errors:  they could be avoided, and these data would be of far greater value, if both companies released machine-readable versions of their reports.  In the public interest, we call on them to do so at their earliest opportunity.)

Google Community Mobility Reports

Google has started publishing "Community Mobility Reports" that tap the location data from smartphones to measure the approximate number of trips we take to various destinations.  Data are available at the county level (subject to minimum data requirements) for six broad categories of destinations, such as retail, recreation, work, home, and parks.  As this screenshot for Portland's Multnomah County shows, work trips are down about 41 percent, retail trips are down more than 60 percent, and grocery/pharmacy trips are down about 30 percent compared to a pre-pandemic baseline.

 

Google produces separate estimates of the percentage change in "visiting" for each of six categories of destinations between February 16 and March 29.  Google's six categories are workplaces, retail shops, grocery and drugstores, parks, transit centers, and residences.  Google's data show universal decreases in time spent at work, in stores of all kinds, and in transit centers.  They show an increase in time spent in residences.  The pattern for time spent in parks varies across metropolitan areas (and over time, within metropolitan areas), with a wide range of increases and decreases. It's a very exciting and useful dataset, but inexplicably, Google has chosen to make it available only as a series of state-by-state PDF files, which makes it extremely tedious to link to other research.

Cuebiq Mobility Index Analysis

We examine Cuebiq's estimates of the total percentage change in its travel index, what it calls the "delta versus yearly average," comparing the current week (in this case, the week ending March 30) to a yearly average baseline (it's unclear whether the base is 2019, 2020, or some other period).  Cuebiq's estimates show a pattern of universal declines in travel for all the metro areas we examined.

 

Understanding the geography of Covid-19

What maps and charts can–and can’t–tell us about the spread of the pandemic

Since last week, when we wrote our first thoughts about the geographic spread of the Covid-19 virus, people around the globe have been doing a lot of work.  Here’s a quick synopsis of what we’ve seen.

National dashboards now have county data

Two of the leading US map resources (Johns Hopkins University and the New York Times) have both added county level data to their reporting.

The Johns Hopkins University Map of US Covid-19 infections has been expanded to allow a drill-down to county level data.

The New York Times now has a county-by-county listing of the number of Covid-19 cases for the nation.

 

At City Observatory, we computed county-level prevalence rates for virus infections in the states with the highest levels of confirmed cases.

Data current as of 19 March 2020.

We should focus on the growth rate of cases

The real issue is the slope of the line:  is it increasing or decreasing?  The conventional presentation, exponential curves of cumulative case counts, is difficult to interpret. Several analysts have instead started charting the five-day or seven-day moving average of the growth of infections. This provides the simplest, clearest indication of whether we're flattening the curve or not.

US States

Lyman Stone has, via Twitter, charted the change in the growth rate. The important thing to pay attention to on this chart is not so much the level, but the slope:  if it's heading down, that means that the rate of new cases is declining.  The bad news here is that the trend in New York, for example, is headed up.  Washington's rate is nearly flat, which is a relatively good sign.

Italy

Michele Zanini has an excellent Tableau page documenting trends in Italy.  Again, with this chart, you want to see the lines sloping downward, which they are.

 

France

Gavin Chait has mapped the incidence of cases in France by region, and computed the growth rate. (Chait’s estimates of growth rates and doubling times for cases are shown in tabular, rather than graphic form, so we haven’t reproduced them here).

High definition mapping: South Korea does it best

In our commentary last week, we said that the Covid-19 pandemic calls out for the kind of neighborhood-level geographic mapping that John Snow used in London in the 1850s to pin down the source of the city's cholera epidemic. The most detailed map of the disease anywhere comes from Korea, where address-level data on the incidence of disease is compiled by public health officials.  This map shows the locations of Covid-19 cases in metropolitan Seoul; the dots are color-coded to show the recency of diagnosis:  red dots are less than 24 hours old, yellow dots are up to four days old, and green dots are 4-9 days old.

(Hat tip to CityFix for flagging this site).

County-level Incidence Mapping

The closest we come to the Korean neighborhood-scale maps are county-level estimates of the prevalence of Covid-19. The Columbia Missourian has used Tableau with state health department data to compute the number of Covid-19 cases per 100,000 population for Missouri counties:

Maps and Charts, or Words?

ESRI's Ken Field has some very smart advice on how to make informative maps about Covid-19.  Looking at maps of China drawn almost a month ago, he warned that common mapping approaches may obscure more than they reveal:

Often, the simplest techniques, done well, provide a sound cartographic approach. The key to informing is to work with the data and to not imbue it with misguided or sensationalist data processing or symbology, and to deal with some of the cartographic problems different techniques are known for. And what are the key points? As of 24th February:

  • Hubei has 111 cases per 100,000 people (0.1% of the population);
  • everywhere else in China is less than 2.5 cases per 100,000 people;
  • for other countries reporting cases, the rate is even lower; and
  • maps mediate the message to a greater or lesser extent, and some that appear well-intentioned are often unhelpful.

Maybe words are all that’s needed? But if you’re going to make a map, think about these key aspects, pick a technique that supports the telling of that story, process the data and choose symbols that are suitable, and avoid making a map that misguides, misinforms . . .

Accurately understanding and communicating the spread of the Covid-19 virus is going to be difficult, but is essential to getting the widespread support for the measures needed to defeat this pandemic.

 

Cities and coronavirus: Some thoughts

The Coronavirus pandemic is already worse in several American states than anywhere in China outside Hubei Province

The pandemic is all about geography, and we need to do more to pinpoint hotspots and contagion

The very thing that makes cities special–their ability to bring people together–is their kryptonite in the Coronavirus pandemic

The harsh and largely unforeseen reality of Coronavirus has changed everyone’s daily lives, and promises to be a major disruption for months and years to come.

Covid-19 is a contagious viral disease, spread by close and direct contact between humans. It started in Wuhan, China, late last year, and spread rapidly throughout China in the aftermath of the lunar new year celebrations, with thousands traveling to or from Wuhan.

What do we know about the geography of Covid-19?

What we find disappointing so far is the crude geography of most of the maps of coronavirus in the US.  The real geography is not that of states or counties, but rather the particular locations–the homes, businesses, hospitals, hotels, restaurants, airplanes or cruise ships–where infected people interacted directly with the previously unaffected. These maps would provide a much more useful and accurate picture of the geography of Covid-19 if they were dot maps on a fine geography.

We know this kind of picture can provide essential insights on disease.  More than 150 years ago, in perhaps the canonical instance of geographic epidemiology, John Snow mapped the location of cholera cases in London, and quickly deduced that a particular well was the source of the outbreak.

London, 1854. It’s 2020. Where is this map for Covid-19?

None of the maps published, for example, by the New York Times, shows this level of detail.  And the Times map, with circles scaled to the number of cases, mostly resembles a map of the nation’s largest metro areas.

A similar map prepared by the World Health Organization, aggregates data at the country level.

In a way, the most helpful information in the New York Times is the list of the locations or sources of transmission of the largest number of cases.  These hotspots help us visualize where the disease has had its largest impact. The clusters in New Rochelle and in a Seattle area nursing home are apparent, as are the outbreaks in cruise ships.

 

Covid-19 is a disease of hotspots.  And understanding where the hotspots are (and where they were 6 days ago) is an essential ingredient in ascertaining who’s most at risk, and using our all too scarce diagnostic and treatment resources to the greatest effect.

The incidence of Covid-19 in US States and Subnational regions in China, Italy and Canada

The reason a finer geographic fix on the progress of the virus is so important is underscored by looking at the incidence of Covid-19 in US states, Canadian provinces and Italian regions. China’s nearly 1.4 billion people live in 34 provinces; America’s 330 million people live in 50 states (and the District of Columbia).  These are generally the finest subnational geographic units for which data are available.  We’ve used WHO data for Chinese provinces and Johns Hopkins University data for US states to compute the incidence of Covid-19 in cumulative cases per 100,000 population as of mid-March (Chinese data are for 12 March; Canadian and US data are through 17 March).  Italian regional data for 17 March are from Statista. Chinese provinces are shaded blue, US states orange, Italian regions green, and Canadian provinces red.

(To better show the differences between most states and provinces, we’ve truncated the scale at 20 cases per 100,000 population; the correct bar for Hubei province and several Italian regions would extend far off your computer screen to the right, with more than 100 cases per 100,000 population).
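The per-capita standardization behind the chart is simple arithmetic. Here’s a minimal sketch; the Hubei population figure below is our rough approximation (about 59 million), not a number taken from the chart’s sources:

```python
def incidence_per_100k(cumulative_cases, population):
    """Cumulative Covid-19 cases per 100,000 residents."""
    return cumulative_cases / population * 100_000

# Hubei: ~67,800 cases (WHO, 12 March); population ~59 million is our
# approximation. This is why Hubei's bar runs far off the chart.
print(round(incidence_per_100k(67_800, 59_000_000), 1))  # → 114.9
```

The same function applied to any state or province gives the bar lengths in the chart.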

This chart makes it clear how severe and widespread the virus has been in Italy. Lombardy reports the highest incidence of Coronavirus of any subnational region in our chart, with more than 160 cases per 100,000 population. Italian regions account for 13 of the 14 highest rates of coronavirus cases per capita among the four countries shown here. Alarmingly, the incidence of Covid-19 in eight states and the District of Columbia is already higher than in any Chinese province outside Hubei (the epicenter of the virus). The median incidence of Covid-19 in US states (.73 per 100,000) is already nearly as high as the median incidence in Chinese provinces. On a population-adjusted basis, the incidence of reported Coronavirus cases in Washington, Massachusetts and New York is currently higher than in Beijing or Shanghai. If anything, the US numbers may understate the extent of the virus, because so few persons have been tested due to a shortage of diagnostic capacity in the US. (The data underlying this chart, as well as charts showing non-truncated values for coronavirus incidence and country maps of incidence rates, are available on our Public Tableau site.)

This disparity is both a testament to the effectiveness of the Chinese travel restrictions and social distancing measures, and an indication of how much time the US has squandered; the disease first manifested in China in November, months before the first case in the US.

Chinese Cities and Covid-19

While the Covid-19 virus started in the city of Wuhan, it quickly spread to other provinces in China. Hubei province, which includes Wuhan, accounts for 67,800 of the roughly 81,000 cases of Covid-19 reported in China, and for 3,056 of 3,173 reported deaths (data as of 12 March).

When you exclude Wuhan and its surrounding Hubei Province, which together account for 83 percent of all Chinese cases of Covid-19, the Chinese have done a remarkable job in making sure that the disease did not grow exponentially elsewhere:  Here’s a chart from Thomas Pueyo, showing Covid-19 flatlining in every Chinese province outside Hubei after February 10.

And within these other provinces, the disease was also highly localized. The experience of Gansu province is instructive, and has been closely studied in a recent paper. Gansu province has a population of about 28 million, slightly less than Texas; at about 175,000 square miles, it is about two-thirds the area of Texas as well.  The research paper provides some clear insights about the geography of the virus’s spread. The authors used GIS to map the locations of identified cases, and distinguished between initial and secondary infections.


In Gansu province, nearly all of the cases were confined to the province’s largest cities, with few or no cases in outlying areas.

Our study demonstrates a significant spatial heterogeneity of COVID-19 cases in Gansu Province over this 2-week period; cases were mostly concentrated in Lanzhou and surrounding areas. LISA analysis findings are in agreement with the spatial distribution of COVID-19 at the county levels of Gansu Province. This analysis confirms that the distribution of cases was not random: hot spots were mainly restricted to the Chengguan District of Lanzhou, the most densely populated and most developed area. This case aggregation is closely associated with the development characteristics of Gansu Province, which is at the high end of economic, medical, population, and cultural development.

Again, unlike Hubei province, they had time to implement social distancing to limit the further spread of the disease.

For reference, as of 12 March, Gansu Province had recorded 127 cases and 2 deaths from Covid-19; as of 17 March, Texas had recorded 110 cases and 1 death.

Jingchun Fan, Xiaodong Liu, Weimin Pan, Mark W. Douglas, and Shisan Bao, “Epidemiology of 2019 Novel Coronavirus Disease-19 in Gansu Province, China, 2020,” Emerging Infectious Diseases, Volume 26, Number 6—June 2020 (Early Release)

Italian Cities and Covid-19

Outside China, the most severe outbreak of Covid-19 has been in Italy. As in China, while the infection has spread nationally, it is highly concentrated in a few hotspots in Lombardy. Here, health researchers have compared the experiences of two provincial cities, Lodi and Bergamo. The virus first struck Bergamo several days earlier, and consequently Lodi was able to implement social-distancing tactics earlier in the outbreak cycle.

Jennifer Beam Dowd, Valentina Rotondi, Liliana Andriano, David M. Brazel, Per Block, Xuejie Ding, Yan Liu, Melinda C. Mills, “Demographic science aids in understanding the spread and fatality rates of COVID-19,” DOI 10.17605/OSF.IO/SE6WY

Big data and infectious disease

These two studies notwithstanding, there’s a paucity of geographically detailed information about the spread and intensity of the coronavirus. This seems like an ideal opportunity to deploy the much-vaunted tech-driven big data infrastructure. Most adults in most developed countries (including China, Italy and the United States) have cell phones, and a majority of these are smartphones. Both the cell network and various web-based apps track user location (through cell triangulation, device GPS, or both). It is technically possible to use the location history of an individual device to track its user’s movements. Given the communicability of this disease, it would be useful to construct a dataset of the past couple of weeks of movements of those who have tested positive for Covid-19, to identify possible hotspots and paths of infection. This information might help prioritize testing for others with few or no symptoms as additional diagnostic capacity becomes available. We’re sensitive to the privacy concerns here, but it’s a long-established protocol for infectious diseases that the afflicted are expected to reveal to health authorities others they might have infected. In addition, the most valuable insights would come from aggregated data (i.e., identifying the common locations of multiple individuals) rather than data specific to a single individual.

Likewise, it seems like it would be of considerable value to researchers if CDC were to prepare a geo-coded database of the locations of persons diagnosed with the Covid-19 virus. Such data could be coded at a block, census tract or zip code level, to more narrowly identify the geography of the disease’s spread, without disclosing the identity of any individual. Such data would make it possible to create much more detailed, informative maps than are possible with today’s highly aggregated data.
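To make the aggregation idea concrete, here is a minimal sketch; the grid-cell approach, the reporting threshold, and the coordinates are all hypothetical illustrations, not any agency’s actual method:

```python
from collections import Counter

def to_cell(lat, lon, precision=2):
    """Snap a coordinate to a coarse grid cell (roughly 1 km at two
    decimal places), so no individual address is ever disclosed."""
    return (round(lat, precision), round(lon, precision))

def hotspots(case_points, min_cases=3):
    """Count cases per cell; suppress cells below a reporting threshold
    so isolated individuals can't be identified."""
    counts = Counter(to_cell(lat, lon) for lat, lon in case_points)
    return {cell: n for cell, n in counts.items() if n >= min_cases}

# Hypothetical, made-up case coordinates:
cases = [(45.5231, -122.6766), (45.5228, -122.6761),
         (45.5236, -122.6772), (45.9000, -122.1000)]
print(hotspots(cases))  # → {(45.52, -122.68): 3}
```

The same logic works with census tracts or zip codes in place of grid cells; the point is that only cell-level counts, never individual coordinates, would be published.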

Cities are the absence of social distance

The particular irony of a viral disease like Covid-19 is that it is so closely related to a city’s core function:  bringing people together. The flourishing civic commons that brings people from all over China to xxxxx for the Lunar New Year, or which makes cities like Seattle closely connected to a global community, are exactly the characteristics that expose them to greatest risk. (It’s little surprise that West Virginia is the last US state to be infected with Covid-19.) The strength of cities emanates from the fact that ideas, like viruses, spread easily in a dense urban environment.

The response to Covid-19, social distancing, is a signal opportunity to visualize what the absence of these connections does to our daily lives.  When we can no longer quickly, easily, frequently and serendipitously (and safely) interact with other people, the productivity and joy of urban life shrivel immediately. When cities work well, it’s because, in all their spaces, they overcome or bridge social distance. That’s true whether we’re talking public spaces and the civic commons, like parks and libraries, or whether we’re talking the nominally private spaces where we socialize and interact with others (bars, restaurants, workplaces). The reason we find social distancing so difficult, and so off-putting, is that it runs counter to so much of what makes life, especially city life, worthwhile.

The Covid-19 outbreak, and our collective response to it, are evolving quickly, and this post will be updated as our knowledge of the pandemic becomes clearer. Comments, additions and corrections are welcome. This commentary was originally posted at 9:52pm Pacific Daylight Time on 17 March 2020, and updated at 1:20 pm Pacific Daylight Time on 18 March 2020.

 

Widening I-5 at the Rose Quarter will increase greenhouse gases

Adding more freeway capacity at the Rose Quarter will add thousands of tons to the region’s greenhouse gas emissions

If you say you believe in science, and you take climate change seriously, you can’t support spending $800 million or more to widen a freeway.

SYNOPSIS:

  • Wider freeways—including additional ramps and “auxiliary lanes”—induce additional car travel which increases greenhouse gas emissions.
  • The I-5 Rose Quarter project will add approximately 33,000 vehicles per day to I-5 traffic, according to ODOT’s own estimates
  • These 33,000 vehicles will directly add 56,000 daily vehicle miles of travel and indirectly add 178,000 daily vehicle miles of travel.
  • Additional vehicle travel will directly produce about 8,400 tons of greenhouse gas emissions per year; counting induced travel beyond the project area, the total increase is roughly 35,000 tons of greenhouse gas emissions per year
  • The engineered right-of-way for the Rose Quarter project allows for eight standard freeway lanes, which would double freeway capacity in this area and further increase vehicle travel and greenhouse gas emissions.
  • Claims that widening freeways will reduce greenhouse gas emissions by reducing crashes and idling have been disproven.
This is what $800 million of fossil fuel infrastructure looks like.

Additional Vehicle Miles of Travel and Greenhouse Gases, ODOT estimates

Currently, I-5 at the Rose Quarter carries about 122,000 vehicles per day.  With the Rose Quarter freeway widening project proposed by the Oregon Department of Transportation, we estimate that traffic will increase to 155,000 vehicles per day.  This represents an increase of 33,000 vehicles per day over current levels.

I-5 North Volumes: Existing Conditions (2016) vs. Freeway Widening (2045)

Time Period            Northbound  Southbound    Total  Implied ADT
RQ Existing Conditions (2016)
  AM Peak 8AM-9AM           2,146       5,133    7,279      122,000
  PM Peak 5PM-6PM           3,360       3,639    6,999      122,000
Widened I-5 RQ Conditions (2045)
  AM Peak 8AM-9AM           4,680       5,176    9,856      148,945
  PM Peak 5PM-6PM           4,707       5,070    9,777      161,385
  Average                                                   155,165

RQ Existing: “2016 Existing Conditions,” “Mainline North of Going”; existing volumes from pages 333-340 of ODOT “Volume Tables,” dated 5-21-18.

We’ve had to compute that estimate ourselves because, as we have noted, ODOT has omitted average daily traffic figures (the most commonly used traffic volume statistic) from the project’s Environmental Assessment.  To compute average daily traffic from the hourly data in the EA, we have factored up hourly traffic to daily levels, given the current relationship between peak and total daily travel. Peak hour travel accounts for about 14 percent of daily travel; we’ve used the reciprocal of this share as our multiplier to calculate the future ADT implied by ODOT’s projections.
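That scaling step can be sketched in a couple of lines; the function name and the peak-hour share used below are illustrative assumptions for demonstration, not ODOT’s published factors:

```python
def implied_adt(peak_hour_volume, peak_share):
    """Estimate average daily traffic (ADT) by scaling a peak-hour count
    by the reciprocal of the peak hour's share of daily travel."""
    return peak_hour_volume / peak_share

# Illustrative only: a 7,000-vehicle peak hour, assuming that hour
# carries 6.5% of the day's travel (an assumed share, for demonstration).
print(round(implied_adt(7_000, 0.065)))  # → 107692
```

Changing the assumed peak share changes the implied ADT proportionally, which is why suppressing ADT figures makes the EA’s traffic claims hard to check.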

Today, according to the Environmental Protection Agency, the average vehicle emits about 411 grams of greenhouse gases per vehicle mile traveled.

Incremental Greenhouse Gas Emissions, I-5 Rose Quarter Project

Line  Item                    Direct      Indirect
1     grams per mile             411           411
2     vehicles per day        33,000        33,000
3     miles per vehicle          1.7           5.4
4     miles per day           56,100       178,200
5     grams per day       23,057,100    73,240,200
6     tons per day                23            73
7     tons per year            8,416        26,733

Notes
1 EPA estimate of greenhouse gases per vehicle mile traveled
2 ODOT estimate of increased traffic on I-5
3 Length of project (1.7 miles); average commute (7.1 miles) less project length (5.4 miles)
4 Line 2 * Line 3
5 Line 1 * Line 4
6 Line 5 / 1,000,000
7 Line 6 * 365

Conservatively, we estimate that the additional 33,000 vehicles per day traveling on just the widened 1.7 mile segment of the I-5 Rose Quarter freeway will generate an additional 56,000 vehicle miles traveled per day, and in turn, that will produce about an additional 8,400 tons of greenhouse gases annually.
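The arithmetic in the table above can be restated in a few lines of code; this simply reproduces the table’s own calculation (the EPA grams-per-mile figure and ODOT traffic estimate), not an independent estimate:

```python
GRAMS_PER_MILE = 411             # EPA average GHG per vehicle mile traveled
ADDED_VEHICLES_PER_DAY = 33_000  # ODOT estimate of increased I-5 traffic

def annual_tons(vehicles_per_day, miles_each):
    """Added daily VMT -> grams per day -> metric tons per year."""
    vmt_per_day = vehicles_per_day * miles_each
    grams_per_day = vmt_per_day * GRAMS_PER_MILE
    return grams_per_day / 1_000_000 * 365

direct = annual_tons(ADDED_VEHICLES_PER_DAY, 1.7)    # widened segment only
indirect = annual_tons(ADDED_VEHICLES_PER_DAY, 5.4)  # rest of average commute
print(round(direct), round(indirect), round(direct + indirect))
# → 8416 26733 35149
```

Rounding these gives the roughly 8,400 direct tons and 35,000 total tons cited in the synopsis.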

Moreover, we anticipate that widening the freeway in this location will induce additional automobile travel on roads connected to this section of freeway.  The 1.7 miles traveled on this segment of roadway is just a portion of typical trips. Given that the average commute trip in the Portland metropolitan area is 7.1 miles each way, we anticipate that the freeway widening will produce an additional 5.4 miles of travel elsewhere in the region for each of these trips, for a total of 178,200 additional vehicle miles traveled per day region-wide, which in turn will produce an additional 26,700 tons of greenhouse gases per year.

Combining the direct and indirect effects of additional freeway capacity on travel, the Rose Quarter Freeway widening project is likely to increase Portland area greenhouse gas emissions by more than 35,000 tons per year.

ODOT did not analyze or model the effects of induced demand

While ODOT maintains that the I-5 Rose Quarter Freeway widening project will reduce congestion, that is because it has crafted a model which, by its construction, rules out the possibility of induced demand.  ODOT’s “static assignment model” has been shown to over-estimate traffic levels in base case situations, and understate traffic volumes in “build” scenarios, with the effect that it is systematically unable to accurately predict increased traffic due to induced demand.

The modeling has two related sources of bias:  First, it assumes that in the base case, travel patterns are not influenced by roadway congestion (i.e. that travelers don’t alter trip making behavior to avoid congestion).  These models also allow predicted traffic volumes to exceed the physical capacity of roadways, something that is simply impossible, but which again, leads to over-stating base case volumes.  Second, the models fail to predict that trip-making will respond to increases in capacity.

The EA makes no mention of induced demand, the phenomenon by which increases in highway capacity in urban areas generate additional travel that leads to a recurrence of congestion at even higher levels of traffic. (A text search of both the EA and its Traffic Technical Report shows no occurrence of the word “induced.”)

In all of its analyses, the EA uses a single set of assumptions about future land use and travel demand, including the distribution of jobs and population within the metropolitan area generally, and within the Project Impact Area in particular. This analysis assumes that building (or not building) this additional freeway capacity will have no impact whatsoever on the pattern and intensity of traffic over the next two or more decades.

This approach has two effects, both of which subvert the analysis of environmental impacts and violate NEPA. In the “No-Build” scenario, levels of traffic are improperly inflated, producing much higher estimates of congestion than will actually occur. In each of the “Build” alternatives, levels of traffic are systematically understated. This bias causes the EA to mischaracterize the relative merits of the build and no-build alternatives, and therefore violates NEPA.

The phenomenon of induced demand is so well-established in the academic literature that it is referred to as the “Fundamental Law of Road Congestion.”  Add as many un-priced lanes as you like in a dense, urban environment and that capacity will elicit additional trip-making that quickly fills new lanes to their previously congested levels. In the extreme, one ends up with Houston’s 23-lane Katy Freeway, successively widened at the cost of billions of dollars, but which now has even longer travel times than before its most recent widening.

These findings hold for the Rose Quarter Project as well. Key project staff have publicly conceded that the project will not produce significant improvements in regular, daily traffic congestion, which engineers refer to as “recurring congestion.”

Induced demand is firmly established science

It is well established in the scientific literature that increased roadway capacity generates additional vehicle travel. The definitive work by Duranton and Turner estimates that there is, in the long run, a unit elasticity of miles traveled with respect to road capacity, i.e. each 1 percent increase in road capacity generates a 1 percent increase in vehicle miles traveled:

This paper analyzes new data describing city-level traffic in the continental US between 1983 and 2003. Our estimates of the elasticity of MSA interstate highway VKT with respect to lane kilometers are 0.86 in OLS, 1.00 in first difference, and 1.03 with IV. Because our instruments provide a plausible source of exogenous variation, we regard 1.03 as the most defensible estimate. We take this as a confirmation of the “fundamental law of highway congestion” suggested by Downs (1962), where the extension of interstate highways is met with a proportional increase in traffic for US MSAs.

More recently, Hymel (2019) has independently reached a nearly identical conclusion.  His analysis concludes:

These findings offer persuasive evidence supporting the fundamental law of traffic congestion, and indicate that capacity expansion is not a viable long-term solution to urban traffic congestion. Across specifications of the dynamic model that controlled for endogenous lane-mileage and state fixed effects, the within-group estimator generated long-run induced demand elasticities ranging from 0.892 and 1.063, all with very small standard errors. . . . Furthermore, results from the dynamic model suggest that after five years, induced vehicle travel is expected to grow to 90% of its equilibrium level, quickly decreasing traffic speeds on the new roadway capacity.”
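A constant-elasticity model makes it easy to see what these estimates imply. The sketch below is our illustration, with a hypothetical corridor and the elasticities reported above:

```python
def long_run_vmt(base_vmt, capacity_ratio, elasticity=1.0):
    """Constant-elasticity induced demand: VMT scales with capacity
    raised to the elasticity (Duranton & Turner's preferred estimate is
    1.03; Hymel's range is roughly 0.89-1.06)."""
    return base_vmt * capacity_ratio ** elasticity

# Hypothetical corridor carrying 1,000,000 VMT/day, expanded 10%:
print(round(long_run_vmt(1_000_000, 1.10)))        # unit elasticity → 1100000
print(round(long_run_vmt(1_000_000, 1.10, 1.03)))  # D&T estimate: slightly more
```

With unit elasticity, a 10 percent capacity expansion yields 10 percent more driving, fully offsetting the new capacity; elasticities above 1 imply the offset is more than complete.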

More comprehensive and independent reviews of the literature on induced demand have reached essentially the opposite conclusion from that asserted in the EA. These reviews include Avin, Cervero, et al. (2007), Litman (2007), Williams-Derry (2007), and Handy & Boarnet (2014).

Whether development is consistent with local land use plans or not bears no necessary relationship to whether there is induced demand. Many different levels of development (from vacant to fully allowed density with variances) are possible under any local land use plan. Asserting that the level of development is “consistent” with land use plans is a straightforward evasion of the requirement to consider the impacts of induced demand. This is simply irrelevant to determining whether there may be impacts. Local land use plans only specify the maximum amount of development that may occur in the area influenced by the project. There is a wide range of possible levels and intensities of development that are possible under these land use plans, from no development to the full maximum allowed by law.

The fundamental law of road congestion is so well known that it has long been reflected in administrative guidance for the preparation of environmental reviews of road construction projects.  The Federal Highway Administration guidelines for preparing environmental impact statements clearly instruct analysts to consider induced impacts, and specifically anticipate a different analysis for each alternative: “substantial, foreseeable, induced development should be presented for each alternative.”

  1. Environmental Impact Statement (EIS) — FORMAT AND CONTENT

  2. Environmental Consequences

 Land Use Impacts

This discussion should identify the current development trends and the State and/or local government plans and policies on land use and growth in the area which will be impacted by the proposed project.

These plans and policies are normally reflected in the area’s comprehensive development plan, and include land use, transportation, public facilities, housing, community services, and other areas.

The land use discussion should assess the consistency of the alternatives with the comprehensive development plans adopted for the area and (if applicable) other plans used in the development of the transportation plan required by Section 134. The secondary social, economic, and environmental impacts of any substantial, foreseeable, induced development should be presented for each alternative, including adverse effects on existing communities. Where possible, the distinction between planned and unplanned growth should be identified.

Federal Highway Administration, U.S. Department of Transportation, TECHNICAL ADVISORY: GUIDANCE FOR PREPARING AND PROCESSING ENVIRONMENTAL AND SECTION 4(F) DOCUMENTS, T 6640.8A
October 30, 1987 (http://www.fhwa.dot.gov/legsregs/directives/techadvs/T664008a.htm)

The FHWA has developed substantial technical resources to illustrate how induced demand can be estimated for projects such as the CRC. For example, DeCourla-Souza and Cohen document long-term demand elasticities of traffic with regard to travel time averaging -0.57 and ranging from -0.2 to -1.0. This means that in the long run, all other things being equal, a 10% reduction in travel time in a corridor would be associated with a 5.7% higher level of traffic. (Patrick DeCorla-Souza and Harry Cohen, Accounting For Induced Travel In Evaluation Of Urban Highway Expansion, 1998.) More recent estimates by Duranton and Turner (2011), and Hymel (2019)  put the long-term elasticity of traffic with respect to capacity at 1.0: an increase in capacity is exactly offset by an increase in travel.

A  review of transportation models used in estimating future demand and project benefits, including the type used in this process, concludes:

“Failure to account for indirect demand effects likely exaggerates the travel-time savings benefits of capacity expansion and ignores the potentially substantial land use shifts that might occur because of the marginal increase in accessibility provided.”
Avin, U., R. Cervero, et al. (2007). Forecasting Indirect Land Use Effects of Transportation Projects. Washington, DC, American Association of State Highway and Transportation Officials (AASHTO) Standing Committee on the Environment. (Page 5).

ODOT’s claims about GHG are false

ODOT has advanced two claims about the project’s potential for reducing greenhouse gases.  It has argued that the project will reduce the number of crashes in the corridor, and thereby lower the amount of greenhouse gases emitted when cars drive slowly. Similarly, but more generally, it has argued that by reducing congestion, the project will raise travel speeds, reduce idling and lower overall greenhouse gases.  Both of these claims have been disproven by independent research.

Non-recurring delay will not be reduced

In addition, there is no demonstrable real-world evidence that the Rose Quarter freeway widening will reduce delays associated with automobile crashes, so-called “non-recurring congestion.” Just a few years ago, ODOT widened a nearby stretch of I-5 which carries mostly the same traffic, adding a travel lane and widening shoulders (just as it proposes to do at the Rose Quarter). ODOT’s own crash statistics show that the rate of crashes on this stretch of road not only did not decrease, but actually increased in the years following the freeway widening.

ODOT’s claims that additional lanes and wider shoulders will reduce crashes rest on its assertion that it used a computer spreadsheet called ISATe to calculate probable crashes (Traffic Technical Report). However, the user manual for the ISATe model says the model is not applicable to freeway segments controlled by ramp meters. (Ramp meters regulate the flow of traffic onto the roadway and reduce the likelihood of crashes associated with merging.) Because this segment of roadway includes ramp meters, the model is not a valid basis for predicting crashes or changes in the number of crashes. See Bonneson, et al., 2012.

ODOT’s experience with I-5 suggests that widening one bottleneck at one point in the system only speeds and intensifies the process of traffic congestion at other bottlenecks in the system. For example, ODOT has made improvements to I-5 in the area north of Lombard Street, including the freeway widening project described in the previous paragraph. While this has removed some “bottlenecks” in some locations, it has funneled more vehicles, more rapidly, into others, with the result that these locations become congested sooner, and actually lose capacity. The I-5 bridges now carry about 10 percent fewer vehicles in the afternoon peak hour than they did 10 and 20 years ago (“Backfire: How widening freeways can make traffic congestion worse,” February 26, 2019, City Observatory Commentary). Similarly, an ODOT project to increase the capacity of the freeway interchange on I-5 at Woodburn apparently has resulted in no reduction in crashes, and may actually be associated with an increase in more severe crashes (and attendant delays). See “Safety Last: What we’ve learned from ‘improving’ the I-5 freeway,” March 21, 2019, City Observatory Commentary.

Claims that less congestion will reduce idling and lower greenhouse gas emissions have been disproven

Claims that the project will result in lower carbon emissions rest on the discredited theory that smoothing traffic flow and reducing idling lowers carbon emissions. That claim has been refuted by Bigazzi and Figliozzi (2010), Williams-Derry (2007), and Noland & Quddus (2006).

Also, experience has shown that carbon estimates prepared by the Oregon Department of Transportation are untrustworthy. In 2015, The Director of the Oregon Department of Transportation conceded publicly to the Legislature that ODOT had exaggerated by a factor of more than four the possible carbon emission reductions associated with certain transportation projects.

It doesn’t matter what you call the added lanes

And we don’t buy for a minute that it matters in any way that ODOT wants to call the additional lanes it’s building “auxiliary lanes.”  If the point is that the right-hand lane on I-5 at the Rose Quarter handles merging traffic, that is true whether the facility is two lanes in each direction or three.  If we apply ODOT’s logic and nomenclature to the current setup, the freeway now consists of one through lane and one auxiliary lane–and the proposed project would increase that to two through lanes and one auxiliary lane. Sophistry and shifting definitions don’t change the fact that this project adds lane miles of freeway. And more lane miles of freeway, as these calculators show, produce millions more miles of driving and thousands of tons more greenhouse gas emissions every year.

References:

Avin, U., R. Cervero, et al. (2007). Forecasting Indirect Land Use Effects of Transportation Projects. Washington, DC, American Association of State Highway and Transportation Officials (AASHTO) Standing Committee on the Environment.

Bonneson, J., Pratt, M., and Geedipally, S., (et al), Enhanced Interchange Safety Analysis Tool: User Manual, National Cooperative Highway Research Program, Project 17-45, Enhanced Safety Prediction Methodology and Analysis Tool for Freeways and Interchanges, May 2012.

Bigazzi, A. and Figliozzi, M., 2010, An Analysis of the Relative Efficiency of Freeway Congestion as an Emissions Reduction Strategy.

DeCorla-Souza, P. and H. Cohen (1998). Accounting For Induced Travel In Evaluation Of Urban Highway Expansion. Washington, Federal Highway Administration.

Duranton, G., & Turner, M. A. (2011). The fundamental law of road congestion: Evidence from US cities. American Economic Review, 101(6), 2616-52.

Federal Highway Administration, U.S. Department of Transportation, TECHNICAL ADVISORY: GUIDANCE FOR PREPARING AND PROCESSING ENVIRONMENTAL AND SECTION 4(F) DOCUMENTS, T 6640.8A
October 30, 1987 (http://www.fhwa.dot.gov/legsregs/directives/techadvs/T664008a.htm)

Handy, S., & Boarnet, M. G. (2014). Impact of Highway Capacity and Induced Travel on Passenger Vehicle Use and Greenhouse Gas Emissions. California Environmental Protection Agency, Air Resources Board.             https://www.arb.ca.gov/cc/sb375/policies/hwycapacity/highway_capacity_bkgd.pdf

Hymel, K. (2019). If you build it, they will drive: Measuring induced demand for vehicle travel in urban areas. Transport Policy, 76, 57-66.

Kneebone, E., & Holmes, N. (2015). The growing distance between people and jobs in metropolitan America. Washington, DC: Brookings Institution, Metropolitan Policy Program.  https://www.brookings.edu/wp-content/uploads/2016/07/Srvy_JobsProximity.pdf

Litman, T. (2019). Generated Traffic and Induced Travel Implications for Transport Planning. Victoria, BC, Victoria Transport Policy Institute.

Marshall, N. L. (2018). Forecasting the impossible: The status quo of estimating traffic flows with static traffic assignment and the future of dynamic traffic assignment. Research in Transportation Business & Management. https://www.sciencedirect.com/science/article/pii/S2210539517301232?via%3Dihub

Noland, R. B., & Quddus, M. A. (2006). Flow improvements and vehicle emissions: effects of trip generation and emission control technology. Transportation Research Part D: Transport and Environment, 11(1), 1-14.

Parsons Brinckerhoff, Land Use-Transportation Literature Review for the I-5 Trade Corridor Regional Land Use Committee, September 17, 2001. Pages 4-5 http://nepa.fhwa.dot.gov/ReNEPA/ReNepa.nsf/All+Documents/CCECF4D789DB510E85256CE6006142A0/$FILE/land_use_literature_review.pdf

Williams-Derry, C. (2007). Increases in greenhouse-gas emissions from highway-widening projects. Seattle, Sightline Institute.

Anatomy of a rental marketplace

A new report from the D.C. Policy Center shows the inner workings of the shadow rental market that is key to housing affordability

Too often, our debates about housing policy are shaped by inaccurate pictures of how the housing market really works. A new report from the D.C. Policy Center provides a remarkably clear and detailed picture of the rental marketplace. And it’s richer and more complicated than the usual oral tradition of housing markets allows.

The “shadow” market for rental housing. We generally assume that there are two types of housing, rental and ownership: rentals tend to be multi-family apartment buildings, while single-family homes are owner-occupied. Rentals stay rentals; owner-occupied homes stay owner-occupied, and never the twain shall meet. Except that lots of single-family homes do get rented, and some of them, even though once rented, get sold and occupied by a new buyer.  This fluid movement of homes in and out of the rental market is seldom mentioned in housing policy.  Taylor calls this the “shadow” market for housing.

The report makes two facts clear about the shadow housing market.  First, it’s a considerable part of the District’s rental housing stock.  Using detailed administrative data, Taylor calculates that there are more than 60,000 single-family homes, condominiums, flats and other small-scale rentals, which represent about a third of the District’s rental housing. Importantly, many of these units are in high-opportunity neighborhoods, so if you’re a renter looking for a better environment for your kids, the “shadow” market may be the way you access such neighborhoods.

The other fact is that housing regularly moves in and out of the shadow market. Again, by laboriously constructing a longitudinal picture of the occupancy of individual houses–something that’s simply not available in most housing statistics–Taylor computes the share of the “shadow” housing that was rented in 2006 that is owner-occupied today, as well as the share of today’s shadow housing that was owner-occupied in 2006.

Homeowners frequently move their units in and out of the rental market. One-fifth of the 87,000 condominiums and single-family homes that were owner-occupied in 2006 had become rentals by 2019. Conversely, of the 39,500 condominiums and single-family homes that were rentals in 2006, nearly 15,000 (38 percent) were, as of September 2019, owner-occupied.

Housing can and does move between these categories, in response to the incentives that owners have to rent housing versus selling it.

These two fundamentals shed new light on how we think about rent control.  If you view the number of housing units in the rental marketplace as fixed (mostly big apartment buildings, owned by corporations or real estate trusts), it’s hard to imagine that housing will be withdrawn from the rental market and occupied by its owners.  But that doesn’t hold for shadow housing.  If renting out a single-family home or condominium no longer seems like a viable or profitable proposition, the owner has lots of choices.  She (or someone from her extended family) can move into the house, or she can put it up for sale.

If you read the report carefully, you will see it puts the lie to one of the most pernicious and misleading terms in housing policy: “naturally occurring affordable housing.”  The assumption many people make is that as housing ages it automatically declines in price and becomes more affordable. That’s only true if there’s an adequate supply of housing in the face of market demand.  If–as is the case in Washington–it’s hard to build new units, and there are lots of prospective renters who can pay top dollar, it’s highly likely that investors will fix up existing units rather than allow them to decline in quality (and rent).  As Brookings economist Jenny Schuetz explained at the Atlantic earlier this year, it’s entirely possible for housing to “filter up,” reducing the supply of affordable housing.

Taylor’s report provides additional nuance for understanding how this process unfolds.  Owners of shadow-market rental homes and condos have choices about whether and how much to invest in upkeep, and what price point to seek in the rental market. They (and their extended families) are also potential occupants of the homes they own. And depending on the market, they can rent out their home as is, fix it up, occupy it themselves, or sell it to another owner-occupant.  The key point is that there’s nothing “natural” about the process by which an individual home becomes affordable (if it does). It’s all about the policy environment and the incentives.

All this is extremely salient to discussions about tightening rent control in the District of Columbia.  The District has had a modest form of rent control since the 1980s, restricting rent increases to the cost of living plus 2 percent, but with provisions to allow rent increases when apartments are vacant and when they’re rehabilitated. And the rent control doesn’t apply to newly built apartments. But there are moves afoot to reduce the allowable rent increase to just the cost of living, and to fix rents even when units become vacant.

Taylor’s report suggests that these policies could have a dramatic effect on the decisions of shadow-rental-market owners.  By reducing the economic returns to renting, rent control is likely to prompt many owners to take their units out of the rental marketplace, occupying them themselves, or selling them to new owner-occupants. And importantly, rent control is likely to stem the flow of other units into the shadow market.  As Taylor’s work shows, there’s currently a regular influx of existing homes from ownership into rental status; if that dries up, the city’s rental housing supply will shrink.

The conventional criticism of rent control is that it discourages the construction of new apartment buildings. But this report makes it clear that the effects on housing supply are more subtle and pervasive. Because of the fluidity of the shadow market, rent control can have a negative effect on the rental housing supply because it encourages some owners to take currently rented units out of the market, and also because it is likely to discourage others from entering the rental market.

The report provides, in passing, evidence for one other feature of the housing market that’s often overlooked. Most multi-family housing in cities gets built in spurts, in booms, and in between booms, very little is built. In DC there was a surge in housing construction in the twenties, and again just before and after World War II, and then a decades-long drought from 1950 through the turn of the century.  Only in the past decade has another housing construction boom occurred. The dearth of housing built from 1960 through 2000 is why the District has a shortage of “naturally occurring” affordable housing (and the bust is a good indication of why that term is so misleading).

At the height of a boom, it may seem like building will go on forever, but that’s seldom the case. It takes a unique constellation of factors (a robust local economy, low interest rates, banks and developers willing to take a risk), which may be short-lived.  The message to cities is that you have to make sure housing gets built in boom times, or it may not be built at all.

As an alternative to more stringent rent control, Taylor outlines a proposal for “inclusionary conversions” that would negotiate contracts with owners of existing rental units to maintain them at affordable rents.  The District would make payments to owners, who would be contractually obligated to provide below market rental units. The concept would help preserve the existing rental housing stock for low and moderate income households, and rather than imposing all of the costs on landlords, would spread them more broadly to the public, through tax abatements.

Finally, Taylor emphasizes a point that we think is important to communities everywhere. We increasingly expect small-scale landlords to play an important role in providing additional housing in cities, through liberalization of “missing middle” housing like duplexes, triplexes, fourplexes, and accessory dwelling units. But all these measures assume that we have willing investors and that being a landlord is a viable proposition. As Taylor writes:

. . . a substantive part of the District’s rental housing is dependent upon the willingness on the part of smaller landlords to keep their units in the rental market. Further, some of the policies the city is pursuing to increase housing supply (such as Accessory Dwelling Units or infill development) relies on convincing current homeowners to become landlords. The District’s rental housing policies, however, are generally focused on large rental apartments, and do not consider the constraints for and the capacity of smaller landlords in obtaining financing, meeting regulatory requirements, and working within the requirements of tenants’ rights laws. A broader rental housing policy that recognizes the importance of these smaller landlords in expanding the city’s housing supply would be a step in the right direction for the District of Columbia

If we’re really interested in promoting additional housing supply, and  assuring a wide range of rental options in neighborhoods throughout our cities, we should be paying much more attention to the size and fluidity of the “shadow” rental market.  This report shines a useful light on its crucial role, and is something every city should look to duplicate.

Yesim Sayin Taylor, Appraising the District’s Rentals: The Role of Rental Housing in Creating Affordability and Inclusivity in the District of Columbia (Washington: D.C. Policy Center, 2020).


Declining bus ridership is no mystery

We know what’s responsible for declining bus ridership:  Cheap gas

And now, it’s about to get worse, thanks to $30-a-barrel oil

Prices matter.

Last Friday’s New York Times has a nice data-driven article by the paper’s very smart Emily Badger and Quoctrung Bui, illustrating the decline in bus ridership in cities across the nation since 2013. It’s called “The Mystery of the Missing Bus Riders.”  As usual, they have a great Upshot graphic showing the decline:

And they explain that the decline is widespread:

Sometime around 2013, bus ridership across much of the country began to decline. It dropped in Washington, in Chicago, in Los Angeles, in Miami. It dropped in large cities and smaller ones. It dropped in places that cut service, and in some that invested in it. It dropped in Sun Belt cities where transit has always struggled to compete with the car, and it dropped in older Eastern cities with a long history of transit use.

There’s no question that bus ridership is down since 2013.  But, with due respect to the authors:  There’s no mystery here.  We know exactly “who dunnit.”  We have a smoking gun:  It was gas prices.

That’s clear when you look at the historical record.  High and rising gas prices bumped up transit ridership in the decade prior to 2013.  And the collapse of gas prices in 2014 coincides exactly with the decline in ridership.  A simple and powerful economic rationale explains what’s going on with transit ridership:  There is no mystery.  But you’ll be hard pressed to learn this in the Times article.

True, the Times article does mention gas prices:  once, in passing, with no data, in the 26th paragraph of the story.

Past research has suggested that transit riders are even more sensitive to changes in gas prices than they are to changes in transit fares. Recently gas has been cheap, and interest rates on auto loans low.

Instead, the story spends most of its time highlighting a number of other possible explanations:  the movement of young people to cities, the increasing share of the white population in some neighborhoods, the growth of Uber and Lyft, the aging of the pre-boomer population and their replacement with boomers, who have little experience with transit.  Save possibly for the advent of ride-hailing, the timing of those trends hardly coincides with the decline in bus ridership:  There wasn’t a sudden shift in demographic trends in 2014.  What did change, suddenly and dramatically, was the price of gasoline.

The data on gas prices and transit ridership

Here, we’ve plotted the relationship between gas prices and transit ridership for the nation since 2000. The blue line shows total transit ridership; the red line shows the national average price of gasoline. At the turn of the millennium, transit ridership was flat to declining. After 2004, as gas prices started rising, transit ridership rose as well. There was a brief decline in gas prices (and transit ridership) during the Great Recession, but as the economy recovered, from 2009 through 2013, gas prices remained relatively high, and transit ridership continued growing.  But, as we’ve noted before at City Observatory, there was a precipitous decline in gasoline prices in the third quarter of 2014, and that coincides exactly with the downturn in transit ridership.

In the second quarter of 2014, retail gasoline prices were more than $3.60 per gallon, and transit agencies carried about 900 million monthly riders.  In the first quarter of 2015, gas prices had fallen to about $2.10 per gallon, and ridership was down to 850 million.
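A crude two-point calculation makes the relationship concrete. This is only a sketch: the price and ridership figures are the approximate ones cited above, and the midpoint (arc) elasticity formula is a standard simplification, not something from the article.

```python
# Arc (midpoint) elasticity of transit ridership with respect to gas
# prices, from the two approximate data points cited above.
p0, p1 = 3.60, 2.10        # retail gas price, $/gallon (Q2 2014, Q1 2015)
q0, q1 = 900e6, 850e6      # monthly transit riders

pct_q = (q1 - q0) / ((q0 + q1) / 2)   # % change in ridership (midpoint base)
pct_p = (p1 - p0) / ((p0 + p1) / 2)   # % change in price (midpoint base)
elasticity = pct_q / pct_p

print(f"ridership change:   {pct_q:+.1%}")      # about -5.7%
print(f"gas price change:   {pct_p:+.1%}")      # about -52.6%
print(f"implied elasticity: {elasticity:.2f}")  # about 0.11
```

Even this rough estimate has the expected sign: ridership falls when gas gets cheaper, and both series move in the same direction at the same moment.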

Expensive gasoline explains why transit ridership was rising after 2005. 
Cheap gasoline explains why transit ridership was falling after 2014.  

May we suggest an economist’s version of Occam’s Razor here:  If you have a salient price that drops by a third or so, wouldn’t you expect that to be the principal reason for the effect you observe? There’s little question that income and demography influence transit ridership, but those are not the factors that changed abruptly in 2014. What did change was the price of driving, and cheap gas is what’s produced the sharp decline in transit ridership in the US.

And that makes this month’s cratering in world oil markets an ominous development for transit agencies. The advent of $30 a barrel oil likely means a 50 cent per gallon reduction in gas prices, which makes driving even more affordable and attractive relative to bus or train travel. If you think bus ridership trends are bad now, just wait. It’s going to get worse.


Cheaper gas: Bad for climate and safety

Gasoline prices will drop 50 cents per gallon in the next week or so, and cheap gas will fuel more bad results: more air pollution, more greenhouse gases and more road deaths

Now is the perfect time to put a carbon tax in place

Lower gas prices mean more driving, more pollution, more road deaths

While the Coronavirus has dominated the headlines, there’s been another major global development:  the collapse of oil prices. Saudi Arabia and Russia have stopped holding back their oil supplies to prop up the price of oil, and world oil prices have plummeted. A barrel of oil that cost a little bit more than $60 in early January now goes for about $32.  That, in a very predictable way, will trigger a decline in gas prices. With a slight lag, gasoline prices (red) closely follow crude oil prices (blue).

The Energy Information Administration now predicts that gas prices will drop about 50 cents per gallon, from about $2.60 last year to a little over $2.10 this summer.

Based on the lower crude oil price forecast, EIA expects U.S. retail prices for regular grade gasoline to average $2.14 per gallon (gal) in 2020, down from $2.60/gal in 2019. EIA expects retail gasoline prices to fall to a monthly average of $1.97/gal in April before rising to an average of $2.13/gal from June through August.

(EIA, March 11, 2020).

Lower gas prices stimulate more driving. As we’ve explored at City Observatory, the price elasticity of driving with respect to gas prices means that a 10 percent decline in gas prices is associated with about a 3 percent increase in driving.  That means the roughly 20 percent decline in gas prices we can expect this year will, all other things equal, lead to about 6 percent more driving.  Cheaper gas translates in a straightforward way into more air pollution and greenhouse gases, and increased driving has been the principal cause of the increase in road deaths in the past five years.
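The arithmetic behind that 6 percent figure is a simple constant-elasticity approximation; here is a minimal sketch, using the roughly -0.3 elasticity cited above.

```python
# Constant-elasticity approximation: percent change in driving (VMT)
# for a given percent change in gas prices. The -0.3 elasticity is the
# figure cited in the text (a ~10% price drop -> ~3% more driving).
ELASTICITY = -0.3

def vmt_change(price_change_pct: float) -> float:
    """Approximate percent change in vehicle miles traveled."""
    return ELASTICITY * price_change_pct

print(vmt_change(-10))  # ~3: 10% cheaper gas, about 3% more driving
print(vmt_change(-20))  # ~6: this year's expected ~20% price decline
```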

Of course, especially in the short term, all things aren’t equal. For the next few months, we’ll be dealing with the social distancing required to limit the rapid spread of the Covid-19 virus.  And it now seems likely that economic growth will slow, if not actually tip into a recession, in spite of the best efforts of policy makers to assure markets, add to liquidity, and stimulate economic activity. With luck, we’ll manage a short, “V-shaped” downturn. Lower levels of economic activity will reduce driving, traffic and pollution, at least temporarily.

But cheaper gas seems likely to persist for some time.  And as it does, its macroeconomic effects will be largely negative, according to energy economist Jim Hamilton. To be sure, consumers will have more money to spend, but the evidence from previous gas price declines (like 2014) is that it provides relatively little stimulus. Part of the reason is that lower oil prices will devastate domestic oil production, especially the fracking industry, and the job losses and decline in investment there will more than offset the stimulus from cheaper gasoline.

Time for a carbon tax

We, like most economists, have long advocated for pricing carbon as a way to reflect back to consumers the environmental costs of their decisions.  The predictable political opposition to that idea arises from the fact that no one wants to pay more for energy, particularly a gallon of gas (which is perhaps the most visible price in the US economy).  Implementing a carbon tax as oil prices are falling would cushion the blow.  A twenty-five cent per gallon carbon tax would capture something like half of the value of the decline in oil prices–and could produce $35 billion in annual revenue to support projects to fight climate change.  A carbon tax would also diminish somewhat the increase in vehicle miles traveled, air pollution, and greenhouse gases that would otherwise be triggered by cheaper gasoline. Similarly, it would serve as a valuable incentive for consumers to avoid less fuel-efficient vehicles (a purchase shift that would likely happen if gas prices stay consistently below $2 per gallon).
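The $35 billion figure is easy to sanity-check. In this sketch, the annual US gasoline consumption figure (roughly 140 billion gallons) is our outside assumption, not a number from the text.

```python
# Rough revenue check for a $0.25/gallon carbon tax. The annual US
# gasoline consumption figure (~140 billion gallons/year) is an
# outside assumption, not from the article.
tax_per_gallon = 0.25       # dollars per gallon
annual_gallons = 140e9      # approximate US gasoline consumption

revenue = tax_per_gallon * annual_gallons
print(f"${revenue / 1e9:.0f} billion per year")  # $35 billion per year
```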

It’s never easy to implement a new tax. But there’ll never be a better opportunity to implement a carbon tax than when oil prices are dropping.


Equity and Homelessness

What’s equitable about spending six times as much per homeless person in the suburbs as in the city?

The “equity” standard that’s guiding the division of revenue for Metro’s housing initiative is based on politics, not need.

Portland’s regional government Metro is rapidly moving ahead with a proposed $250 million per year program to fight homelessness.

It’s plainly motivated by the fact that the growth of homelessness in Portland is a top public concern: a third of Portland residents identified it as their #1 concern, up from just 1 percent nine years ago.  Public concern with homelessness is such that Metro fears it will have trouble getting widespread support for its $4 billion transportation initiative if something isn’t done to address homelessness.

So after saying the homeless measure could wait until after the transportation measure is put to the voters (in November), the Metro Council is moving on a hurried schedule to craft a homeless measure to appear on the region’s May 2020 primary election ballot.

It’s plainly a rush job:  the proposed ordinance creating the program was first made public on February 4, had its first public hearings on February 14, and would need to be adopted by the Council by February 27 in order to qualify for the May ballot.  The measure is very bare bones: it provides only the loosest of definitions about who is eligible to receive assistance and what the money can be spent on.

The thing it is clear about, however, is how the money will be allocated.  Each of Portland’s three counties (Clackamas, Multnomah and Washington) will split 95 percent of the funds raised region-wide in proportion to its share of the Metro district’s population. And counties will be in the driver’s seat for deciding how funds are spent within their borders.  Aside from 5 percent reserved for regional use (including administration), Metro is acting solely as the banker for homeless services.

SECTION 7. Allocation of Revenue  Metro will annually allocate at least 95 percent of the allocable Supportive Housing Services Revenue within each county based on each county’s Metro boundary population percentage relative to the other counties.

DRAFT EXHIBIT A TO RESOLUTION NO. 20-5083
WS 2-18-2020

Concern about homelessness is, perforce, an equity issue.  Those who are living on the street or in shelters are plainly the among the worst off among us, and dedicating additional public resources to alleviate their suffering and provide them shelter seems like an intrinsically equitable endeavor.

Metro’s proposed adopting resolution is outspoken about couching the entire effort as the alleviation of vast and historic wrongs.  It finds:

WHEREAS, communities of color have been directly impacted by a long list of systemic inequities and discriminatory policies that have caused higher rates of housing instability and homelessness among people of color and they are disproportionately represented in the housing affordability and homelessness crisis

(Draft Resolution No. 20-5083 WS 2/18/20)

That principle is followed up in Metro’s proposed enacting ordinance, which would require each of the counties receiving funds to adhere closely to Metro’s own statements about what constitutes equitable planning processes.  Specifically, Metro mandates that counties allocate funds in a way that redresses inequities.

A local implementation plan must include the following:

…..
2. A description of how the key objectives of Metro’s Strategic Plan to Advance Racial Equity, Diversity, and Inclusion have been incorporated. This should include a thorough racial equity analysis and strategy that includes: (1) an analysis of the racial disparities among people experiencing homelessness and the priority service population; (2) disparities in access and outcomes in current services for people experiencing homelessness and the priority service population; (3) clearly defined service strategies and resource allocations intended to remedy existing disparities and ensure equitable access to funds; and (4) an articulation of how perspectives of communities of color and culturally specific groups were considered and incorporated.

DRAFT EXHIBIT A TO RESOLUTION NO. 20-5083 WS 2-18-2020

Counties have to “remedy existing disparities and ensure equitable access to funds” within their counties. That policy doesn’t apply, however, to Metro’s allocation of funds within the metropolitan area.  That’s because 95 percent of the funds raised by Metro (after deducting its administrative costs) are to be allocated to counties based solely on population.  But by every imaginable definition of homelessness, the homeless are not distributed in proportion to the overall population.  Homelessness in all of its forms–and particularly in its most serious forms, the unsheltered living on the streets and the chronically homeless, who’ve been without a home for a year or more–is dramatically more concentrated in Multnomah County than in the two suburban counties (Clackamas and Washington).  In addition, nearly two-thirds of the region’s African American homeless and seven of eight Latino homeless persons live in Multnomah County. The county with the largest burden of caring for homeless persons of color gets far fewer resources, per homeless person, than the surrounding suburbs.

Where do the homeless live in Metro Portland?

Here we explore the data gathered in the 2019 “Point-in-Time” surveys of the homeless population in Clackamas, Multnomah and Washington Counties.  The Point-in-Time data collection effort probably understates the magnitude of the homelessness problem, but provides the best data on its location within the region, and the clearest picture of the race and ethnicity of the homeless.

We focus on two data points from the Point-in-Time survey:  the unsheltered population (people living in the streets) and the total homeless population, which includes the sheltered homeless (in shelters, missions, or temporary accommodation).  The following table shows the latest data on the populations of the three counties, and the number of persons counted as homeless in the latest (2019) Point-in-Time survey.  The first panel of the table shows the actual counts; the second panel shows the percentage distribution by county. Multnomah County constitutes 44 percent of the region’s population, but is home to about 77 percent of the region’s unsheltered homeless and about 70 percent of the region’s total homeless population.  Unsurprisingly, homelessness in Portland, as in most of the United States, is concentrated in urban centers.

Notice that the Metro ordinance provides that funds will be allocated not according to the homeless population in each county, but the total population in each county.  What this means in practice is that some counties will get much more than others relative to the size of their homeless population (and by implication, homeless problems). Multnomah County also has a higher proportion of homeless people who are “unsheltered” than Clackamas or Washington Counties. Multnomah County also accounts for three-quarters of those in the three counties who are classified as “chronically homeless.”

We’ve computed the estimated allocation of a $250 million per year program based on overall county population, as shown in the following table.  Each county gets the same share of the $250 million total as its share of total population:  Multnomah County gets 44 percent, or about $110 million, and the other counties get proportionate amounts.  We’ve also computed the amount that each county gets per homeless person and per unsheltered person. We use the relative size of the unsheltered and homeless population in each county to index need, and show how much is available in each county relative to that need.

These data show that on a per homeless person basis, Washington County gets about five to six times as much as Multnomah County, and that Clackamas County gets about two to three times as much as Multnomah County.  Per unsheltered homeless person, Multnomah County gets $54,000; while Washington County gets more than six times as much:  $355,000.
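The skew can be reproduced from the shares cited above alone. This is a simplified sketch: the $250 million pool and the 44 percent/70 percent Multnomah shares come from the text, but lumping Clackamas and Washington together as “suburbs” is our own simplification for illustration.

```python
# Population-based allocation vs. homeless-based need, using only the
# shares cited in the text. "suburbs" pools Clackamas and Washington
# Counties into one unit, a simplification.
pool = 250e6  # annual program revenue, dollars

pop_share      = {"Multnomah": 0.44, "suburbs": 0.56}
homeless_share = {"Multnomah": 0.70, "suburbs": 0.30}

funding = {c: pool * s for c, s in pop_share.items()}
# funding relative to need: share of the money over share of the homeless
need_ratio = {c: pop_share[c] / homeless_share[c] for c in pop_share}

print(funding["Multnomah"] / 1e6)   # about 110 (millions of dollars)
skew = need_ratio["suburbs"] / need_ratio["Multnomah"]
print(round(skew, 1))  # ~3.0: suburbs combined get about three times as
                       # much funding per homeless person as Multnomah
```

Using total-homeless shares for the suburbs combined yields roughly a threefold gap, consistent with the county-by-county figures in the table (two to three times for Clackamas, five to six times for Washington).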

To be clear:  this table uses a “per homeless person” measure not as an absolute indicator of spending per person.  The scope of the homeless problem overall is larger than captured by the Point-in-Time survey, and the program is intended to be broader, e.g., providing rent subsidies to keep households from becoming homeless. But we take this figure as a robust indicator of the relative need in each county, and of the resources each county has relative to that need, as identified by the most severe and acute aspect of the homelessness problem.

(Note that using county totals doesn’t correspond exactly to the provisions of the Metro ordinance.  Only the population within the Metro boundary (which excludes outlying cities in each county) constitutes the basis for the distribution.  Excluding these areas would affect both our estimates of the allocation of funds and our estimates of the homeless population in each county.  For this analysis, we have relied on published and available county totals.)

This system of allocation works to the disadvantage of persons of color.  Multnomah County accounts for more than three-quarters of the homeless persons in the region who are Black or Latino.  According to the three counties’ Point-in-Time surveys for 2019, Multnomah County accounted for 373 of the region’s 582 Black homeless (64 percent) and 648 of the region’s 743 Latino homeless (88 percent).  What Metro’s measure does is replicate, if not amplify, the racial and ethnic inequity faced by the homeless in Multnomah County, by providing vastly smaller resources per homeless person there than in the suburban counties.

If this effort is all about political coalition building, and if each county is viewed as a separate fiefdom, then slicing the revenue pie in proportion to population makes sense.  But if homelessness is really a shared regional concern, and if Metro is really a regional government–rather than just a revenue-raising and pie-slicing middleman for the counties–then it really ought to embrace the logic of its own rhetoric about equity, and about impact.

If three-quarters of the homeless are in Multnomah County, then Metro ought to spend most of the region’s resources there.  There’s no plausible regional argument for spending five or six times as much on a homeless person in Washington County as on an otherwise similar homeless person in Multnomah County.  If Metro is serious about overcoming decades of discrimination against communities of color who’ve been disadvantaged in their access to resources, then it should devote at least as much, per homeless person, to those communities as to others.

Funding in search of policy and results

Everyone recognizes that homelessness is a complex, gnarly problem.  But aside from sketching the outer boundaries of what’s permissible, and asking counties to report what they spent the money on, there’s nothing in this measure that spells out a strategy or expected results, or establishes any real accountability.  The “outcome-based” portion of the ordinance actually says nothing about outcomes.  It basically just says, “Counties, do what you think best.”

SECTION 11. Outcome-Based Implementation Policy
Metro recognizes that each county may approach program implementation differently depending
on the unique needs of its residents and communities. Therefore, it is the policy of the Metro Council that there be sufficient flexibility in implementation to best serve the needs of residents, communities, and those receiving Supportive Housing Services from program funding.

What this tells us about equity

It’s become increasingly fashionable to talk about the equity implications of public policy. For many governments, and Metro in particular, that takes the form of long lamentations about past injustices and ornately wordy but nebulous commitments to assure engagement and participation, and to hear the voices of those who’ve been disadvantaged by past policies.  But the clearest way to judge the equity of any proposal is to look past the rhetoric and see where the money is going.  And in the case of this measure, which is pitched at dealing with a problem that disproportionately affects the region’s central county, it ends up devoting a disproportionate share of resources to suburban counties.

Equity has to be more about allocating resources to areas of need and insisting on measurable results, rather than just performative virtue signaling and ritual incantations of historic grievances.  When you look at the numbers associated with this measure, you see that real equity considerations take a back seat to the lofty rhetoric.

If you’re homeless in the Portland area, it shouldn’t matter what county you live in.  But Metro’s proposed allocation system would provide vastly fewer resources per homeless person in Multnomah County. That’s particularly ironic in light of the fact that most of the persons of color who are homeless in the region live in Multnomah County.  If you’re looking to redress the resource inequities that have worked to the disadvantage of these groups, this approach doesn’t do that.

Editor’s Note:  As of this writing, it’s unclear how much money the voters will be asked to provide.  As Oregon Public Broadcasting reported:

Metro was hoping to raise $250 million to $300 million per year, but it appears the tax it proposed might raise only about half that amount.  The latest estimate is that it would provide $135 million per year.

This post has been revised to correct formatting errors.

Mapping Walkable Density

Walkable density mapped for the nation’s largest metropolitan areas

by DW Rowlands

Editor’s Note:  We’re pleased to offer this guest commentary by DW Rowlands.  DW Rowlands is a human geography grad student at the University of Maryland, Baltimore County.  Her current research focuses on characterizing neighborhoods based on their amenability to public transit and on the relationship between race and the distribution of grocery stores in the DC area. She also writes on DC transportation, history, and demographic issues for Greater Greater Washington and the DC Policy Center. Follow DW on twitter at @82_Streetcar or contact her by email at d.w.rowlands<at>gmail.com.

In a companion commentary, DW Rowlands describes a technique for adjusting density measurements to account for the connectedness of local street networks. This measurement captures the difference between actual walkable density (how many people live nearby based on how far one can walk) and straight-line or “ideal” density (how many people are nearby considering only straight-line distances).  The gap between the two measures is an indicator of how well connected a neighborhood is for people walking.  This page shows maps of each of the nation’s largest metropolitan areas, with census tracts shaded based on how closely each neighborhood’s actual walkable density approaches its ideal (straight-line) density.  Areas shaded dark blue are those where realized walkable density comes closest to ideal density; areas shaded light blue are those where the disconnectedness of the street network means that actual walkable density is dramatically less than ideal density.

Atlanta

Baltimore

Boston

Chicago

Cleveland

Denver

Detroit

Houston

Los Angeles

Memphis

New York

Philadelphia

Phoenix

Portland

San Francisco

Seattle

 

Understanding Walkable Density

A new way of measuring urban density that explicitly considers walkability

by DW Rowlands

Editor’s Note:  We’re pleased to offer this guest commentary by DW Rowlands.  DW Rowlands is a human geography grad student at the University of Maryland, Baltimore County.  Her current research focuses on characterizing neighborhoods based on their amenability to public transit and on the relationship between race and the distribution of grocery stores in the DC area.  She also writes on DC transportation, history, and demographic issues for Greater Greater Washington and the DC Policy Center. Follow DW on twitter at @82_Streetcar or contact her by email at d.w.rowlands<at>gmail.com.

Conventional density measures versus walkable density measures

Density is one of the most fundamental properties of urban areas: what makes a city different from a suburb, and suburbs different from rural areas is chiefly how many people there are, and how close they are to each other.  The fact that people in cities live and work near each other is both economically important—it makes it easier for specialized jobs and workers and stores and customers to find each other—and culturally important—it exposes people to those different from them while also making it easier to find others who share their interests, needs, and cultural traditions.

Maps of Walkable Density vs. Ideal Density for US Metro Areas

Density is also very important for transportation.  Most people are unwilling to walk much more than half a mile on a regular basis, which means that destinations—jobs, stores, transit stops, and so on—are only within “walking distance” of people within a half mile of them.  However, while it’s common to calculate the density of a city, or a metro area, by dividing the total population by the total area, this method isn’t always informative. If 80% of the people in a city live in 20% of the city’s area, then the average person is experiencing much higher population density than this average value would suggest.

A more-useful way to measure the density that people actually experience is to calculate the “population-weighted average” of population density.  The population density of each Census tract or block group is calculated individually, and the densities are averaged together, weighted by their population.  A population-weighted average shows the level of density experienced by the typical resident in his or her census tract. This means that the densities of block groups with more residents have a bigger influence on the overall value.
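As a toy illustration of the difference, consider a city of just two block groups (figures invented for illustration, not drawn from census data): the conventional measure averages over land area, while the population-weighted average reflects the density the typical resident actually experiences.

```python
# Two hypothetical block groups: (population, land area in square miles).
# Figures are invented for illustration.
block_groups = [
    (4000, 0.2),  # dense urban tract: 20,000 people / sq mi
    (1000, 0.5),  # low-density suburb:  2,000 people / sq mi
]

total_pop = sum(p for p, a in block_groups)
total_area = sum(a for p, a in block_groups)

# Conventional density: total population divided by total area.
conventional = total_pop / total_area

# Population-weighted average: each block group's density (p / a),
# weighted by the share of the population that lives there.
weighted = sum(p * (p / a) for p, a in block_groups) / total_pop

print(round(conventional))  # 7143
print(round(weighted))      # 16400
```

Because four out of five residents live in the dense tract, the weighted figure is more than twice the conventional one.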

This measure has limitations as well.  A particularly significant one is the fact that not everywhere within a certain straight-line distance of a location can be reached by walking that distance: in cities, pedestrians are generally limited to following the street network.  In a city with a traditional street grid, this doesn’t make that big a difference, though it means you have to walk further to get somewhere diagonal to the street grid than somewhere you can get to by following a single street.

Most neighborhoods in American metro areas don’t have ideal street grids, however: winding roads and cul-de-sacs force pedestrians to take indirect trips, and bodies of water, hills, freeways, industrial areas, and superblocks often pose barriers.  This reduces the number of destinations that can be reached by walking a given distance. To take this into account, I’ve developed a statistic called “Percent Ideal Walkshed” to measure the fraction of locations within a half mile of the center of a block group that are actually within a half-mile walk of it.

Calculating percent ideal walkshed as measure of walkable density

To calculate percent ideal walkshed, I began by finding the center of every Census block group in a metropolitan area.  (Here, I’ve selected four block groups in downtown Denver, outlined in green, to use as examples.) If the street grid didn’t constrain where one could walk, everywhere in the beige half-mile-radius circles would be within a half-mile walk of the centers of the block groups.

However, since pedestrians can’t fly, they do have to walk along the street grid.  The streets within a half-mile walk of the centers of the block groups are highlighted in red.  In an area with a good street grid, like the left-most block group, this “walkshed” forms a diamond shape with its tips nearly at the edge of the half-mile circle.  If the street grid is broken up by freeways, railroad tracks, or other obstacles, though, the walkshed may cover much less of the circle.

I then created buffers around these streets to convert the one-dimensional walkshed of streets into a two-dimensional area—here shown in brown—that is a half-mile walk from the center of the block group.  The better-connected the street grid is in the region around a block group, the closer the area of this brown buffer will be to that of the beige half-mile circle.

Finally, I color-coded each block group by the ratio of the area of the brown walkshed buffer to the area of the beige half-mile circle.  I call this value “Percent Ideal Walkshed,” because it measures the fraction of area within a half mile of the center of the block group—the ideal walkshed—that is within the true walkshed of the center of the block group.
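The diamond-shaped walkshed described above puts a ceiling on this ratio. As a back-of-envelope check (my own simplification, ignoring the finite street spacing and the buffer width the method actually uses): on a perfectly dense, regular grid, the points reachable within a walk of distance r form the Manhattan-distance diamond |x| + |y| ≤ r, with area 2r², so percent ideal walkshed on even an ideal grid approaches 2/π, about 64 percent.

```python
import math

r = 0.5  # half-mile walk distance

circle_area = math.pi * r ** 2  # "ideal" straight-line reach
diamond_area = 2 * r ** 2       # Manhattan-distance reach on a dense grid

percent_ideal_walkshed = diamond_area / circle_area
print(round(percent_ideal_walkshed, 3))  # 0.637
```

This is consistent with the diamond in the left-most example block group, whose tips nearly touch the edge of the half-mile circle.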

When this analysis is performed over the whole metropolitan area, it shows that the areas with the most-ideal walksheds are mostly concentrated in the core—which one would expect to be densest—but that clusters with high percent ideal walkshed values are found elsewhere, too, indicating suburban areas with good, well-connected local street grids.  We’ve mapped walkable density in 17 of the nation’s largest metropolitan areas; you can see those maps here.

One very consistent pattern in these maps is that urban cores have much higher values of percent ideal walkshed than their suburbs do.  This means that core areas—which tend to be the densest parts of a metro area—have walkable densities closer to their conventionally-measured densities than suburbs do: conventional density measures understate the walkable density of urban areas relative to suburban areas.

Metro areas ranked by walkable density

Maps of percent ideal walkshed give a convenient way to tell how much the structure of the street network reduces effective “lived” density on a local scale.  However, they don’t give a very useful sense of how good or bad a metropolitan area’s street network is overall. In calculating an average value of percent ideal walkshed, we run into the same issue we saw with average population density: what really matters is the quality of the walkshed where the average person lives.

To determine the quality of the walkshed experienced by the average resident, I calculated the population-weighted average population densities—the average population densities of block groups, weighted by the populations of the block group—and walkshed-adjusted population-weighted average population densities—the same calculation, but with each block group’s density multiplied by its percent ideal walkshed—for each of the 25 largest metropolitan statistical areas (MSAs) in the United States.


In addition to the population weighted figures, I performed the same calculations with job-weighted job density, finding the connectedness of the street network where the average job in a metro area is located.  Unsurprisingly, since jobs tend to be more concentrated in downtowns with good street grids than population is, these values are generally higher than the population-weighted figures.

The ratio of the walkshed-adjusted to non-walkshed-adjusted density for a metro area is equivalent to the population-weighted average of the percent ideal walkshed for the metro area.  Unsurprisingly, the metro areas with the highest ratios are New York, Philadelphia, Chicago, and San Francisco: all cities with dense urban cores with good grid networks. Next on the list are less-dense metro areas in the Midwest and West that have regular grid networks that extend even into their low-density suburbs, as well as Boston, which has a dense urban core with a well-connected, but non-grid, street network.  The cities that do the worst by this measure–Charlotte, Orlando, and Atlanta–are all low-density cities in the Southeast that lack significant street grid networks.
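To make the two averages and their ratio concrete, here is a minimal numeric sketch (values invented for demonstration, not drawn from the census data used in this analysis):

```python
# Illustrative block-group records:
# (population, density in people per sq mi, percent ideal walkshed).
block_groups = [
    (4000, 20000, 0.60),  # gridded urban core: walkshed close to ideal
    (1000, 2000, 0.25),   # cul-de-sac suburb: badly connected
]

total_pop = sum(p for p, d, w in block_groups)

# Population-weighted average density, and the same figure with each
# block group's density discounted by its percent ideal walkshed.
weighted_density = sum(p * d for p, d, w in block_groups) / total_pop
adjusted_density = sum(w * p * d for p, d, w in block_groups) / total_pop

# The ratio is an average of the block groups' percent ideal walkshed
# values, weighted by p * d -- so dense, well-gridded areas dominate it.
ratio = adjusted_density / weighted_density
print(round(ratio, 4))  # 0.5915
```

In this toy example the metro-wide ratio (about 59 percent) lands much closer to the urban core’s 60 percent than to the suburb’s 25 percent, because the core holds most of the population-density weight.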


The pattern here is fairly similar, but there are some differences.  Washington does significantly better in the job-weighted rankings, presumably because its urban core, where most jobs are, has a regular grid, while most of the MSA’s residents live in post-World War II suburbs with much less well-connected street grids.  (Unlike other major Northeastern and Midwestern cities, most of the Washington MSA’s population growth has happened in the automobile era, due to the expansion of the Federal government during the New Deal and World War II.)

Notably, only two metro areas have worse-connected street grids where the average job is located than where the average resident is located: Detroit and Riverside.  In Detroit, this is likely because the urban core, which has a regular street grid, has lost the vast majority of its jobs, while employment is decentralized to suburban locations.  While the urban core has also lost significant population, its job loss has been worse, and many of the residential suburban areas still have regular street grids.  

In Riverside, the effect is more likely related to the fact that the Riverside MSA is, to a significant degree, a suburb of the Los Angeles MSA. The Riverside metro area has a disproportionately large, low-density distribution industry and a large share of retail and other jobs that serve residents; such jobs are more likely to be found in car-oriented suburban areas with disconnected street grids (for example, near freeway interchanges) than are jobs in metro areas with strong downtown commercial cores.

Both the population and the job density data tell roughly the same story, however.  Just as conventional density measures understate the walkable density of urban cores compared to suburbs, they understate the walkable density of older traditional metro areas compared to newer sunbelt ones.

Appendix: Technical Notes

Walksheds were calculated with the “service area” tool in QGIS 3; all other data processing was done in R, primarily using the sf package for data analysis and the tigris, tidycensus, and lehdr packages to download data.  All calculations were done using the local state plane that contained the metro area (or the CBD of the metro area, for metro areas that extend across state plane boundaries) as calculations in a US-wide planar projection caused too much distortion to get accurate and consistent area values.

Walksheds were calculated from the point on the street network nearest to the centroid of the land portion of each block group.  All walksheds were based on a travel distance of 800 m and a 10-m tolerance for geometry gaps. Although this is narrower than the width of many roadways, it was found that using a larger tolerance caused issues in certain areas—particularly in Portland—where block sizes were very small and a larger tolerance could cause the QGIS service area tool to randomly jump across blocks.  An 80-m buffer around the walkshed line features was used to calculate the area of the walkshed.

Data Sources

My walksheds were calculated using US Census 2018 Tiger-Line street networks with roads marked as freeways and freeway ramps removed.  Unfortunately, this data does not include information on the presence or absence of sidewalks.

Population and job data were also taken from US Census products: the 2018 5-year American Community Survey estimates for population and the 2015 Longitudinal Employment-Household Dynamics data for jobs.  This jobs data includes private, state government, and Federal employees, but excludes active-duty military personnel and some civilian Federal employees in defense and national security jobs.

Formulas

Population-weighted average population density was calculated with the formula

d̄ = (1/Ptot) Σk pk dk

where pk and dk are the population and population density of the kth block group and Ptot is the total population of the metro area.

Walkshed-adjusted population-weighted average population density was calculated with the formula

d̄w = (1/Ptot) Σk wk pk dk

where wk, pk, and dk are the percent ideal walkshed, population, and population density of the kth block group and Ptot is the total population of the metro area.  The formulas for job-weighted average job density and walkshed-adjusted job-weighted average job density were the same except that job densities and numbers of jobs were used in place of population densities and populations.

Fighting Climate Change is Inherently Equitable

Happy Earth Day, Everyone!

If we care about equity, we need to make rapid progress on climate change

Equity needs to be defined by substantive outcomes, not vacuous rhetoric and elaborate process.

Ultimately equity is about outcomes, not merely process. The demonstrable results a decade or two from now have to be measurably more equitable and just than what we have today.

The overriding priority for Earth Day is taking serious action to blunt climate change. But while there’s a growing, though still far from universal, agreement that climate change is real, there’s a problem.  Many advocates are turning claims about equity into an obstacle to taking decisive action to reduce greenhouse gases.  Change is always hard, especially for the powerless and disadvantaged.  But we have to find ways to save the planet while buffering the impact on the hardest hit. Somewhat ironically, our experience with the coronavirus shows how we can achieve these twin objectives by tackling them separately and simultaneously, rather than insisting that they somehow be combined and that one be subordinated to the other.

Case in point:  Last year, Portland voters considered (and rejected) a multi-billion dollar ballot measure that was a typical example of a process that nominally simulates equity, but does nothing to address climate change.  It has the trappings of inclusion–a process with seats at the table for youth and people of color/frontline communities–and the rhetoric of equity.  It also went through a stilted and misleading exercise of classifying projects as equitable based on whether they happen to be near neighborhoods with high concentrations of low income people or people of color.  By this criterion, the original construction of I-5, which plowed through the middle of the region’s largest African American community, would have been scored as “highly equitable.”  The cumulative result of a proposed $4 billion expenditure does nothing to reduce climate change–generating, by the staff’s own estimates, a reduction in greenhouse gas emissions of five one-hundredths of one percent.  By failing in its primary task–reducing GHGs–the result is inequitable, because the continued march of climate change will bear most heavily on low income populations.

Equity has to be about more than proximity, about glowing rhetoric, and about enervating involvement processes.

From a hyperlocal perspective, the most equitable solution might seem to be to spare frontline communities from having to do anything or bear any burden. If each local planning effort prioritizes insulating its frontline community from burden or cost, above taking effective action to reduce greenhouse gas emissions, then collectively we’ll make no progress in solving our shared, global climate crisis.  As Alon Levy has persuasively argued, putting “rebuilding trust” ahead of taking action on the street is a self-defeating strategy:

There’s an emerging mentality among left-wing urban planners in the US called “trust before streets.” It’s a terrible idea that should disappear, a culmination of about 50 or 60 years of learned helplessness in the American public sector. . . . The correct way forward is to think in terms of state capacity first, and in particular about using the state to enact tangible change, which includes providing better public transportation and remaking streets to be safer to people who are not driving. Trust follows – in fact, among low-trust people, seeing the state provide meaningful tangible change is what can create trust, and not endless public meetings in which an untrusted state professes its commitment to social justice.

And that will be the most inequitable outcome of all, because as everyone has stipulated, the frontline communities will bear the brunt of the costs associated with climate disruption.

Our community can’t do anything effective to reduce GHGs because it would have a disproportionate impact on our frontline communities.  Somebody else, somewhere else, ought to bear the burden of solving this problem.

But that risks being a recipe for universal inaction, or a prescription for performative but largely ineffectual policies.  When asked to come up with an example of Portland’s future climate policies, the city’s planning director highlighted a potential future mandate for electric car charging in new multi-family buildings. That ignores the fact that a vanishingly small fraction of low income people live in new apartments, can afford electric cars, or even own cars, and that parking mandates have been shown to drive up housing costs, reduce affordability, and encourage sprawl and car-dependent development.

Equity advocates make a powerful, persuasive and true case that the effects of climate change are disproportionately felt by the poor and people of color.

In an important sense, if you’ve got enough income, you are likely to be able to escape, avoid or mitigate many of the personal negative effects of climate change. You can move to a state or neighborhood that is far from rising seas, or wildfires, or unbearable heat. It’s always been the case that people with more income use it to buy nicer places to live, which is the main reason why you find nicer parks, more tree cover, lower crime and better air quality in high income neighborhoods.  Neighborhoods that don’t offer those amenities lose people who have the income to move elsewhere.  The result is that low income people end up in housing that is less pleasant, has fewer natural amenities, has higher crime, and is more likely to induce asthma–and is more vulnerable to climate change.

Things we do to reduce global levels of pollution have disproportionate benefits for the poor.  Consider eliminating lead from gasoline.  There’s an increasingly impressive body of evidence that points to the serious cognitive and behavioral effects of lead air pollution. When the federal government phased out lead as a gasoline additive in the 1980s, it had measurable effects on the school achievement and crime levels in cities around the nation, and particularly benefited kids in low income neighborhoods.  It was an inherently equitable strategy.

Yet banning lead also produced a regressive increase in the price of gasoline.  Oil companies used lead as an “anti-knock” additive because it was cheaper than blending higher octane fuels that didn’t cause pre-ignition (knocking).  Estimates are that banning lead probably drove up fuel prices about two cents a gallon or so, and as we all know, a price increase is regressive, because it bears more heavily on the poor than the rich.  But would anyone argue that it would have been more equitable, especially to the poor, to keep lead in fuel so that gas would continue to be cheap?

Those who are most vulnerable are the poor, especially the globally poor, who lack the resources to adapt or escape the effects of climate change.  A world in which we fail to slow or reverse climate change is a world that is in every meaningful sense more inequitable than the one we live in today.

There’s an important logical implication of these facts:  Strategies to reduce climate change are inherently equitable.  Some rich people may be indifferent to a world that is 2 degrees centigrade warmer than today’s; hundreds of millions of poor people aren’t–they will be inescapably worse off.

Helping victims is a separate task from innovating solutions

We have to distinguish the fundamentally different tasks of finding solutions and easing the burden of victims.  While the globally poor and communities of color are particularly vulnerable to the effects of climate change, that fact doesn’t imbue them with any special wisdom about the solution to the problem.

Here’s an analogous situation we’ve all lived through.  Consider the coronavirus (Covid-19).  It disproportionately affects people in infected locations, and like most viruses, is especially dangerous to the elderly and those with fragile lungs and immune systems. Yet the Pfizer and Moderna vaccines did not  emerge from the personal knowledge and experience of those victims. Nothing about being a victim necessarily qualifies one to design a solution.

In fact, many of the immediate steps we need to take to minimize the spread and severity of coronavirus impinge directly on the well-being of victims.  We quarantine them.  And for the most part people in quarantine understand and agree that the personal discomfort and risk that quarantine poses for them is more than outweighed by the social good of limiting the spread of the disease. But surely no one believes that the decision about whether to quarantine the passengers and crew of a cruise ship for an additional two weeks is best made by a vote of those on board.

That’s not to say that we shouldn’t prioritize and generously aid victims. The pandemic provides another lesson about separate tasks of helping innocent victims while aggressively pursuing solutions. The US government has approved multiple trillion dollar aid packages, including a range of direct payments, forgivable business loans, extended unemployment insurance and other measures, recognizing that no one should be forced to bear the costs and dislocation caused by the need to fight the pandemic by throttling large parts of the economy.  But these efforts were independent of the effort to develop vaccines and promote social distancing.

Harking back to our example of lead in gasoline:  It’s unclear whether focus groups addressing chronically low achievement and high crime rates in urban neighborhoods in the United States in the 1970s would have identified reducing the lead content of gasoline as a high priority strategy.  For complex global problems, victimhood isn’t a substitute for science.

The key criterion for judging climate strategies has to be whether they are effective. An ineffectual strategy arrived at by a “just” process does not advance the cause of equity.

Climate change is not somehow the unique product of the continuing inequities in our society. While there’s little doubt that racism and poverty amplify and concentrate the negative effects of climate change, it is also true that vibrant, highly equal social democracies like those of Western Europe face exactly the same technical, organizational and economic challenges that the US does in fashioning and implementing climate change policies. Even in a world with a perfectly equitable distribution of income and an absence of racism, we would face the same challenge of figuring out how to reduce the level of carbon in the atmosphere.

None of this is to say that we shouldn’t be sensitive to the negative effects on the poor of strategies implemented to reduce greenhouse gas emissions. That’s why economists overwhelmingly favor some version of the carbon tax-and-dividend or cap-and-dividend approach to reducing greenhouse gas emissions. A well constructed carbon tax would provide a direct method of compensating vulnerable populations who bear a disproportionate share of the impact of climate change. And unlike piecemeal and performative steps, it would provide the scale of resources needed to meaningfully mitigate the burdens of solving this shared global problem.

 

Lying about safety to sell freeway widening

ODOT’s lies about safety at the Rose Quarter are so blatant they can be seen 400 miles away.

Freeway widening isn’t about deaths or injuries, but “motorist inconvenience” according to this safety expert, making this $800 million project an egregious waste of funds

Traffic safety is a real issue, and by any objective measure, Oregon is failing badly. Between 2013 and 2016 traffic deaths in the state increased by more than 50 percent.  Car crashes kill hundreds of Oregonians each year.

Those grim statistics rightly make safety a primary concern in transportation planning. But they’ve also led the Oregon Department of Transportation to hijack that concern, perverting worries about safety to support spending hundreds of millions of dollars on a project that has, ultimately, nothing to do with safety.

This case is so egregious that it’s visible a full state away, in Boise, Idaho.

Don Kostelec is one of the sharpest voices for transportation safety in the US.  A traffic engineer, he consistently points out the flaws in our current system of designing roads to optimize vehicle speeds and throughput.  He researches, practices and writes tirelessly on the subject; his “Twelve Days of Safety Myths,” published by Strong Towns, is an insightful exposure of the biases of many current industry practices.

Last March, Kostelec was an expert presenter for Boise’s citizen planning academy, a training course that helps citizens become more knowledgeable about a range of land use, transportation and housing issues. In his presentation, “Is Congestion Really the Problem?” Kostelec singled out Portland’s Rose Quarter Freeway project as a classic example of a state Department of Transportation using lies about safety to sell a project designed to make cars move faster.

As we’ve noted at City Observatory, the Oregon Department of Transportation has loudly (and dishonestly) claimed that I-5 at the Rose Quarter is the state’s “#1 crash location.”  As Kostelec notes, that sounds pretty scary to the public and most officials:

The Oregon DOT . . . says that part of the project need to spend $450 million to add some amplified on and off ramps  is due to safety issues on I-5, and it says that it has the highest crash rate in the state. And you’re a policy-maker, you’re a legislator, you’re a planning and zoning commissioner, and you hear the state DOT saying we have the highest crash rate on the state system–we have to do that, right? We’ve got to do something?

Well, you’ve got to go to the Appendix . . .

And Kostelec takes his audience to the Appendix, which looks like this:

A slide from Don Kostelec’s March 2019 Presentation.

As Kostelec points out, according to ODOT’s own data for the most recent five-year period, none of the crashes in the project area have been fatal, or even serious.  Nearly all of the crashes are non-injury fender benders.

Kostelec digs deeper:  Despite ODOT’s claims that peak hour traffic is slowed by these fender-benders, ODOT’s data shows that most crashes actually happen during non-peak hours.

“. . . when you look at when crashes are occurring, for the most part, they’re occurring mid-day,  not at the time of day that the traffic models and things are trying to address.”

Because most of the crashes happen when the freeway is not jammed, widening the freeway is unlikely to do anything to reduce the number of crashes. If anything, as Kostelec argues, faster traffic is likely to increase the severity of the crashes that do occur.

The key takeaway here is that we ought to care about lives lost and injuries sustained, not the raw number of crashes. By that standard, the I-5 Rose Quarter project isn’t about safety; it’s really about motorist convenience. Kostelec:

“What are we trying to do here?  We’re trying to prevent a bunch of no-injury crashes.  Now nobody wants to be in a fender bender.  I have a minor ding.  But that’s what I call a motorists inconvenience crash . . . and we’re proposing to put $500 million into this . . . good for you Oregon DOT.  You’ve done a hell of a job with safety, . . .

Designing our roads–and indeed, re-engineering our entire urban environment–for the speed and convenience of drivers is what has produced our current lethal transportation system. It’s shocking and perverse that highway engineers at the Oregon Department of Transportation can use the mantle of “safety” to peddle an $800 million project that addresses no real safety need, at the same time the agency routinely pleads poverty when asked to fix the many multi-lane arterial streets in the Portland area that routinely kill and maim our citizens.

Editor’s note:  $450 million to $500 million was the range of ODOT estimates of project costs when Kostelec gave his talk in March 2019; to almost no one’s surprise that has gone up, and ODOT now estimates the project could cost between $700 and $800 million, and buildable freeway covers that some community leaders say are essential to the project could cost another $200 to $400 million.

 

 

Memo to the Oregon Transportation Commission: Don’t Dodge

Climate change? Not our job.  We’re just following orders.

The Oregon Transportation Commission is on the firing line for its plans to build an $800 million I-5 Rose Quarter freeway widening project in Northeast Portland. There’s been a tremendous outpouring of community opposition to the project:  more than 90 percent of the 2,000 comments on the project’s Environmental Assessment have been in opposition (a fact conveniently omitted from ODOT staff’s recitation of its outreach process).

The commission has lately had to endure an onslaught of commentary from young climate activists who–quite correctly–recognize that freeway widening is exactly the opposite of the kind of transportation investment we need to make if we’re to do anything to reduce greenhouse gas emissions.  Oregon’s global warming commission has pointed out that transportation emissions are increasing, and are now the largest single source of greenhouse gases in the state.  And it’s a demonstrated fact that wider freeways induce more traffic, longer trips and more emissions, increasing greenhouse gases.  (It’s worth noting that in just the past four years, per capita greenhouse gas emissions from transportation in the Portland area have increased by 1,000 pounds per person–so whatever we’re doing now is taking us rapidly in the wrong direction.)  The people testifying at the Commission’s December 2018 and January 2019 meetings have offered passionate, informed testimony on the folly of the freeway widening project.

The commission has sat, mostly silently, through these pointed critiques.

In the end, after listening to the testimony, they shrugged, and basically said:  “It’s not our decision–the Legislature told us to build this road, so we will.”  Here’s OTC Chairman Bob Van Brocklin:

We have been mandated by the Legislature to design and build a project, so these are decisions the Legislature and we are not in a position to override the Legislature’s decision on that.  So we have a Legislative mandate to design and build this project. We now, today, are forwarding to the Legislature, at their request, in the bill, HB 2017,  a Cost to Complete estimate that is substantially higher than the amount of money they have dedicated to the project . . .

We’re just following orders.

The claim that the decision is out of their hands rhetorically allows the OTC to absolve itself of responsibility for the climate implications of its actions.  The commission simply isn’t going to listen to climate concerns, or other objections, because (it says) the Legislature has already decided to build the project.  The commissioners are in a tough spot: they lack a plausible direct response to either the emotional or rational aspects of the climate critique of the freeway widening project. So instead of answering these questions, the plan is to dodge them, by claiming the decision is out of their hands.  In their view, climate change concerns, no matter how serious, simply aren’t an admissible argument.

Time and again, through history, this “helpless subordinate” act has been proffered by people looking to avoid taking responsibility for the moral implications of their actions.

If the transportation commission were a row of five junior clerks, they might have a plausible argument. But they’re not.  In Oregon’s citizen commission form of government, Transportation Commissioners are supposed to represent the public’s interest.

Nothing in the statute creating the commission or the law authorizing funding for this particular project prevents the Commission from reporting back to the Legislature that the project is no longer in the public’s interest, for any one of many reasons: it’s going to cost vastly more than the Legislature was told, it’s going to further damage the neighborhood severed by the freeway decades ago, it’s going to unacceptably worsen pollution, it’s going to undercut the state’s adopted legal mandate to reduce greenhouse gas emissions.  Hell, if they were being honest, they’d also tell the Legislature that the freeway widening project isn’t going to work to reduce congestion either.

That’s what commissions are for:  to dig deeper into the details than legislators have time to do in the press of a legislative session. The commission can bring additional information and advice back to the Legislature. Arguably this is exactly what was anticipated in the passage of House Bill 2017.  It specifically mandated a legislative check-in on complicated, controversial and expensive mega-projects like the Rose Quarter freeway widening.  In addition, there’s another legislative mandate, one that predates the authorization for this project, which calls for the state to reduce its greenhouse gas emissions by 80 percent by 2050; that counts as legislative direction as well.  And when legislative mandates are in conflict, and when new information becomes available, it’s incumbent on commissioners to advise the Legislature accordingly, and not blindly follow outdated, unwise and contradictory orders.

Rather than plugging their ears and ignoring the outpouring of citizen opposition, and pretending to be helpless minions of the Legislature, the citizen commissioners of the Oregon Transportation Commission should do their jobs, and reflect back to the Legislature the serious problems that have been identified with this project.

In the bag: Pricing works

Denver’s new bag fee is another object lesson in how to use economics to achieve environmental objectives.

Now do it for greenhouse gases

Starting this month, you’ll have to pay 10 cents for each disposable paper or plastic bag you fill with groceries in Denver.  The requirement goes statewide in 2023, under a Colorado law that is similar to provisions adopted in other cities (London and Chicago have bag fees) and other states.  Last year, Oregon adopted a law that mostly bans single-use plastic grocery bags and requires grocery stores to charge a 5 cent fee for every paper bag they provide to customers. These rules all harness economic incentives to encourage consumers to remember to bring their own re-usable shopping bags to the store.

There are a lot of reasons to dislike single-use grocery bags: they are a form of waste that is obvious to almost everyone, and bags, especially the plastic ones, are a highly annoying and visible source of litter. Plastic bags also jam up the sorting machines in recycling plants; Portland’s Metro asked consumers to put bags in the trash, rather than recycle them.  If the experience of other places that have implemented bag fees (like Chicago and London) is any indication, Denver can expect a pretty dramatic reduction in the number of single-use bags shoppers take. England’s 5 pence fee on plastic bags reduced the number of bags by almost 90 percent in a few years.  It’s a small, but potent example of how getting the prices right can quickly, and mostly painlessly, help us achieve an environmental objective.  It’s a parable to embrace.

It’s not like we haven’t tried this before.  Many states have bottle deposit laws that provide the same kind of smart economic nudge. The bag fee comes on the heels—well, half a century—after Oregon enacted its first-in-the-nation bottle bill in 1971. Requiring a mandatory deposit on beer and soda bottles and cans produced a dramatic reduction in litter, and led to widespread recycling. The state has a smart performance-based system for setting deposits:  when recycling rates failed to meet established goals, the bottle deposit was raised from a nickel to a dime.  About a dozen states have similar laws now, but most states, including Colorado, haven’t adopted a bottle bill.

Tokenism and Cognitive Bias

But while the bag fee and its cousin, the bottle deposit, are homely, teachable examples of environmental economics, they also carry a deeper and darker message about the challenges we face in tackling our larger environmental problems.  The key question is:  if we’re easily able to adopt economic incentives for bags, bottles and cans, why can’t we apply this same tool to the really big problems, like climate change?

The answer has a lot to do with our distinct cognitive biases:  we are aware of, and pay attention to, some things, and are oblivious to others.  Almost everyone shops for groceries regularly.  Everyone encounters litter (it is visible and annoying).  Everyone throws away (or recycles) single-use bags, cans and bottles.

Small scale recycling efforts are a kind of environmental ablution:  we go through a ritual to demonstrate to ourselves (and others) our moral commitment to saving the planet. These strategies have real, but small environmental benefits.  Trash is ugly and annoying, and often a hazard to wildlife, but it is not the reason the polar ice caps are melting.

Our cognitive bias stems from the fact that carbon emissions, particularly from cars, are invisible, odorless and tasteless; they do not accumulate locally, but rather disperse globally.  If cars deposited a charcoal briquet every hundred meters or so (which is roughly the amount of carbon they emit), the vast piles of carbon clogging our streets would immediately prompt us to clean them up, and to ban internal combustion engines (just as we no longer tolerate horse manure in the streets, and expect people to clean up after their dogs). But because carbon is invisible and dispersed globally, we don’t care.
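
The briquette image can be sanity-checked with a little arithmetic. This is a rough sketch using figures that are not in the article: EPA’s oft-cited figure of roughly 404 grams of CO2 per mile for an average passenger car, and a briquette weight of roughly 25 grams.

```python
# Rough check of the "charcoal briquette every hundred meters" image.
# Assumed inputs (not from the article): ~404 g of CO2 per mile for an
# average passenger car (EPA's commonly cited figure); a ~25 g briquette.
CO2_G_PER_MILE = 404
METERS_PER_MILE = 1609.34

co2_per_100m = CO2_G_PER_MILE / METERS_PER_MILE * 100  # grams of CO2 per 100 m
carbon_per_100m = co2_per_100m * 12 / 44               # carbon is 12/44 of CO2 mass

print(f"{co2_per_100m:.0f} g CO2 per 100 m")      # ~25 g: about one briquette's worth
print(f"{carbon_per_100m:.0f} g carbon per 100 m")
```

Counting the full mass of CO2, the image is about right; counting pure carbon alone, it’s closer to a briquette every few hundred meters. Either way, the piles would be impossible to ignore.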

So, ironically, we’re very good at small-scale tokenism, like the ban on plastic straws in San Francisco.  It gives a simulacrum of sacrifice, but makes almost no difference to the larger environmental catastrophe we face. Yet it gives us as individuals, and the leaders who enact such policies, the impression that we’re doing something. Too often, we engage in Neiman Marcus environmentalism:  conspicuous displays of “green” consumerism. That may be great for assuaging personal guilt, but does little or nothing to resolve the larger social problem of climate change.

The bag fee is 10 to 20 times higher than a carbon tax

Colorado’s 10 cent bag fee works out to a little less than 2 cents per 10 grams.  A kraft paper bag weighs about 55 grams. That means that consumers are paying a fee of about 80 cents per pound for their paper bag (454 grams * .18 cents per gram).  On a metric ton basis, that means consumers are paying a fee of about $1,800 per ton.

To put that in context, most of the commonly forwarded ideas for a carbon tax suggest that a carbon fee of $50 to $100 per ton would lead to a dramatic reduction in greenhouse gas emissions.

Disposable shopping bag fees ask consumers to pay a fee that is by any reckoning about ten to twenty times higher than the fee we ought to be charging for carbon pollution.
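
The arithmetic behind this comparison can be laid out explicitly. This sketch just restates the figures above (a 10 cent fee, a roughly 55 gram kraft bag, and the $100-per-ton upper end of commonly proposed carbon taxes):

```python
# Implicit per-ton price of Colorado's 10-cent bag fee, versus a carbon tax.
FEE_CENTS = 10              # fee per disposable bag
BAG_GRAMS = 55              # approximate weight of a kraft paper bag
GRAMS_PER_TON = 1_000_000   # grams in a metric ton
CARBON_TAX_PER_TON = 100    # upper end of commonly proposed carbon taxes ($)

cents_per_gram = FEE_CENTS / BAG_GRAMS                  # ~0.18 cents per gram
dollars_per_ton = cents_per_gram * GRAMS_PER_TON / 100  # convert cents to dollars

print(f"${dollars_per_ton:,.0f} per metric ton")   # ~$1,818
print(f"{dollars_per_ton / CARBON_TAX_PER_TON:.0f}x a $100/ton carbon tax")  # ~18x
```

The same calculation with Oregon’s 5 cent fee yields roughly $900 per ton, or about nine times a $100-per-ton carbon tax.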

If we can charge a bag fee, we can implement some form of carbon pricing. If we do, we’ll discover that consumers, producers and the overall economy can adapt quickly.


Bags, bottles and cans: Pricing works

Oregon’s new mandatory bag fee harnesses market forces to promote environmental objectives

Now do it for greenhouse gases

On January 1, a new law went into effect in Oregon that mostly bans single-use plastic grocery bags and requires grocery stores to charge a 5 cent fee for every paper bag they provide to customers. The law harnesses economic incentives to encourage consumers to remember to bring their own re-usable shopping bags to the store.

Oregon actually has long experience with this kind of smart economic nudge. The bag fee comes on the heels–well, half a century–after Oregon enacted its first-in-the-nation bottle bill in 1971. Requiring a mandatory deposit on beer and soda bottles and cans produced a dramatic reduction in litter, and led to widespread recycling. The state has a smart performance-based system for setting deposits:  when recycling rates failed to meet established goals, the bottle deposit was raised from a nickel to a dime.

There are a lot of reasons to dislike single-use grocery bags: they are a form of waste that is obvious to almost everyone, and bags, especially the plastic ones, are a highly annoying and visible source of litter. Plastic bags also jam up the sorting machines in recycling plants; Portland’s Metro asked consumers to put bags in the trash, rather than recycle them.  If the experience of other places that have implemented bag fees (like Chicago and London) is any indication, Oregon can expect a pretty dramatic reduction in the number of single-use bags shoppers take. England’s 5 pence fee on plastic bags reduced the number of bags by almost 90 percent in a few years.  It’s a small, but potent example of how getting the prices right can quickly, and mostly painlessly, help us achieve an environmental objective.  It’s a parable to embrace.

Tokenism and Cognitive Bias

But while the bag fee is a homely, teachable moment for environmental economics, it also carries a deeper and darker message about the challenges we face in tackling our larger environmental problems.  The key question is:  if we’re easily able to adopt economic incentives for paper bags, why can’t we apply this same tool to the really big problems, like climate change?

The answer has a lot to do with our distinct cognitive biases:  we are aware of, and pay attention to, some things, and are oblivious to others.  Almost everyone shops for groceries regularly.  Everyone encounters litter (it is visible and annoying).  Everyone throws away (or recycles) single-use bags, cans and bottles.

Small scale recycling efforts are a kind of environmental ablution:  we go through a ritual to demonstrate to ourselves (and others) our moral commitment to saving the planet. These strategies have real, but small environmental benefits.  Trash is ugly and annoying, and often a hazard to wildlife, but it is not the reason the polar ice caps are melting.

Our cognitive bias stems from the fact that carbon emissions, particularly from cars, are invisible, odorless and tasteless; they do not accumulate locally, but rather disperse globally.  If cars deposited a charcoal briquet every hundred meters or so (which is roughly the amount of carbon they emit), the vast piles of carbon clogging our streets would immediately prompt us to clean them up, and to ban internal combustion engines (just as we no longer tolerate horse manure in the streets, and expect people to clean up after their dogs). But because carbon is invisible and dispersed globally, we don’t care.

So, ironically, we’re very good at small-scale tokenism, like the ban on plastic straws in San Francisco.  It gives a simulacrum of sacrifice, but makes almost no difference to the larger environmental catastrophe we face. Yet it gives us as individuals, and the leaders who enact such policies, the impression that we’re doing something. Too often, we engage in Neiman Marcus environmentalism:  conspicuous displays of “green” consumerism. That may be great for assuaging personal guilt, but does little or nothing to resolve the larger social problem of climate change.

The bag fee is 10 to 20 times higher than a carbon tax

Oregon’s 5 cent bag fee works out to a little less than 1 cent per 10 grams.  A kraft paper bag weighs about 55 grams. That means that consumers are paying a fee of about 40 cents per pound for their paper bag (454 grams * .09 cents per gram).  On a metric ton basis, that means consumers are paying a fee of about $900 per ton.

To put that in context, most of the commonly forwarded ideas for a carbon tax suggest that a carbon fee of $50 to $100 per ton would lead to a dramatic reduction in greenhouse gas emissions.

Oregon’s bag fee asks consumers to pay a fee that is by any reckoning about ten to twenty times higher than the fee we ought to be charging for carbon pollution.

If we can charge a bag fee, we can implement some form of carbon pricing. If we do, we’ll discover that consumers, producers and the overall economy can adapt quickly.


Freeway deja vu all over again: The freeway builders ignore school kids

The Oregon Department of Transportation has a decades-long tradition of ignoring Portland Public Schools when it comes to freeway projects

So here’s our story so far.  The Oregon Department of Transportation, ODOT, is proposing to spend half a billion dollars widening a mile-long stretch of freeway in Portland adjacent to the Rose Quarter.  We’ve chronicled the numerous objections to the project at City Observatory:  it won’t reduce traffic congestion, it will generate additional driving (and greenhouse gas emissions), it will lower the quality of life in the central city, it does nothing to address the carnage on ODOT roadways, and it raises a host of other concerns.

A key problem with the project is that it expands the footprint of the I-5 freeway into the school grounds of the Harriet Tubman Middle School. As we’ve related at City Observatory, the school (which predates the freeway by a decade) bears the brunt of air pollution, which has increasingly been shown to affect student achievement. (It’s also monumentally unfair that cash-strapped Portland Public Schools have had to spend millions to filter the school’s air enough to make it breathable for students).  Suffice to say, Portland Public Schools has raised serious concerns about the project.

Like thousands of other Portlanders, the school board has asked ODOT to prepare a full-scale environmental impact statement, one which would fully disclose the project’s health and environmental impacts and fully consider alternatives. For a time, it appeared that ODOT would acquiesce to these demands.

Deja vu: ODOT built I-5 right next to the Tubman School in 1962, and now plans to widen it to be even closer.

But now, it appears that ODOT plans to plow ahead based solely on its flawed, short-form Environmental Assessment, ignoring the school board’s concerns.  As Blair Stenvick of the Portland Mercury reports, PPS officials are angry. They’ve adopted a resolution calling out the Oregon Transportation Commission for moving forward without a careful look at the serious issues that have been raised:

“At this time,” reads the PPS resolution, “the OTC has privately stated that it plans to unilaterally take action at its December 17 public meeting without addressing any of the troubling and significant impacts that the widening will have on students and community health.”

To anyone familiar with the history of the I-5 freeway, this shouldn’t come as any surprise.  This is exactly what the Oregon State Highway Department (what ODOT used to be called) did to Portland Public Schools 60 years ago when this segment of the I-5 freeway was first built.  Back when it was just a line on the map, it was called the “Minnesota” freeway, because its proposed route essentially obliterated Minnesota Avenue.  The I-5 route sliced through the schoolyard of the Eliot School (shown above in red), bisected the attendance areas of several other schools, closed off dozens of city streets, and focused traffic on others that were prime travel routes to local schools. Not surprisingly, local school district officials were alarmed. But the planning process gave them little opportunity to voice their concerns.

Let’s turn the microphone over to Eliot Henry Fackler, whose 2009 University of Oregon master’s thesis, “Protesting Portland’s Freeways:  Highway Engineering and Citizen Activism in the Interstate Era,” chronicled the freeway fight.  Just as today, in the early 1960s, PPS officials raised serious concerns about the freeway’s impacts on students.  They were assured that their concerns would be addressed, but . . . the agency went ahead with its plans exactly as announced, making no allowances for the school district’s concerns.

It did not take long for Portlanders to see the negative consequences of imposing massive interstate highways on a functional cityscape. On June 23, 1961, the Portland City Council and state road engineer Tom Edwards met with citizens concerned over permanent street closures caused by the partly-finished Minnesota Freeway. The route, a section of Interstate 5, sliced through the city’s Albina neighborhood.

Fifty-one streets had already been dead-ended to make way for the new depressed highway in the city’s only predominantly African American neighborhood. “I think it is unfortunate that this has not come to our attention until at this late time,” Howard Cherry, a member of the Portland School Board stated. “I would like to be heard at a proper time with the council and the highway commission.” Likewise, Daniel McGoodwin of the American Institute of Architects (AIA) implored the Highway Commission to “find a less damaging solution.” Reading from a statement prepared by the AIA, McGoodwin argued that the freeway “would create a great problem for the city and disrupt long established neighborhood patterns.”

The criticisms made by a qualified architect like McGoodwin put City Commissioner William Bowes on the defensive. “We have done everything you can think of to make it as attractive as possible,” he said, adding, “if you can call a freeway attractive.” The most incisive critique came from local architect Howard Glazer who complained that the highway designers’ failure to consult with residents was “an example of what’s happened before and will undoubtedly happen again.” When presented with a map showing the freeway skirting the edge of the neighborhood, Glazer pointed out that the map “is a slice of the city and doesn’t show adjacent territory.” No matter how carefully they were planned, urban interstates would reduce residents’ ability to quickly get groceries, visit friends, go to school, or attend church. At the meeting’s conclusion, state engineer Edwards assured those in attendance that “every attempt will be made to solve these problems.” The freeway opened to traffic in December 1963.  No changes were made to the route.

Today’s experience was clearly foreshadowed in the original debate over the freeway.  Some of the most important–and to today’s ears, familiar–arguments were featured in a June 24, 1961 story published in the Portland Oregonian (from which Fackler’s account is, in part, drawn).

It’s clear that the highway department was simply announcing what it was going to do, with no real thought given to modifying its plans. Attorneys confirmed that the process was not to consider objections, the Portland school board had been given no notice, and was assured it would be listened to–and was then ignored. Even at the time, participants foresaw that this would happen again.

The City Attorney advised the public attending the City Council meeting that it was simply an information session, not a hearing, and that they had nothing to say about the freeway; the Oregonian relates:

City Attorney Alexander Brown said, “This is not a hearing, it’s an informational meeting. It is unfortunate that the word ‘hearing’ was used in connection with this session. The inference of a hearing is that objections may be considered.”

Dr. Howard Cherry, a member of the Portland School Board objected to the freeway’s poor planning, saying:

“I think it is unfortunate that this has not come to our attention until at this late time. And I would like to be heard at a proper time with the council and the highway commission.”

According to the reporting by the Oregonian, City Commissioner William Bowes

“assured Cherry that the council will be very glad to sit down and discuss the problem of traffic on Alberta Street with the school board.”

Presciently, one prominent citizen who testified, Howard Glazer, worried about the effects of the freeway on local street traffic, and warned that the state had regularly ignored local concerns–and would again:

“. . . an example of what has happened before, and will undoubtedly happen again.”

As today’s Portland school board members can attest, it is happening again. For history buffs, here’s a grainy image of the June 24, 1961 Portland Oregonian story.  Deja vu!

Editor’s Note:  This commentary has been revised to note that the Portland School Board has adopted the resolution calling for a full environmental impact statement.

Alexa: What is Cascadia Vision 2050?

A tech-centered vision of the future of the Pacific Northwest envisions creating a series of new urban centers 40 to 100 miles away from the region’s current largest cities—Seattle, Vancouver and Portland.

The answer to sustainability isn’t building new cities somewhere else, it’s making the urban centers we already have more inclusive, prosperous and sustainable.

By Ethan Seltzer

City Observatory is pleased to feature this guest commentary by Ethan Seltzer. Ethan Seltzer is an Emeritus Professor in the Toulan School of Urban Studies and Planning at Portland State University. He previously served as the President of the City of Portland Planning Commission and as the Land Use Supervisor for Metro, the regional government. He has lived and worked in Oregon and the Portland region since 1980 and is a contributor to City Observatory. 


“A region is an area safely larger than the last one to whose problems we found no solution.”

Jane Jacobs

The Cascadia Innovation Corridor Partnership—a project of Challenge Seattle and the Business Council of British Columbia—is the latest organization to claim Cascadia as its territory of interest and to promote interconnections between Vancouver BC, Seattle, and Portland as the solution to our problems. You can view their vision for our region at Connect Cascadia.

To their credit, the Partnership is trying to envision how we’ll welcome 5 million new residents in the I-5 corridor over the next 30 years without blowing apart the reasons that this has been a good place for people for over 14,000 years.

Any vision for Cascadia has to position itself within this landscape, its livability, and the closeness of nature that has attracted and kept so many here through economic ups and downs, even as other parts of North America have thrived. The most recent term of art for matching growth with livability and environmental quality has been sustainability, and the Partnership is proposing nothing less than the build-out of Cascadia as a “global model for sustainable growth.”

While seeking to be a model for any other place is of questionable significance for sustainability either here or elsewhere, at least aspiring to be more sustainable is an important goal. It doesn’t take a lot of digging to see that we have a bunch of work to do. We are nowhere close to meeting our carbon emission goals, and the trends are in the wrong direction. Our iconic salmon in this region continue to decline in number and in health. Forests on both sides of the border are in deplorable condition. The entire region is attempting to build itself out of its failing transportation and transit systems despite employing the same strategies that got us into this mess. And social justice, a presumed cornerstone for sustainable development, remains in the eye of the beholder.

You might expect a manifesto for the next global sustainability leader to explicitly address at least some of these in a convincing way. Unfortunately, the effort isn’t there, now or perhaps ever. Their vision for sustainability, borrowed from the World Bank, posits that sustainable cities have three central features: a growing economy, declining greenhouse gas emissions, and livability for all. The Partnership expects to become a global sustainability leader by focusing on housing and “development”, transportation, and environmental stewardship, in sum, poor proxies for even the World Bank’s definition of the term.

Their proposal is to expect little out of the three central cities other than current trends, including the growth of their suburbs, and to grow new population centers out of existing mid-sized cities located 40-100 miles from each of the three central cities. All of this would be linked by high speed rail. If only it made some sense.

Three problems with this vision

The first problem: most of what they present is not actually about Cascadia. Most of it is about how we might appear in contrast to other so-called megaregions. We never got anywhere by trying to be better versions of other places. This vision does nothing to paint a picture of a better Cascadia. Instead we get a picture of a Cascadia marginally better than other places we’d rather not be, and would rather not live.

Second problem: there is nothing bold or even new about growing outlying cities into bigger places. In 1938 Lewis Mumford was hired by the Northwest Regional Council to visit Cascadia and think about how the provision of cheap, plentiful electricity at Bonneville could change the Northwest. His report, issued in early 1939, basically concluded that existing cities should not be the location for new growth because they were already lost. Instead, the new development coming with all that power ought to go into new garden cities in the countryside.

Mumford’s prescription was inspired more out of concern over inadequate town planning than sustainability, but the “logic” is not unfamiliar. Like the Partnership’s vision, Mumford’s was equally sketchy, and both of them promise the same thing: innovative growth and development brought about by avoiding the real problems already present in real places. Whether the “hub cities” proposed by the partnership or the garden cities that Mumford famously championed, both completely ignore the dynamics of sprawl and the engines of growth and infrastructure development that have already made growth management terrifically elusive even in Cascadia.

Fundamentally, putting our chips on these new towns while writing off the existing cities might work as a real estate strategy, but there is no way that our northern temperate rainforest will become a sustainability leader if we leave out the cities. Any convincing tale of sustainability leadership has to be about the whole thing, not about the favored locations.

Third problem: This is ultimately an infrastructure vision, not a vision for sustainability. The central goal here seems to be the development of some sort of high speed rail. But it would operate over very short distances, and is likely to do little to get people out of their cars. Where is real road pricing? What will they do to ensure that relatively short commutes in and out of hub cities will not be accompanied by even worse carbon emissions performance in the future? Back in the 30s, Mumford and Benton MacKaye came up with the notion of the “townless highway” as a means for coping with the already evident problem of car-based sprawl. It never came to pass, and instead we’ve got the highways and sprawling housing we know only too well.

What might be an authentic vision of Cascadia?

There are other weaknesses here, but instead of focusing on the inadequacies of this vision, I’d rather offer some building blocks for a real vision for Cascadia. To begin with, for business organizations to not focus on what would make their own operations more sustainable first seems like a giant omission.

What can the members of these organizations making up the Partnership do to reduce employee commutes? Certainly the pandemic has given us all some ideas. And what can they do to even make their own operations more sustainable? Where are they located right now, and what are the implications for both equity and sustainability of creating isolated corporate campuses? What do they sell, and how do we, as a region, take into account their business practices and product lines when assessing our supposed sustainability? How is their political and economic power helping to make Cascadia a better place, a healthier place for everyone to live? In essence, “Physician, heal thyself.” That could be innovative, and impressive.

Next, what are Cascadia’s weakest links when it comes to being the most sustainable place it can be? Where do we get our water from? Energy? What can we do to make both of those more sustainable now? How do we use our land and other natural resources? What condition are they in? Where and when are we making decisions today that foreclose future generations from inhabiting this place in their own time as we have been able to do today?

More to the point, what have we missed by largely dismissing the lessons learned by 14,000 years of indigenous inhabitation in this landscape? What aren’t we doing that we should be doing? While we’re on that topic, what will this vision do, specifically, to lift up the Indian nations in our midst? How will we embrace our history here, all of it, in a more sustainable version of what we’ve created? Whose voices need to be central in any “vision” for a sustainable Cascadia?

And speaking of the landscape, the rural parts of Cascadia are conspicuously absent in the Partnership’s vision. What, in fact, are the right boundaries for Cascadia? Is I-5 even a relevant unit for understanding the geography of our livability here? The fact that this partnership’s vision is defined by a freeway corridor speaks volumes all by itself. Interestingly, the Partnership has a subcommittee on “sustainable agriculture”. Where is their voice in this work?

Will moving significant population closer to the farms and forests make farms and forests more sustainable in the future? Perhaps if you own a lot of forest land that you hope to one day convert to ranchettes. But probably not if primary producers, already operating on thin margins, must compete for land and the basic infrastructure of farming and forestry with the tech wizards to be served by this plan.

Rather than building a region for the next 5 million, why not build it for the folks already here? If the vision really is sustainable, the next 5 million will be able to do just fine. Keep in mind that people are amazingly adaptive. How are folks already adapting to the presumed inadequacies of our transportation systems and our housing supply? What can we do to support what they’re doing to live a good life now? What can we do to increase and improve the stores of human capital that every Cascadian has and deserves? Where is education and training in the Partnership’s vision? What would Cascadia be doing if it had, as its primary goal, to ensure that all residents can find a satisfying place in the economy growing here and globally? What can and should the visionaries of today learn from the people and communities around them?

One of the central tenets of sustainable planning is that those with the least choices ought to have more and better ones. Getting well-off households to work more quickly isn’t our biggest problem. It’s getting those with the least a better shot at thriving along with the rest of us. Consider a metaphor: the Oregon Beach Bill has declared that every inch of the Oregon Coast ought to be accessible to all of us. What would the equivalent be for Cascadia? What have we, to date, made inaccessible to too many that we need to open up today?

Sustainability has to be about more than technology and shorter, faster commutes

Sadly, the default in exercises like this one seems to be a vision for faster commutes. What would a 21st century education and training system look like? What do we need to do to make sure that it spans all the borders within Cascadia? What if people sought to come to Cascadia not to get rich but to get smarter throughout their working lives?

Sustainability is not just a vision for urban form and function. Similarly, Cascadia is not just a setting for meeting the needs of ascendant corporate sectors. The Partnership wants to teach the world a lesson about sustainability. We can do that if we focus on creating a place where we can all be human in the best ways possible. That would be innovative. That would attract the attention of the world. That should attract the interest of us all.

Want more housing? Build a landlord.

If we’re going to have a lot more missing middle housing, we’re also going to have a lot more landlords

Accessory dwellings, duplexes, triplexes and fourplexes are suited to “mom-and-pop” landlords, but tough tenants’ rights requirements may discourage many homeowners from creating more housing.

By Ethan Seltzer

City Observatory is pleased to feature this guest commentary by Ethan Seltzer. Ethan Seltzer is an Emeritus Professor in the Toulan School of Urban Studies and Planning at Portland State University. He previously served as the President of the City of Portland Planning Commission and as the Land Use Supervisor for Metro, the regional government. He has lived and worked in Oregon and the Portland region since 1980 and is a contributor to City Observatory. For the past 8 years he has been the co-owner of one rental unit in the City of Portland that has been in the family for 11 years.

Portland, like a few other cities and states in the US, is rethinking its residential zoning. Single household zoning occupies about 40% of the land in Portland, and the lion’s share of land zoned for residential use. Today, Portland is growing rapidly, but the new housing to accommodate that growth is largely being built outside of the single household zones.

Sure, Portland has had zoning to allow Accessory Dwelling Units (ADUs) on the books since 1980, and has long allowed, by right, duplexes on corner lots in single household districts. However, those housing types have amounted to a mere trickle of new housing capacity in the last 40 years, particularly when compared with the rate of population growth and its associated growth in the demand for new housing.

A fourplex in Northeast Portland.

In recognition of the role that every residential zoning district and every neighborhood has to play in meeting the needs of Portland residents for suitable, affordable housing, the City embarked on its “Residential Infill Project.” The Residential Infill Project, or “RIP,” was created to devise new rules for adding missing “middle” housing types to existing neighborhoods:

By updating the rules that govern the types of housing allowed in our neighborhoods, we have an opportunity to accomplish two main goals: 1) Expand housing choices in residential neighborhoods to help ensure a more inclusive and diverse community. 2) Limit the size of new buildings to bring them more in line with existing homes. (https://www.portlandoregon.gov/bps/article/738843, page two)

Far from a wholesale demolition order for existing neighborhoods, the Residential Infill Project envisions an orderly and incremental addition of new housing types to existing neighborhoods, including the retention and modification of existing structures to contain more than a single dwelling unit.

However, missing completely from the RIP is the fact that the creation of new “middle” housing units brings with it the creation of a vast number of new landlords. Phrased another way, the success of Portland’s Residential Infill Project rests entirely on the emergence of a large number of individual land and homeowners willing to become landlords. Build an ADU, and you are now both a homeowner and landlord. Convert a single household dwelling into a duplex or triplex, and you’ve become a landlord.

Because of the incremental nature of the changes, and the relatively small scale of the infill, economies of scale are hard to come by. Certainly someone will figure out a business plan to take advantage of this new development capacity at scale. For the most part though, it is dependent on individuals willing to assume the risk of being a landlord. In short, meeting the goals of the Residential Infill Project to expand housing choices in all neighborhoods and for all households, and to keep new development small in scale, requires a new cadre of committed landlords.

Interestingly, just as the City and the State of Oregon are taking steps to require “middle” housing development opportunities in all single household zones, they are also taking steps to enact new tenants’ rights laws at the behest of the same coalitions advocating for the zoning changes. In Oregon a desire to sell a unit is no longer a legal reason to ask a tenant to leave. In fact, tenants now have a right to stay until the owner has a signed sale agreement with a new buyer. In Portland, no-cause evictions are, for the most part, illegal.

If you are a large landlord, this is simply adding to the cost of doing business. Leases will change, lawyers will be involved, and rents will rise to incorporate and cover these new costs of being a landlord.

However, for small landlords, owners of single rental units, or those contemplating using their existing properties more intensively, these new controls on the real excesses of corporate housing rental companies will also apply, for the most part, to them. This means that the uncertainty associated with owning rental property has just shot up for those considering becoming a part of the “middle” housing movement in Portland and Oregon.

The reaction on the part of small, prospective “middle” housing landlords could take several forms. They may decide to skip it entirely. Or, they might make affordable rents much less affordable to cover the risk. Or they might not put the units they create on the market, renting instead through AirBNB or simply through word of mouth to friends and family.

In all cases, the lack of attention to the needs of small landlords in both the tenants’ rights acts and the Residential Infill Project does not bode well for encouraging a large new group to become the landlords that Portland needs. In short, we’re on a path to become a city of impersonal, large, corporate landlords when what we’ve been working towards, apparently, is just the opposite.

Of course, it doesn’t need to end up this way. Portland could take steps to encourage and incentivize small landlords. It could recognize that there is a world of difference in capacity between those owning 4 or fewer units and those building 50 units at a crack. It could do more to bridge the gap between small landlords and tenants, creating programs to help them find each other and to negotiate mutually fair and supportive leasing agreements. It could even be so bold as to create nonprofit, neighborhood-based housing management associations to make it easier, simpler, and more transparent for investors and tenants to find and hold places in our City.

Unfortunately, the City is pursuing none of these, or to my knowledge, any others. To date, the City is seeking “middle” housing at the same time that it’s villainizing the landlords it’ll need to be successful. Tenants’ rights are important and needed. But more landlords are needed now. Landlords and tenants need each other, and simply changing the zoning is, at best, less than half a step in the right direction.

Climate crisis: Cities are the solution

A new report shows how cities are central to any strategy to fight climate change

Cities have the “3 C’s: Clean, compact, connected

National government policies need to support cities

Let’s describe a low carbon future in positive, aspirational terms

Will the future be brighter or darker than today? That’s a central question in the climate debates. In an effort to focus attention on the severity of the challenge, much of the discussion is inherently pessimistic. Advocates on one side talk about the dire consequences of inaction; those on the other side emphasize (and in our view exaggerate) the difficulties of adjusting to lower carbon living. We’re pleased to see a new report–Climate Emergency, Urban Opportunity–from the Coalition for an Urban Transition, which paints the climate challenge in a fundamentally more optimistic way, with better cities at its center.

For all of the apocalyptic and “eat your peas” rhetoric that surrounds discussions of how to tackle the climate crisis, this report takes the important step of framing the role cities can play in positive, concrete terms. Great green cities are (and will be) wonderful, enjoyable places to live. We’ll be closer to the things we want; we’ll spend less time and money traveling and stuck in traffic; and we’ll have more options for how to get around. (Just as with the Sightline Institute’s guide to the rhetoric of housing affordability, how we talk about things matters.)

A report released late last year is a guidepost to making this kind of positive case. Entitled Climate Emergency, Urban Opportunity, the report spells out how cities that are clean, compact and connected will not only have lower carbon emissions, but will be better, more enjoyable places to live, work and play. The core of the report is its “3 Cs” framework, and the narrative explains how each of the elements is mutually reinforcing, and how, together, they form a coherent vision of the role that cities can play in making our world more just, sustainable and equitable.

As we’ve long noted at City Observatory, the market is already moving in this direction. People, especially young adults, are increasingly moving to cities — and if that’s producing affordability problems, it’s a sign that we aren’t building great urban spaces fast enough to meet the growing demand. We’ve also noted the rising premium that housing in walkable neighborhoods commands; more evidence that there’s a market demand, and a shortage, of places that are compact and connected.

It’s important to paint a picture of how low-carbon urban living can enrich our daily lives, and “Climate Emergency, Urban Opportunity” does just that.  Instead of sprawling, low density car-dependent development, we could have cities that offer more.

Reversing this trend by pursuing more compact urban development could deliver better living standards and more vibrant cities. People could enjoy easier access to jobs, services and amenities. Public services could be cheaper, as they could be delivered more efficiently. More time in shared spaces could help to connect people across class and cultural lines. Higher densities could support a greater variety of shops, restaurants and public spaces within neighbourhoods.

The report does a nice job of using graphics and examples to illustrate the livability benefits of low carbon cities. Around the world, cities like Stockholm, Windhoek and Seoul are building the kind of urban neighborhoods that exemplify this inclusive, interesting and low carbon ideal.

Building great cities requires national government support

It’s become fashionable to tout “the New Localism“–the idea that, in the face of national government intransigence about (or outright denial of) the need to tackle climate change, Mayors and cities can tackle this big global problem. While some cities are making strides, ultimately, achieving this vision of compact, clean, connected cities will require the full support of national governments. National policies on transportation, housing, and energy all set the context for local efforts to promote urban development, and for too long they have penalized urbanism and subsidized decentralization and sprawl. For example, the report specifically endorses carbon pricing (something only national governments can accomplish) and the reallocation of national transportation budgets from road building to transit and active transportation. National governments also need to empower cities and local governments to implement road pricing. This report is valuable because it makes a strong case that strengthening cities is essential to achieving important national objectives, starting with climate, but including economic and social progress.

Too often, climate change is framed in apocalyptic terms. But it’s better to view it as an opportunity to fix many of the things we got wrong when we built human habitation around an expensive, unsustainable and in many ways anti-social car-dependent system. Our best days are ahead of us, if we only have the imagination to build the world we want to live in.

Coalition for an Urban Transition, Climate Emergency, Urban Opportunity: How national governments can secure economic prosperity and avert climate catastrophe by transforming cities,  September 19, 2019.

Here’s what climate change denial looks like

Pretending that climate change can be solved by widening roads to keep cars from idling in traffic is dishonest and reprehensible, yet that’s exactly what Portland’s regional government is doing.

A new poll in Portland is promoting the discredited myth that cars idling in traffic congestion are a principal cause of climate change.

Portland’s regional government, Metro, is putting together the outlines of a $3 billion transportation funding measure, and hired a survey research firm to conduct a poll, supposedly to assess public attitudes about transportation. But in reality, the poll isn’t so much about assessing attitudes as it is trying to develop, refine and market a false and misleading message about the causes of climate change and what it will take to solve them. The poll repeatedly makes the claim that reducing traffic congestion will lower carbon emissions by reducing idling.

What this amounts to is publicly funded messaging to support climate change denial. It’s cloaking arguments for more road construction with a big lie about the role of traffic idling on carbon emissions, and a disproved theory that wider roads will reduce congestion.

The debunked idling myth

One of the favorite myths of highway advocates is the notion that an important cause of climate change is the carbon emissions from cars idling in traffic. If we could just widen roads so that cars never had to idle (or, ideally, even slow down), we’d clearly lessen greenhouse gas emissions, right?

As we’ve reported at City Observatory, that myth has been repeatedly debunked by careful scientific inquiry. Widening roads causes the well-known problem of induced demand–greater road capacity produces even more driving, and higher levels of carbon emissions. Induced demand means that highway widening projects don’t reduce traffic congestion, and therefore the imagined emission reductions from less idling never occur. The definitive evidence on this point comes from Metro’s own backyard, from Alex Bigazzi and Miguel Figliozzi, two transportation researchers at Portland State University. Their research shows that savings in emissions from idling can be more than offset by increased driving prompted by lower levels of congestion. When you reduce congestion, by whatever means, people drive more, and the “more driving” more than cancels out any savings from reduced idling. As they conclude:

Induced or suppressed travel demand . . . are critical considerations when assessing the emissions effects of capacity-based congestion mitigation strategies. Capacity expansions that reduce marginal emissions rates by increasing travel speeds are likely to increase total emissions in the long run through induced demand.

And it’s a problem that grows steadily worse over time–a more car-dependent transportation system induces the additional decentralization of jobs, stores and housing, leading to longer trips. The variation in emissions across US metro areas, for example, has everything to do with sprawl and car dependence, and nothing whatsoever to do with variations in time spent idling in traffic.
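The arithmetic behind this point is simple enough to sketch. The numbers below are purely illustrative assumptions (not figures from the Bigazzi and Figliozzi study), but they show how even a meaningful cut in the per-mile emissions rate can be swamped by a modest induced increase in driving:

```python
# Back-of-the-envelope model: total emissions = miles driven x emissions per mile.
# All numbers are illustrative assumptions, not results from any study.

def total_emissions(vmt, grams_per_mile):
    """Total CO2 in grams for a corridor: vehicle miles traveled x per-mile rate."""
    return vmt * grams_per_mile

# Before widening: congested, stop-and-go traffic with a high per-mile rate.
before = total_emissions(vmt=1_000_000, grams_per_mile=450)

# After widening: smoother flow cuts the per-mile rate by 10 percent, but
# induced demand adds 15 percent more driving over the long run.
after = total_emissions(vmt=1_150_000, grams_per_mile=405)

print(f"before: {before:,} g, after: {after:,} g, emissions rise: {after > before}")
```

Under these assumed numbers, total emissions rise from 450 million grams to about 466 million grams: the idling “savings” are more than cancelled out, which is exactly the dynamic the researchers describe.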

It’s therefore simply dishonest to claim that wider roads and greater traffic carrying capacity will lead to greenhouse gas reductions. It’s hardly surprising that die-hard driving advocates would populate social media with this lie. What’s appalling is when planning professionals propagate this myth.

Yet that’s exactly what Portland’s regional government, Metro, has done in a new poll it fielded in support of a proposed tax levy to support transportation. While it’s nominally pitched as neutral opinion research, it’s anything but. It’s more of a push poll, which systematically repeats the discredited claim that idling is a big contributor to climate pollution. This thinly disguised “framing” would be bad enough if it were limited only to those who were surveyed, but it’s being widely disseminated by Metro as proof of public attitudes, and now it’s driving media coverage.

Willamette Week summarized the poll results as showing Portland residents don’t care much about climate, and want to widen roads.

 

Willamette Week’s reporting on the poll results perfectly delivers the message that its marketing mavens crafted:  We want to widen roads as a way to reduce greenhouse gas emissions.

Here’s the way the project’s survey research consultants, FM3, summarized their findings in a presentation for Metro.

 

Do you favor or oppose “improvements”?

The question is based on an utterly false premise: that reducing congestion by expanding capacity will result in lower emissions. In addition, the choice of options is deeply biased: one option is framed as “balanced” while the other is limited to “only” transit, biking and walking. One alternative is framed as providing “roadway improvements” while the other specifically rules out support for “improvements.” (Characterizing one alternative as an “improvement” and the other as ruling out an “improvement” is the very definition of question bias.) As Charles Denison pointed out on Twitter, the survey’s wording is an absurdly loaded choice.

Survey respondents were treated to a steady diet of repetition of this phony point in poll questions clearly designed to help fashion just the right rhetorical flourish to persuade voters.  Here are a couple examples of statements that respondents were presented (emphasis added):

In order to address the growing challenge of climate change and reduce carbon pollution, this measure would improve public transit to make it easier for people to choose not to drive; promote more use of zero-emission vehicles for transit; and reduce traffic and gridlock to cut back on carbon pollution from idling cars

A measure that protects clean air and reduces the pollution caused by idling cars and trucks, particularly along routes where people live and children go to school. Investments would help increase the availability of transportation options that preserve air quality and reduce the risk of lung disease and asthma.

It’s plain that what Metro is doing is market-testing the slogans, talking points and 30-second pitches that will be used to sell their bond measure. So what we’re seeing is just an advanced peek at a calculated plan designed to present this message about “widening roads to save the climate” as a way of convincing people to vote for a tax measure. And as Willamette Week’s coverage makes clear, this survey will serve nicely to propagate this myth to a wider audience.

What makes this especially bankrupt is that responsible analysts know that idling is not a major source of pollution or carbon emissions. Metro’s 530-page Regional Transportation Plan, which describes the region’s strategies for transportation and for meeting greenhouse gas emission reduction goals, contains no estimates of carbon emissions attributable to idling, no strategies for reducing idling as a means to lower greenhouse gases, and in fact mentions the word “idling” only once.

This kind of deceptive practice would come as no surprise if we were talking about marketing a fat- or sugar-laden snack as a healthy alternative. But in the face of an existential global crisis, the regional government is propagating a discredited myth to justify what are likely to be counterproductive investments in road widening. The first duty that any official owes the public is to tell the truth: this polling is a dereliction of that duty and a subversion of the kind of frank and honest discussion we need to have if we’re going to tackle the climate crisis.

Bartik: The verdict on business tax incentives

Political rationalizations and exceptionalism will always be used to justify giveaway policies

With the possible exception of Greg LeRoy (who tracks state and local incentives for Good Jobs First) and Amazon’s site location department, there’s no one in the nation who knows more about business incentives and their effectiveness than the Upjohn Institute’s Tim Bartik. Tim has written a bevy of careful academic studies of the impact of a range of tax incentives, most of which are a bit technical for a general audience. Thankfully, he’s distilled the key learnings into a thorough and clearly written non-technical summary of what the research shows. This new book–Making Sense of Incentives: Taming Business Incentives to Promote Prosperity, available in print and also as a (free) downloadable PDF–is must reading for anyone involved in economic development.

Economic incentives are big business. Bartik estimates that state and local governments hand out roughly $50 billion in incentives annually.

While there’s considerable folk wisdom about their efficacy, Bartik’s book takes a hard look at whether they actually work.

The “But For” problem

There are a couple of aphorisms that neatly summarize the problems and challenges of assessing economic development incentives.

“I know that only half of all advertising works, I just don’t know which half.”

“Shoot everything that flies, claim everything that falls.”

The gnarliest question in evaluating economic development incentives is figuring out whether they actually made any difference to where a firm ended up locating. Economic development deal-makers will swear up-and-down (and may even believe) that the last dollar of tax incentives they provided “sealed the deal” and that without it the company would have gone elsewhere. Critics note that the abundance of incentives and the obsequiousness of economic development officials have produced a system of cash prizes for bad corporate behavior, which rewards companies for doing exactly what they would have done anyhow, provided they simply engage in the appropriate corporate kabuki of pretending to look seriously at multiple locations. Here’s where Bartik’s research shines: he’s used a range of economic techniques to assess whether and how much incentives actually matter to the location of business investment. His conclusion: roughly three-quarters of all incentives don’t matter; only about a quarter of the time do they tip the balance. Bartik explains:

For a new facility location or expansion decision, this means that the incentive tipped the location decision toward this state only 25 percent of the time or less. The other 75 percent of the time, the firm would have made the same new facility location decision, or same expansion decision, even if no incentive had been provided.
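To see what that tipping rate implies at scale, it helps to do the arithmetic explicitly. The sketch below combines Bartik’s roughly $50 billion annual incentive estimate with his 25 percent tipping rate; the calculation itself is just illustrative bookkeeping, not a result from the book:

```python
# Rough bookkeeping on Bartik's estimates: ~$50B/year in incentives, of which
# only ~25 percent tip a location decision.  The split below is illustrative
# arithmetic, not a figure taken from the book itself.
ANNUAL_INCENTIVES = 50e9  # dollars per year, Bartik's estimate
TIP_RATE = 0.25           # share of decisions actually tipped by the incentive

decisive = ANNUAL_INCENTIVES * TIP_RATE        # spending that changed an outcome
windfall = ANNUAL_INCENTIVES * (1 - TIP_RATE)  # giveaway: firm would have come anyway

# Total spending required per dollar of incentive that actually mattered:
cost_per_effective_dollar = ANNUAL_INCENTIVES / decisive

print(f"decisive: ${decisive/1e9:.1f}B, windfall: ${windfall/1e9:.1f}B")
print(f"every effective dollar costs ${cost_per_effective_dollar:.0f} of total spending")
```

On these numbers, roughly $37.5 billion a year is pure windfall, and each dollar of incentive that actually tips a decision costs four dollars of total spending; that ratio is why the “but for” question matters so much.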

You’d think that would chasten economic development officials–and their bosses, Mayors and Governors. But it doesn’t. While Bartik is almost certainly right in the aggregate (three-quarters of incentives are wasted), it’s nearly impossible to tell whether any individual firm’s tax break was the “deal clincher” or simply a giveaway. Economic development officials seem to always believe (and nearly always say) that their incentive program was essential to the result. Corporate beneficiaries also have strong incentives to play along (to make their government partners feel justified in making the deal).

What this means in practice is that it’s rare for anyone to question whether a tax break or subsidy was actually the decisive factor in tipping a location decision. Economic development officials and the companies that get the tax break always say that it was a driver in their deal. Confronted with studies like Bartik’s, the seasoned economic development practitioner may even concede that other people give away the store, but insist that our agency (or this deal) is one of the 25 percent where it made a difference.

Two bits of evidence from the field:  Many economic development deals are composed of multiple incentives:  state tax breaks, local tax breaks, subsidies for land development or job training, road or sewer or water line extensions and so on.  It’s common for each of these programs to claim credit for all of the new jobs and investment associated with their particular incentive.  Shoot everything that flies, claim everything that falls.  Some decades ago, when I worked for the State of Oregon, I asked a state manager why her program had extended a subsidy to a firm after it had already announced its investment:  the manager explained that it was a sure-fire win; her program would be able to claim credit for job creation, and that would make its benefits look even larger.

Politics trumps economics

Most of Bartik’s book is informed by careful econometric research that tests whether incentives seem to have any correlation with increases in business investment (they mostly don’t), whether they actually tip business decisions toward particular locations (about three-quarters of the time they’re irrelevant) and whether they generate more benefits that costs (again, usually not).

If Governors and Mayors were cool, fact-driven analysts, Bartik’s work (and that of other scholars) would be game, set and match. But as Bartik acknowledges, a primary motivation for offering incentives is political. Elected leaders want to be perceived as doing something to benefit the local economy; whether it works or not is a question likely to interest only scholars like Bartik. When it comes to economic development, Governors and Mayors are involved in the production of “symbolic goods.” The more drama and publicity (as exemplified by the media coverage of Amazon’s carefully staged “HQ2” competition), the more opportunity for these elected leaders to show they “care” about voters’ economic concerns. Bartik understands this dynamic:

A political reason for incentives is that they are popular. Targeting the creation of particular identified jobs—which is what incentives do—is rewarded by voters. Voters are more likely to vote for politicians who offer incentives, even if the incentives are unsuccessful. Voters appreciate well-publicized efforts to attract jobs. If a governor or mayor can go after a prominent large corporation with an incentive offer, why not? At least he is trying; he cares. Better yet, the incentives may be long-term, paid for by the next governor or mayor. Political benefits now, budget costs later.

Some years back, I wrote an analysis of state and local efforts to attract and build biotechnology industry clusters. One thing that I observed was that, in addition to the chances of success being extremely remote for any city, it was certain that it would take a decade or more for any imaginable biotechnology development strategy to bear fruit. Since that would be well beyond the political lifetime of any incumbent Mayor or Governor, it seemed to me at first glance that biotech would be an unattractive strategy, because you would never be in office to get credit for success. Yet nearly every state and large city was actively pursuing biotech. Then it dawned on me:  if success was at least a decade away, my city or state’s biotech efforts could never be judged a failure while I was in office. The critical takeaway:  it is the political calculus that drives business incentives, not any consideration of whether they actually work in practice or not.

This observation is confirmed by the extraordinary paucity of practitioner-driven evaluation. Few economic development departments anywhere engage in systematic monitoring of the results of their efforts. They’re very good at creating an echo chamber for announcements of new investment, but seldom, if ever, talk about whether job creation promises are met. Instead, the only solid, objective evidence on what works and what doesn’t comes from scholars like Bartik, which is a key reason why this book is so important.

Concrete Advice

We’ll almost certainly continue to have business incentives. Most of them will continue to be ineffective. One of the most valuable pieces of Bartik’s book is four simple points about how to structure incentives to maximize their economic benefits and decrease their costs. They are:

  1. Focus incentives on traded sector industries (i.e. businesses that sell goods and services in national and international markets), rather than subsidizing local firms who compete largely against other local firms. Target these incentives for distressed areas.
  2. Emphasize services over incentives. Spending money on services that make businesses more competitive, like employee training, does more to produce benefits.
  3. Avoid big long-term deals; they’re generally bad deals for local governments, and “out-year” incentives actually have little effect on current business decisions.
  4. Pay for incentives by raising overall business taxes rather than cutting public services.

Bartik’s work comes as close to definitive answers as we are likely to get in the game of business incentives. He gives a coherent analysis of the “but for” problem, and demonstrates that most of what is spent on economic development has little impact on patterns of investment. Unfortunately, the compelling political calculus for offering incentives continues to trump analytics. The book clearly delivers on the first half of its title (making sense of incentives). Accomplishing the second half (taming them) will continue to be a difficult task, even with this good advice.

Timothy J. Bartik, Making sense of incentives: Taming business incentives to promote prosperity, Upjohn Institute, WE Series, October 2019 (https://research.upjohn.org/up_press/258/)

Does walkability promote economic mobility?

A new study shows a tantalizing connection between more walkable places and intergenerational economic mobility

City Observatory readers will be familiar with the findings of Raj Chetty and his colleagues in the Equality of Opportunity Project. In a revelatory use of big data, they used anonymized tax records to track the lifetime earnings of kids growing up across the United States.  The results show that the neighborhood and metro area one grows up in has a huge impact on one’s lifetime earning prospects. Strikingly, kids from low income families fare better growing up in some neighborhoods, especially those with mixed incomes and lower levels of racial segregation. The Equality of Opportunity Project’s research casts doubt on some traditional economic development nostrums (having more jobs nearby or faster job growth doesn’t seem to be associated with intergenerational economic mobility), and puts more emphasis on a wide range of neighborhood effects (intergenerational mobility is positively correlated with the number of employed adults in a neighborhood and with measures of social capital), suggesting that successful mixed income neighborhoods play a key role in helping kids escape poverty.

To shift gears a bit: if you read City Observatory regularly, you’re also familiar with Walk Score, the ubiquitous, easy-to-interpret index of walkability. We and other scholars have used Walk Score to show that walkability commands a premium in real estate markets: all other factors held constant, more walkable homes tend to be much more valuable than homes that are more car-dependent.  A growing body of public health research also confirms that walkability has significant health benefits:  people who live in more walkable areas walk more and are healthier.

Now, let’s combine these two threads:  the geography of intergenerational economic mobility and walkability.  A new paper from psychologists Shigehiro Oishi, Minkyung Koo and Nicholas Buttrick does just that, looking at the correlation between Walk Score and intergenerational economic mobility.

Oishi, Koo and Buttrick aggregate data from the Equality of Opportunity Project’s estimates of intergenerational mobility for commuting zones and compare it to the overall level of walkability in each area.  “Commuting zones” are geographical units that usually encompass an entire metro area and surrounding rural areas; the US is divided into 377 such zones.

The study finds that, after controlling for the “big five” factors that Chetty et al.’s research showed tended to influence intergenerational economic mobility (percentage of African Americans, degree of income inequality, quality of K–12 education, social capital, and percentage of children with single mothers), a commuting zone’s Walk Score has a statistically significant, positive effect on intergenerational economic mobility.  Including walkability in a model of intergenerational economic mobility explained an additional 10 percent of the variation in mobility rates among commuting zones. As the authors conclude:

Using tax data from almost nine million Americans born between 1980 and 1982, Study 1 demonstrates that upward social mobility is substantially higher in more walkable areas (r = .390). The more walkable an area is (as indexed by Walkscore.com), the more likely Americans whose parents were in the lowest income quintile are to have reached the highest income quintile by their 30s. This relationship holds above and beyond factors previously used to explain upward mobility, factors such as income inequality and social capital, and is robust to various political, economic, and demographic controls; to alternate specifications of upward mobility; and to potentially unspecified third variables.

This is encouraging evidence:  It suggests that regions that are more walkable tend to have better opportunities for kids from low income families to make connections.  (It may also be that walkability is correlated with other factors, like social capital, that maximize opportunities.) Oishi, Koo and Buttrick explore these possible explanations by looking at other data on commuting and car ownership and on walkability and a self-reported sense of belonging.  Their data show that in more walkable areas, economic success is less correlated with car ownership (suggesting one way that economic mobility is enhanced by greater walkability). The findings on a sense of belonging and intergenerational mobility are less clear; the relationship, if anything, is indirect:

 . . . although the direct association between walkability and upward social mobility was not significant, those living in a walkable neighborhood and those who walked more in their everyday lives felt a greater sense of belonging, which was in turn associated with upward social mobility.

The data support the notion that walkability is correlated with intergenerational economic mobility. One limitation of the headline analysis of the Chetty et al. data is that they are aggregated at the level of commuting zones. We know that both walkability and intergenerational economic mobility vary substantially within metropolitan areas; different neighborhoods within a metro area can be either car-dependent or highly walkable.  The next logical step in this research is to see whether this same correlation holds for smaller areas.
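The “additional 10 percent of variation” result is simply a comparison of model fit (R²) with and without the walkability regressor. A minimal sketch of that comparison on synthetic data (all numbers, coefficients, and the data itself are invented for illustration; they are not the paper’s):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "commuting zones": five baseline covariates plus walkability.
n = 377                                  # number of commuting zones in the post
X_base = rng.normal(size=(n, 5))         # stand-ins for the "big five" controls
walk = rng.normal(size=n)                # stand-in for a standardized Walk Score
mobility = (X_base @ np.array([0.4, -0.3, 0.5, 0.3, -0.2])
            + 0.5 * walk + rng.normal(scale=1.0, size=n))

def r_squared(X, y):
    """R^2 from an OLS fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared(X_base, mobility)
r2_full = r_squared(np.column_stack([X_base, walk]), mobility)
print(f"R^2 without walkability: {r2_base:.3f}")
print(f"R^2 with walkability:    {r2_full:.3f}")
print(f"Additional variance explained: {r2_full - r2_base:.3f}")
```

The gap between the two R² values is the “additional variance explained” by walkability, which is the quantity the study reports.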

Oishi, S., Koo, M., & Buttrick, N. R. (2018, December 17). The Socioecological Psychology of Upward Social Mobility. American Psychologist. Advance online publication. http://dx.doi.org/10.1037/amp0000422.  Full text available at:

https://s3.amazonaws.com/academia.edu.documents/58104496/Oishi__Koo__Buttrick_2018_American_Psychologist.pdf

Hat tip to Marginal Revolution’s Tyler Cowen for flagging this research.

Note:  This post has been updated to correct editing errors. (October 22)

Reduced demand: Tolling or restricting cars reduces traffic

We have urban traffic congestion because we heavily subsidize people driving in cities.

Reducing subsidies and lowering road capacity reduces traffic and congestion.

Why are we building highway capacity for users who won’t pay even 10 percent of its costs?

By now, we all know about “induced demand,” the idea that when we build more road capacity (ostensibly to reduce congestion) we simply prompt more and more people to drive. Well, it turns out the reverse is also true:  If we price or take away capacity, by closing streets to some vehicles or imposing tolls, we actually reduce the level of traffic in an area. Time and again, real world experience shows slashing or pricing capacity reduces urban congestion.

New York’s Miracle on 14th Street

Case in point:  The Miracle on 14th Street. A week ago, New York City effectively banned through-car traffic on 14th Street in Manhattan between Third and Eighth Avenues. This miracle comes in two parts: Part one is the much improved bus speeds on 14th Street; once the city’s slowest bus line, the M14 is now racing ahead of schedule, with the added benefit that, freed from cars, 14th Street is now a far more pleasant place for people.

Buses are zooming on 14th Street in Manhattan (Streetsblog)

But the second, and in many ways less obvious part of this miracle is on the adjacent 13th and 15th Streets. If you assume that traffic is a kind of incompressible liquid, you’d expect that banning cars on one street would automatically produce gridlock on adjacent streets.  That actually hasn’t turned out to be the case. The adjacent streets are, if anything, less trafficked than before, as reported by the West 13th Street Alliance:

Traffic didn’t divert to nearby West 13th Street (Twitter: @W13StAlliance)

Seattle’s Alaskan Way: Carmageddon Avoided

City Observatory readers will remember the predictions of “Carmageddon” made for downtown Seattle when the City finally closed the since-demolished Alaskan Way Viaduct. Part of the viaduct had to be removed before traffic could use the new tunnel, with the result that, for a period of weeks, a roadway that had carried about 90,000 cars a day through downtown Seattle was simply gone, with no replacement.

Instead of gridlock, traffic levels throughout downtown Seattle were reduced. The Seattle Times reported, with a note of incredulity and amazement, that “the cars just disappeared.” By shutting down the flow of traffic from the viaduct to Seattle streets, the closure reduced the demand on those streets and enabled traffic to flow more smoothly.

Things quickly went back to “normal” once the City opened the new SR 99 tunnel to replace the viaduct’s highway lanes.

Here’s the point:  road capacity doesn’t so much relieve traffic as it creates it.  When we reduce the capacity of a system to introduce cars into a city center, we get fewer cars.

But after months of getting a free ride, tunnel users will finally be asked in early November to make a modest contribution to the cost of the project.  The $1 to $2.25 toll that tunnel users will be asked to pay will cover less than 10 percent of the $3.3 billion cost of tunnel construction and related work.

The toll, not surprisingly, is going to discourage some people from using the tunnel.  The Washington State Department of Transportation reportedly expects tunnel traffic to drop by 30 to 50 percent.

Why spend public money on capacity users don’t want?

Reflect for a moment on what that means:  Today, as long as the trip is free, about 80,000 vehicles use the new multi-billion dollar tunnel.  When asked to pay less than 10 percent of what it cost to build the tunnel, something like a third to a half of all current users say “it’s not worth it to me.”  In effect, they’re saying: I’ll only use the tunnel if somebody else pays for it. And just for the record, that $2.25 tunnel toll is less than the $2.75 bus fare that King County Metro charges its bus and light rail riders.  That, in a nutshell, is the perverse value proposition that guides so much urban transportation investment in the US: we spend billions on highway capacity that is un- or under-priced, which people use only because it is heavily subsidized, while we charge higher prices to those who use more socially and environmentally responsible options.

For the record, the same thing has happened elsewhere when highway users are asked to pay even a modest fraction of the cost of adding road capacity. As we’ve documented, in Louisville, Kentucky, tolls of just $1 each way for a new interstate bridge reduced traffic by almost 40 percent.

The big concern in Seattle again is that the cars that avoid the tolls will clog up other streets. Heather Marx of the city’s transportation department told local television station KOMO:

“During the peak time, there’s nowhere really to go, but we do expect to see more traffic on downtown streets”

But as the miracle on 14th Street in Manhattan and Seattle’s own experience earlier this year show, traffic isn’t so much diverted as it is reduced.

Maybe we should think of highways and major arterials in cities as causes of traffic rather than their solution.  What an urban freeway does, in effect, is point a firehose of car traffic at a city street system. When we reduce the flow of traffic on the freeways in and near cities, we reduce the number of cars traveling to and on city streets.

One other point:  We’re often told that we can’t do anything to reduce car traffic until we dramatically beef up alternative ways of getting around, especially transit. But the lesson here is that car traffic can change quickly, and that reducing car traffic is the fastest and most effective way to improve transit service. Getting cars out of the way makes buses move faster, making them more attractive, and reducing their cost (drivers are more productive, moving more people more miles, if the bus can go faster). No transportation policy is more equitable than one that gets buses moving faster.

What if we regulated cars like we do houses?

What if we regulated new car ownership the same way we do new housing?

A recent story about Singapore caught our eye:  In Singapore, you can’t even buy a car without a government-issued “certificate”—and the number of certificates is fixed city wide.  The government auctions a fixed number of certificates each year, and the price has risen to more than $100,000.  This means that a Toyota Camry, which costs about $30,000 in the US, would cost a Singapore buyer about six times as much (including all taxes and fees).

Not surprisingly, Singapore residents complain about automobile affordability, in pretty much exactly the same way that Americans complain about housing affordability.  That led us to think about the very different way local governments regulate houses and cars in the US.  In short, houses are highly regulated; cars aren’t.

The underlying premise of the NIMBY/YIMBY divide is the notion that individual, often neighborhood-level, permission is required before new housing can be built.  Getting a building permit is a highly regulated and often discretionary process.  Zoning and other restrictions empower neighbors to restrict the number, size and location of homes built, and whether any new housing is built at all.

The contrast between a building permit and a car license couldn’t be starker:  Issuance of car licenses is an automatic, ministerial function:  you pay a few tens or hundreds of dollars, and the state issues you a vehicle registration and license plate (the function is often effectively delegated to car dealers).  A building permit, as we all know, requires detailed review and may be subject to objections about density, shadows, community impacts and the like.  But if your neighbor decides to buy one, two or six new vehicles, there’s nothing to prevent them from doing so.  (Ironically, of course, the underlying issue behind many NIMBY concerns is parking, but instead of limiting the number of cars directly, we limit the number of homes, making housing scarce or unaffordable.)

The decision to subject housing to an intrusive, byzantine and restrictive set of regulations, while making car registrations easy, automatic and cheap, is in many ways an arbitrary one.  A world in which anyone could get a permit for a new dwelling as easily and cheaply as they get a new car registration would look very different from ours, as would a regime in which you needed to get your neighbors’ permission to buy a new car.

Environmental Impact Reports for New Car Registration?

The same could be said of environmental impact reports:  housing is frequently subjected to (and effectively blocked by) costly and litigious environmental impact report requirements, while the sale of new cars is completely exempt.


A modest proposal: An EIS for the DMV

Many states subject housing approval to environmental reporting requirements; what if we extended this same principle to car registrations?

Back in the early days of the environmental movement (the late sixties and early seventies), one lawyerly idea in vogue was the notion of requiring government policy decisions to be undertaken only after fair consideration of the environmental impacts they might have. The idea was that a lot of pollution and environmental degradation was the product of a failure to consider impacts and alternatives in advance, and that by requiring disclosure, governments would make better, greener decisions. The principle is codified in the National Environmental Policy Act, which requires Environmental Impact Statements for major federal decisions likely to influence the environment.

States like California and Washington are famous for their state-level environmental impact statement requirements, which are routinely applied to city and state development approvals for everything from new apartments to the construction of bike- and bus-lanes.

The essence of these requirements is that before undertaking a policy decision, state and local governments should evaluate the environmental consequences of their actions. California’s CEQA (the California Environmental Quality Act) has been used to block housing developments around the state. Critics say that CEQA has become “the tool of choice for preventing cities from approving high-density housing . . . with a quarter of lawsuits against CEQA-reviewed projects targeting housing.” Similarly, in Washington State, the State Environmental Policy Act (SEPA) has routinely been used to challenge higher density housing. The act was even used to object to an environmental foundation’s zero net energy building, because it didn’t include parking spaces. And while actual court victories under these laws are rare, the threat of a CEQA or SEPA lawsuit is often a powerful bargaining chip in negotiations to force developer concessions.

The premise for these environmental disclosure laws is, for the most part, a good one: government decisions ought to be undertaken with a clear understanding and careful weighing of the environmental consequences. So, if we’re going to have such policies, maybe we should consider applying them to particularly environmentally damaging activities licensed by the government.

That got us thinking:  Why not apply these same policies at the Department of Motor Vehicles?  Before the DMV issues a license for a new vehicle, it really ought to be required to consider the environmental impact of doing so. Each additional car on the road will add to road congestion, air pollution, and greenhouse gases, and will likely pose a safety risk for other road users, especially people on foot and bicycle.  For example, the Environmental Protection Agency estimates that the average car generates 4.6 tons of carbon dioxide emissions per year, so each 2-year car registration should acknowledge the impact of about 9.2 tons of CO2 emissions.  Just as cities now limit the construction of new housing when they think neighborhoods are getting overcrowded, maybe cities could set numerical limits on the issuance of new car licenses:  that would certainly attack the problem of congestion and pollution more directly than is done now.
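The arithmetic behind that figure is trivial but worth stating explicitly (the 4.6-ton annual figure is the EPA estimate cited above; the function name is ours):

```python
# EPA figure cited in the post: a typical passenger vehicle emits
# about 4.6 metric tons of CO2 per year.
ANNUAL_CO2_TONS = 4.6

def registration_emissions(years: int) -> float:
    """CO2 (metric tons) attributable to one registration period."""
    return ANNUAL_CO2_TONS * years

print(registration_emissions(2))  # a 2-year registration covers 9.2 tons
```

A hypothetical "car EIS" would presumably start from a per-registration figure like this before adding congestion and safety impacts.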

The DMV should ask you for an EIS for that car registration

A DMV granting a vehicle license is functionally no different than a city planning department granting a building permit, so in theory it seems like the environmental review laws could apply.

Setting tough requirements for vehicle registration to protect the environment isn’t unusual. Other countries take an entirely different approach to vehicle registration. In Japan, car owners are required to show that they have their own private off-street parking space in order to register a motor vehicle.

It seems to us like a modest proposal:  if impact statements make sense for houses, why not apply them to cars, which seem to be a much more serious environmental threat, in light of recent evidence about the growth in greenhouse gas emissions. The fact that we haven’t done this already in the decades of experience with environmental impact statement requirements speaks volumes about the built-in pro-automobile bias in our current legal structure (which has been cataloged in detail in a superb article by Greg Shill).

You can add this to our earlier modest proposal that we extend the Americans with Disabilities Act to freeways, and insist that state highway departments provide accessible bus service on limited access roadways–places that are now effectively unavailable to those who are too young, too infirm or too old to drive. It’s another example of how we might take some fundamental and established provision of law and apply it in a way that is more supportive of our stated social and environmental values.

In the end, it may well be outlandish to require such individual impact statements for car registrations. Alternatively, we might just do the reverse: make getting an apartment building permit as easy and automatic as getting a car registration. Just a thought.

Why economic diversification is a poor guide to local strategy

Too much economic development policy is based on a naive analogy to portfolio theory

Cities looking to strengthen their economies should concentrate on building upon and extending current specializations

One of the most widely agreed upon bits of folk wisdom in economic development is the idea of “economic diversification.” The notion is that your local economy is like a stock portfolio, and if you are invested in just a few industries, then you are more vulnerable to dislocation if those industries decline. Like a prudent investor, if you were more diversified, you’d be less likely to suffer a decline if any particular industry experienced a downturn.

Local economic development is not like buying and selling stocks.

This bit of folk wisdom got a boost a few years back from economist Ricardo Hausmann, the author of the Atlas of Economic Complexity, which assembled a clever statistical portrait of how simple or sophisticated a nation’s economy is based on its mix of industries. Hausmann argues that diversification is the key to higher incomes for cities and nations:

Larger cities are more diversified than smaller cities. Among cities with similar populations – say, Salvador and Curitiba in Brazil, or Guadalajara and Monterrey in Mexico – more diversified cities are richer than less diversified cities. They tend to grow faster and become even more diversified, not only because they have a larger internal market, but also because they are more diversified in terms of what they can sell to other cities and countries. What is true at the level of cities is even more applicable at the level of states and countries. The Netherlands, Chile, and Cameroon have a similar population size, but the Netherlands is twice as rich as Chile, which is 10 times richer than Cameroon. Looking at their exports shows that the Netherlands is three times more diversified than Chile, which is three times more diversified than Cameroon.

On its face that seems like a plausible claim, but upon closer inspection, there are five good reasons to question its accuracy and utility as policy advice.

First, Hausmann and others note that highly diversified cities (and countries) tend to be richer.  But this is typically because they are bigger and better educated, not simply because they are more diverse.  A New York can achieve scale in more industries than a Des Moines, so it’s more diverse.  Also, as we know, educational attainment is critical–something these analyses often leave out.  And the cross-country evidence here is misleading:  Chile and Cameroon are poorer than the Netherlands not because they are less diversified, but because they have much lower levels of education and are still primarily dependent on resource-based industries.

Second, economies tend to be more diverse if they are richer.  Greater income and wealth create a greater demand for services, which naturally leads to more diversity.  The third-world village makes do with rudimentary generalist care; large wealthy nations have a diverse assortment of medical professionals.  But the cause of the diversification is the wealth, not so much the other way around.

Third, it’s doubtful that this is of much practical application to local economic development policy.  The admonition to diversify your economy is based on a naive analogy to portfolio theory.  If one could change one’s economic base as easily (and cheaply) as one buys and sells stocks, it might make sense.  But it’s not clear how a city could easily change its mix of industries. Taken literally, the argument to diversify says that it would be a good thing if your biggest industries got smaller (that would make you more diverse).  But would Seattle really be better off if Amazon, Microsoft or Boeing were half the size it is today?

Fourth, the key lesson of clusters is that firms draw competitive business advantage from having other similar and related firms nearby.  By attracting talent, developing specialized suppliers, promoting intense competition, and benefiting from specialized knowledge spilling over, you get stronger, better firms and a healthier economy.  Specializations are seldom static: one specialization often provides the knowledge base for new ones. The process of economic development is often about related diversification:  being good in one technology at one time sets the stage for being good at generating the next technology at the next time.  The important thing is that this isn’t random:  it’s path-dependent.

Fifth, the really pernicious thing about simple-minded diversification thinking is that it leads to the fad-of-the-month-club style of economic development.  Yes!  We can be the next big biotech hub, and that will diversify our economy!  And it leads people to neglect, ignore, or simply take for granted their existing strengths, which are likely to be much more plausible sources of future economic growth. Are you more likely to succeed in a field you know something about, or in a field in which you know nothing?

While there may be advantages to having a more diverse economy and avoiding excessive reliance on one or a few sectors, it’s far from clear that a city has much leverage to change its portfolio of industries. And pursuing diversity for diversity’s sake may sacrifice opportunities to build on existing strengths.

A lack of nearby jobs doesn’t cause urban poverty

There’s scarcely any evidence that proximity to jobs matters for escaping poverty.

One of the most popular and persistent theories of urban poverty is that the poor are poor because they don’t live particularly close to jobs. John Kain popularized the “spatial mismatch” theory in the late 1960s, explaining increased and persistent urban unemployment as being a product of poor people of color increasingly concentrating in urban neighborhoods while jobs moved increasingly to distant suburbs. Notwithstanding discrimination or human capital limits, Kain reasoned that the simple distance from jobs was a key reason the poor stayed poor.

That same reasoning underlies much place-based urban development policy. Implicitly, the strategy behind opportunity zones is that they will create more investment and jobs close to where poor people live and, arguably, improve their economic opportunities accordingly.

But is distance from jobs a particularly important reason for unemployment? A new paper from UCLA’s Michael Lens and co-authors uses data on the employment prospects and earnings of housing choice voucher recipients to explore the connections between job access and economic success.

Housing choice vouchers are federal housing subsidies that are portable across locations, allowing recipients to rent private housing and use the voucher to pay all or a portion of the rent.  Lens, McClure and Mast use data on voucher locations to track the residence and movement of low income households.

Lens, McClure and Mast construct an index of job accessibility which measures how close each voucher recipient is to jobs in a region. Importantly, their accessibility measure controls for labor market competition: a place scores high in accessibility if there are a large number of jobs relative to the number of workers nearby.
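The paper’s exact formula isn’t reproduced in this post, but a competition-adjusted accessibility measure of this general kind can be sketched in a few lines. This is purely an illustrative construction–the function names and the exponential distance-decay form are our assumptions, not Lens, McClure and Mast’s actual specification:

```python
import math

def gravity_weight(distance_miles, decay=0.1):
    # Distance-decay weight: nearer jobs (or workers) count for more.
    return math.exp(-decay * distance_miles)

def accessibility(jobs, workers):
    """Competition-adjusted job accessibility for one residence.

    jobs, workers: lists of (distance_in_miles, count) pairs for
    surrounding neighborhoods.  The index is distance-weighted jobs
    divided by distance-weighted competing workers, so it is high only
    where jobs are plentiful relative to nearby labor supply.
    """
    weighted_jobs = sum(gravity_weight(d) * n for d, n in jobs)
    weighted_workers = sum(gravity_weight(d) * n for d, n in workers)
    return weighted_jobs / weighted_workers

# A neighborhood with 1,000 jobs and 500 competing workers a mile away
# scores 2.0: twice as many nearby jobs as nearby job-seekers.
print(accessibility([(1.0, 1000)], [(1.0, 500)]))
```

The key design point is the denominator: raw job counts alone would make every location near a dense downtown look equally accessible, while dividing by nearby labor supply captures the competition the authors adjust for.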

The authors also focus their attention on the lower skill/lower-wage end of the labor market, looking at the availability of lower wage employment (the kind of entry level and modest skilled jobs voucher holders are most likely to qualify for) and the competition they face from other workers with similar skill levels.

The study focuses particular attention on households that see an increase in their incomes, and who move from place to place within a metropolitan area, to better understand the relationship between job proximity and economic success. In general, they find essentially no correlation between proximity to jobs and increased earnings for housing voucher recipients.  If anything, the relationship is negative:  those with increased earnings tend to live further from jobs (or move away from jobs as their incomes rise).  The authors summarize their findings:

. . .work-able housing choice voucher households in the United States are not likely to be any closer to jobs than are not work-able HCV households, which suggests that being near job centers is not a high priority when HCV households in the workforce consider where to locate. Further, we do not find any evidence that an increase in earned income results when HCV households use their vouchers to locate closer to job centers.  . . . These results clearly indicate that earned incomes and job proximity are not strongly related for voucher households. The findings suggest that job proximity is perhaps an overrated concern in policy and research on neighborhood opportunity.

It seems likely, the authors speculate, that voucher holders who experience increased earnings move to neighborhoods with more amenities (safer streets, better schools), even though such neighborhoods are further from jobs.  This suggests that neighborhood quality, rather than job proximity, is more important to household well-being.

This study adds to a growing body of evidence that within urban areas, the number of nearby jobs is not a particularly important determinant of whether households live in poverty or not.

As we pointed out some time back, in absolute terms, city residents, the poor and black residents have higher levels of job access than suburban residents, the non-poor and white residents. According to data compiled by the Brookings Institution, despite job decentralization, and the fact that poorer neighborhoods often themselves support fewer local businesses and jobs, the poor residents of the typical large metropolitan area have about 20 percent more jobs within typical commuting distance than do their non-poor counterparts.  The black residents of large U.S. metropolitan areas have on average about 50 percent more jobs within typical commuting distance than their white counterparts in the same metropolitan area.

The same seems to hold for lifetime earnings of kids growing up in “job poor” and “job rich” neighborhoods.  Raj Chetty and his colleagues used a similar measure to examine the relationship between job proximity and job growth and intergenerational economic mobility. At both the neighborhood and the metropolitan level, they found almost no correlation between job proximity or job growth and the lifetime earnings of kids.

The Lens, McClure and Mast study casts further doubt on the idea of visualizing the spatial mismatch within metropolitan areas as a major contributor to persistent poverty. Just creating more jobs closer to low income neighborhoods is unlikely to reduce poverty.  Without the skills, support systems and personal networks needed to qualify for, find, and hold jobs, especially ones that lead to successively greater opportunity, propinquity is likely to be of limited value.

Michael Lens, Kirk McClure and Brent Mast, “Does Jobs Proximity Matter in the Housing Choice Voucher Program?”, Cityscape: A Journal of Policy Development and Research • Volume 21, Number 1 • 2019
https://www.huduser.gov/portal/periodicals/cityscpe/vol21num1/ch6.pdf


Portland’s Climate Fail: More Driving

Carbon emissions from transportation in Portland increased 6 percent last year

In the one area where city policy can make the most difference, greenhouse gas emissions are increasing

Portland has long prided itself on being one of the first cities in the US to adopt a legislated goal of reducing its greenhouse gas emissions. The city’s latest carbon emission inventory report shows that the city is failing to meet these goals. Not only is the city far from the glide slope needed to reach its target, carbon emissions are actually increasing–primarily due to more driving.  Transportation is now the largest source of Portland’s greenhouse gas emissions, and those emissions have risen more than 6 percent in the past year–mostly offsetting gains in other sectors.

The city’s report concedes that this is a problem:

Transportation sector emissions are increasing dramatically, currently 8 percent over 1990 levels, and 14 percent over their lowest  levels in 2012.  Portland has experienced year over year increases in transportation-related emissions for the past five years, with transportation emissions growing faster than population growth over the same period.

The failure to make progress in the transportation sector is especially apparent when we look at the change in carbon emissions by source.

Since 2000, more than 70 percent of the carbon emission reductions have come from savings in the commercial and industrial sectors. Over that same period, the city has made no net progress in reducing carbon emissions from transportation.  Residential emissions are down about 30 percent–thanks in part to cleaner energy sources for electricity generation, home heating and weatherization–but in the one area where local policy is likely to make the most difference, transportation energy consumption, the city is failing.

Falsely attributing transportation emissions to job growth instead of cheap gas

The city’s climate report seems to be in denial about the causes and the seriousness of the rise in transportation emissions.  It tries to attribute rising emissions to an improving economy, claiming (page 9):

. . .  transportation sector emissions have increased in recent years. The increase in transportation emissions has tracked closely with the recovery from the 2009 recession.

This claim is false.  Transportation-related emissions actually fell between 2010 and 2014, when Multnomah County added 44,000 jobs.  The report fails to mention that all of the increase in transportation emissions came after 2014–when gas prices fell by more than a third, prompting more driving. As long as gasoline prices were high, even with five years of economic growth, driving and transportation emissions continued to decline. In addition, the report emphasizes that the region has achieved per capita reductions in driving and gasoline consumption, and tries to blame increased transportation emissions on rising population; yet other sectors (residential, industrial and commercial) are managing to achieve absolute reductions in emissions even though we’re housing, employing and serving a considerably larger population.

Ignoring the critical role of prices, and shifting the focus to per capita changes in consumption (which are good) while ignoring aggregate emission increases undercuts a serious look at what is needed to achieve the region’s climate goals.  Transportation is now far and away the largest source of Portland’s greenhouse gas emissions, and they will only be reduced if we change the price of driving to reduce vehicle miles traveled.

Plateauing = Failing

Portland’s Bureau of Planning and Sustainability seems to be framing this lack of progress in recent years as a “plateauing,” when in fact it represents a significant failure. The Portland Tribune reported:

“We’ve had a lot of success in reducing emissions,” said Planning & Sustainability Director Andrea Durbin. “That’s big.”

All of that “success” stopped in 2012; total emissions are up since then.

Here’s the thing:  in the climate arena, plateauing is actually backsliding.  Unless the city reduces emissions each year, it falls further and further behind in meeting its goal.  In essence, the city has lost five years. Those five years will have to be made up, and the task ahead is even tougher: we’ve already done the cheapest, easiest things to reduce emissions (like replace residential oil heating with natural gas). It’s also worth noting that most of the emission reductions aren’t attributable to city policies– much of the success has been the result of state and federal policies–like increased renewable power and declines in coal fired electricity generation.
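The arithmetic behind “plateauing is backsliding” is simple compound-percentage math. As a hedged illustration (the 80-percent-below-current figure and the 30-year horizon here are stand-in numbers for the sake of the example, not the city’s official targets):

```python
def required_annual_cut(current, target, years):
    # Constant annual fractional cut needed to move from `current`
    # emissions to `target` emissions in `years` years.
    return 1 - (target / current) ** (1 / years)

current, target = 100.0, 20.0   # index: cut emissions 80% below today

on_schedule = required_annual_cut(current, target, 30)      # ~5.2% per year
five_lost_years = required_annual_cut(current, target, 25)  # ~6.2% per year
```

Every flat year raises the cut required in all the remaining years; lost ground compounds rather than simply deferring the task.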

The city staff recognize that more is needed, but exactly what isn’t clear.  According to Durbin, (again from the Portland Tribune):

Portland Mayor Ted Wheeler believes part of the solution is for the City Council to declare a climate emergency sometime next year. Staff at BPS have been directed to reach out to youth, communities of color and other groups this fall.

Durbin says the mayor also is considering a new ban on fossil fuel infrastructure . . .

If one is intent on banning  new fossil fuel infrastructure a good place to start would be by reversing course on the proposed half-billion dollar Rose Quarter freeway widening project, which would increase driving and add to carbon emissions.

Bottom line:  Despite making progress between 2000 and 2010, Portland is now falling further and further behind in its pledge to reduce greenhouse gas emissions, almost entirely because we’re driving more.  The failure to connect the dots–low prices for fuel stimulate more driving–means we’re likely to fall further behind in the years ahead, unless we do something dramatically different.

On Friday, along with other cities, young Portlanders will participate in a national climate strike–this report shows that their concerns are justified, and that the city needs to do much more if it is to achieve its lofty goals. We hope Portland’s leaders will be listening.

Bureau of Planning and Sustainability,
Multnomah County 2017  Carbon Emissions and Trends,
September 18, 2019
https://www.portlandoregon.gov/bps/article/742164

How Ecotopia is failing its biggest test

West Coast political leaders talk a good greenhouse gas game, but actions speak louder

Throughout Ecotopia, carbon emissions are rising due to more driving, yet the region’s leaders are throwing even more money at subsidizing car travel

This weekend, leaders of some of the world’s most environmentally progressive cities are meeting in Copenhagen for the C40 World Mayors Summit. They’re both calling attention to the gravity of the climate crisis and also celebrating the steps that cities have taken to reduce greenhouse gas emissions. One of the Summit’s key talking points is that 30 cities world wide have passed “peak carbon emissions”–an impressive sounding phrase that simply means that today they emit at least somewhat less carbon than they did five or more years ago.  Several West Coast Mayors are prominent in the event:  Seattle’s Jenny Durkan is speaking there, Portland’s Ted Wheeler is in attendance.  Los Angeles Mayor Eric Garcetti is the newly anointed head of the Summit organization, succeeding Paris Mayor Anne Hidalgo.

Since these mayors are presenting themselves, and their cities, as leaders, we thought it would be useful to take a look at how the region they represent is doing. So, you might ask:  How are things in Ecotopia?

In 1975, Ernest Callenbach created a fable about environmentalists on the West Coast breaking away from the rest of North America to found a separatist new-age state called Ecotopia.  Stretching roughly from San Francisco to British Columbia, this would be a utopian land governed by enlightened green leaders, powered by renewable energy.

Turns out, it really was a work of fiction.

By comparison to the rest of the continent, governments in California, Oregon, Washington and British Columbia have all been environmentally progressive.  The three states have adopted greenhouse gas reduction goals; BC has a carbon tax. The region has considerable investments in renewable energy like hydropower and wind. Their leaders all pay fealty to the importance of adhering to the Paris Climate Accords (even when repudiated by national governments).  But in each of these states, performance falls well short of rhetoric, especially when it comes to transportation-related carbon emissions, which are the biggest source of greenhouse gases in each of the three Pacific Coast states.

Washington:  More carbon emissions and a freeway-building spree

Most prominently, Washington Governor Jay Inslee built the platform for his now-folded Presidential campaign squarely on the issue of climate change. He prodded his opponents to discuss climate and arguably played a key role in pushing one town hall debate to focus almost solely on that subject. That was certainly long overdue.

But back at the ranch, Inslee’s environmental record falls well short of his Greta Thunberg rhetoric.  Washington is fully embarked on a decade-long multi-billion dollar freeway widening spree, financed by taxes enacted during Inslee’s tenure. The plan dedicates nearly 10 billion dollars to new or widened freeways, and less than a fifth that amount to road maintenance.

And this comes at a time when Washington, like other states, is recording an increase in greenhouse gas emissions, chiefly due to increased driving.  Greenhouse gas emissions in Washington are up 6 percent since 2013, and emissions from on-road vehicles have increased by more than 1 million tons during that time, according to the State’s latest greenhouse gas emissions inventory. (While the City of Seattle gets appropriate credit for allowing greater density, investing in transit expansion, and netting an increase in bus ridership, things outside the central city are much less green.)

Oregon: More Carbon emissions and a freeway-building spree

Like Washington, Oregon has a greenhouse gas emissions inventory, and it shows the same trend: statewide carbon emissions are up 6.6 percent since 2012, chiefly due to the increase in driving statewide. The City of Portland’s own greenhouse gas audit is just as bad:  increased driving has led transportation emissions to increase 14 percent since 2012.  The failure to make progress in greenhouse gas reduction has everything to do with more driving as gas prices have declined. Since 2012, Oregon’s statewide emissions from motor vehicle gasoline have increased an astonishing 25 percent according to figures from the Oregon Department of Environmental Quality–from 11.64 million tons in 2012 to 14.64 million tons in 2017.
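A quick arithmetic check of the DEQ figures cited above confirms the rounded percentage:

```python
# Oregon DEQ figures quoted in the text: emissions from motor vehicle
# gasoline, in millions of tons.
e_2012, e_2017 = 11.64, 14.64

pct_change = (e_2017 - e_2012) / e_2012 * 100
print(round(pct_change, 1))  # about 25.8, the "astonishing 25 percent"
```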

Despite the failure at the state and city level to meet greenhouse gas reduction goals, Oregon is plowing ahead with a multi-billion dollar program to widen Portland area freeways, which will undoubtedly lead to more driving and higher carbon emissions in the future.

Portland Mayor Ted Wheeler and Governor Kate Brown talk a good game on climate, but both have done nothing to challenge pollution-inducing freeway projects, notably the $500 million Rose Quarter freeway widening in Portland. The gap between rhetoric and reality on climate change in Oregon is growing ever wider.

British Columbia:  More carbon emissions and abolishing bridge tolls

British Columbia’s carbon emissions inventory shows the same patterns as its Southern neighbors. Greenhouse gas emissions from road transport have risen 14 percent in the past five years according to the provincial report.

Even though the Green Party is part of the Province’s ruling coalition, the provincial government abolished tolls on a pair of expensive Vancouver area bridges–effectively shifting the cost of car travel onto everyone who doesn’t use the bridges, including car-free households. It’s exactly the kind of strategy that’s likely to lead to increased vehicle travel and more sprawling development in the Lower Mainland suburbs around Vancouver.

California:  Cap and Trade, but 30 years behind schedule

California, to its credit, has adopted a cap-and-trade policy, making it a leader in carbon pricing. But it too is falling behind in its efforts to reduce carbon emissions.  The state’s latest greenhouse gas report says that carbon emissions are up five percent in the past five years.  Transportation is far and away the largest source of greenhouse gases in the Golden State.

Like its neighbors, California continues to plow more money into freeway expansions.  It famously spent more than a billion dollars to widen a stretch of the 405 freeway in Los Angeles, only to find that the project simply attracted more traffic and left the road just as congested as ever–and with higher levels of carbon emissions. Despite progress in some areas, the state is lagging well behind the pace of improvement needed to meet its stated greenhouse gas reduction goals; a new report estimates that it will hit its 2030 goal in 2060, meaning that the state is about three decades behind where it needs to be.

Pro-Tip:  Don’t blame the economic recovery; It’s all about cheap gas

One of the favorite excuses offered for this consistently bad performance in transportation greenhouse gas emissions is to attribute the growth in carbon emissions to a growing economy. But blaming the economy is simply wrong:  Notice that in almost every case the robust growth in transportation emissions comes only after 2014 (Washington’s data goes only through 2015).  By 2014, the economic recovery was already in its fourth year; what happened that year is that oil prices plunged about 40 percent globally, pushing down retail gas prices by a similar amount. Cheaper gas led to more driving (and higher sales of less fuel-efficient, higher polluting vehicles).

Whither Ecotopia?

If you were putting your faith in Ecotopia to lead the way, and especially if you’ve been beguiled by the tough-talking climate rhetoric of the region’s political leaders, what’s happening on the ground, and particularly in the form of investments in more subsidized car infrastructure, should give you pause. Despite the hype, this green region has a long way to go to be a real role model for saving the planet.

What’s needed, if we’re serious about reducing greenhouse gas emissions, are measures that reflect back to transport users the real environmental costs of carbon emissions.  By any measure, road use is heavily subsidized, and coupled with under-priced gasoline, this undercuts incentives to drive less and pollute less.  Real ecotopians would harness the power of the price mechanism, in the form of a carbon tax, value-pricing of roads, and an end to subsidized parking. We’ll be watching to see if they write a next chapter that is more convincing than the last one.

Inclusive urbanism comes to the presidential race

Beto O’Rourke brings a strong urbanist, inclusive message to the presidential campaign

The 2020 Democratic presidential race has been remarkable for addressing both climate change and housing policy issues that have long been ignored. For example, as the Brookings Institution’s Jenny Schuetz has chronicled, several of the candidates have stated policy positions on improving housing affordability, and an entire multi-hour town hall was devoted to the climate crisis.

While this is a positive development, some of the proposals, especially on climate change, have been a bit tone-deaf from an urbanist perspective:  it’s all alternative fuels and electric vehicles, rather than giving much thought as to how we build greener (and more just) communities, where maybe, just maybe, we’re not so reliant on private automobiles.

That’s why we were delighted to hear a recent comment from Beto O’Rourke, who offered what is perhaps the most comprehensive statement of inclusive urbanist principles, and who clearly links the ideas of building denser, more inclusive communities and fighting climate change.

O’Rourke speaking on urbanism and economic integration (Links to Video; via Twitter).

Beto’s also offered an unequivocal argument for greater economic integration. Here’s an excerpt:

Here’s a tough thing to talk about, though we must:  Rich people are going to have to allow or be forced to allow lower income people to live near them, which is what we have failed to do in this country right now.

We force lower income working Americans to drive one, two, three hours in either direction to get to their jobs, very often minimum wage jobs, and they’re working two or three of them right now.

What if, as we propose to do, we invested in housing that is closer to where we work, very often mixed income housing–where the very wealthiest are living next to those who are not the very wealthiest in this country–to make sure that they can both afford the same public schools?  That we really have that as a place where this divided country can come together without regard to your income or your race, or your ethnicity or any other difference that should not matter right now.

What if we invested–as we propose to do in high speed rail and in transit in all of our cities–to make sure that if you do not have a car or do not want to use a car, you will not need to have one, or you will not be penalized for not having one right now.

So, having cities that are smarter, that are denser, that have people living closer to where they work and where their families are, to reduce our impact on climate change and greenhouse gas emissions, but also to improve the quality of life in these built environments–that’s an extraordinary opportunity.

It’s an exciting and encouraging development to see this position advanced as part of the debate over who should be America’s president. For too long, these issues have languished out of the national agenda.  We hope other candidates will weigh in on these issues.

Hat tip to CBS News Reporter Tim Perry (@tperry518) for the link to this video.

Seeing red

We’re killing more people because more people are ignoring traffic signals

We’ve charted the ominous increase in road deaths in the past several years, and now there’s a new bit of evidence of just how bad the problem has become. In 2017, according to an American Automobile Association analysis of NHTSA data reported by the Los Angeles Times, we hit a new high for the number of people killed by cars running red lights.

In 2017, the latest figures available, 939 people were killed by vehicles blowing through red lights, according to a AAA study of government crash data. . . . AAA isn’t sure why the numbers are on the rise or why they have increased at a far higher rate than overall U.S. roadway deaths. Since 2012 the number of highway fatalities rose 10%, far short of the 28% increase in red-light running deaths.

There are likely many causes for the increase in fatalities.  Some of it surely has to do with the increase in driving, prompted by cheaper fuel.

Red-light running is also likely another indication of the growing problem of distracted driving. Drivers who are texting or distracted by in-cabin technology are more likely to miss a red light.

It also has to be mentioned that our efforts to use “smart” technology to improve compliance with traffic laws are woeful. Traffic engineers invest untold millions in efforts to automate traffic lights to provide motorists with a green wave, but spend little effort to promote greater compliance with red lights and speed limits.  Despite its official policy of trying to achieve Vision Zero, for example, the City of Portland has just eight fixed speed cameras. Several states, including Texas, have banned automated red-light cameras–a technology that’s been shown to reduce speeding and red-light running, and to save lives.

We’re all enamored of the prospects of technology to make life better, but in one of the few instances in which we have a proven technology that’s been shown to save lives, we’ve limited or actually prohibited its deployment, with predictable results, in the form of an increasing death toll.


The Week Observed, September 27, 2019

What City Observatory did this week

1. Why diversification is a simplistic, often flawed economic strategy. When it comes to personal investment everyone understands (or certainly should understand) the concept of portfolio diversification–by having a wide variety of different investments, one lowers the risk of loss. That same principle is widely applied in economic development, and cities are looking for ways to “diversify” their economies by developing or attracting new industries that aren’t related to current specializations. While intuitively appealing, the analogy between a community economic base and a stock portfolio is a badly flawed one. Cities can’t buy and sell different components of their economy as one would stocks or bonds, and critically, a local community’s location, talent base, infrastructure and related industries are all likely to influence whether an industry flourishes or fails locally. It turns out that seeking diversification for its own sake is a poor guide to local economic strategy.

2. Lack of proximity to jobs is not the cause of urban poverty. One of the most durable assumptions underpinning urban economic development efforts is the notion that the lack of nearby jobs is a major contributor to poverty. The “spatial mismatch” theory of poverty holds that the suburbanization of employment has, on average, moved jobs further away from poor neighborhoods. A new study looking at housing voucher recipients challenges the validity of this assumption. Using careful calculations of the accessibility of housing locations to jobs (and adjusting for the labor force competition) the study finds no relationship between access to jobs and earnings for voucher recipients. It also finds that workers who move and who record increases in income tend to have lower rates of job accessibility.

Must read

1. The compressed Donald Shoup. Retired UCLA economist Don Shoup is famous for producing a 900-plus page treatise on The High Cost of Free Parking. For those who haven’t yet waded through the entire tome, now there’s a more digestible version, prepared by Shoup himself. This short essay, “Why parking reform will save the city” published by CityLab is a conversational and largely non-technical summary that lays out all of the key arguments for pricing parking and ending minimum parking requirements.

Cities will look and work much better when prices—not planners and politicians—govern decisions about the number of parking spaces. Like the automobile itself, parking is a good servant but a bad master.


2. Roads, fiduciary responsibility and public finance. While we’re on the subject of free parking, another insightful perspective comes from Todd Litman. While we think of roads and curbside parking as free, they really aren’t, and a big share of the cost gets shouldered by state and local governments, who regularly invest in extremely expensive assets, and then do a terrible job of getting those who use them and benefit from them to pay either the costs of their construction or ongoing maintenance. In Litman’s view, the failure to properly manage the financial aspects of expensive investments like roadways is a dereliction of a governing body’s fundamental “fiduciary” responsibility to the public. This is particularly a problem for central cities–as Litman argues central cities build and operate roads that are disproportionately used by suburban residents, while city residents (who typically own fewer cars per capita) are far less likely to use suburban roads.

Cities significantly underprice their roads and parking facilities, forcing local taxpayers to subsidize out-of-town motorists. Municipal officials have an obligation to better manage these valuable public resources.


The Week Observed, September 20, 2019

What City Observatory did this week

1. What super-commuters really mean.  Media coverage of super-commuters–people who travel more than 90 minutes each way to and from work–is invariably sympathetic, treating these folks as hapless victims, and lamenting the congestion on the highway system. But despite all the attention they get, these long-distance commuters are remarkably rare: fewer than 3 percent of US commuters qualify, and while car-commuting is often the focus of media attention, transit riders are far more likely to have super-long commutes.

It’s also important to recognize that super-commuting isn’t so much about the inadequacy of the transportation system as it is an indication that we’ve simply failed to build enough housing, and in particular a range of affordable housing choices close to urban job centers. Finally, for some workers, a long commute represents “sweat equity” to enable them to afford more space by working several additional hours a week traveling to and from their jobs.

2.  Portland’s climate fail.  Portland has long prided itself on being one of the first cities in the US to adopt a legislated goal of reducing its greenhouse gas emissions. The city’s latest carbon emission inventory report shows that the city is failing to meet these goals. Not only is the city far from the glide slope needed to reach its target, carbon emissions are actually increasing–primarily due to more driving.  Transportation is now the largest source of Portland’s greenhouse gas emissions, and those emissions have risen more than 6 percent in the past year–mostly offsetting gains in other sectors.

The city’s climate report seems to be in denial about the causes and the seriousness of the rise in transportation emissions.  The report fails to mention that all of the increase in transportation emissions came after 2014–when gas prices fell by more than a third, prompting more driving.  On Friday, along with other cities, young Portlanders will participate in a national climate strike–this report shows that their concerns are justified, and that the city needs to do much more if it is to achieve its lofty goals.

Must read

Why electric cars are not enough. Smart Growth America hosted a Tweet Chat on Wednesday September 18 to discuss the limits of vehicle electrification as a solution to the growing problem of carbon emissions from transportation. SGA argues that while electric vehicles are a necessary step in the right direction, they’re hardly sufficient to blunt the continuing increase in transportation related greenhouse gas emissions. We need changes to state and federal policies that will lead to less driving if we’re going to reduce carbon pollution. The key message: responding to climate change is about making fundamental changes to land use to shorten and eliminate trips, and to make walking, cycling and transit viable alternatives for more of our daily travel.

Bike lanes are good for business.  Eric Jaffe thumbnails a new study looking at the results of swapping parking spaces for bike lanes in Toronto. It’s widely assumed by merchants that less parking will mean lower sales, but that’s not what this research shows. In 2016, the City of Toronto added a bike lane on a 1.5 mile stretch of Bloor Street, and in the process removed 136 parking spaces. Researchers surveyed shoppers and merchants on the street before and after installation of the bike lane to track changes, and to find out how often customers shopped, how much they spent and whether stores were opening or closing; researchers also surveyed a nearby street that didn’t get the bike lane treatment, as a control group. In general, the study showed positive results:  customers reported more frequent visits to Bloor Street shops and higher rates of spending, with the increase driven largely by bike-riding shoppers. While not definitive–and while there were some mixed results–the study suggests that converting on-street parking to bike lanes is a benefit to local business districts.

New Knowledge

Clustering of invention. Being around other smart people makes you smarter–that’s the genius of cities, as Ed Glaeser argues. University of California economist Enrico Moretti has statistical evidence for this effect in his latest paper looking at the productivity of inventors. Using data on patent activity, Moretti is able to measure the output of inventors based on their location, and their proximity to other inventors.

Using a number of different tests, Moretti finds that, controlling for other observable aspects of an inventor’s productivity (such as their industry), inventors are more prolific when they are near concentrations of other inventors in related fields. A key finding of this work is that the US is more innovative because it concentrates inventors in a few locations–if they were more widely spread out, they’d be less productive.  While a more even distribution of inventors would lead smaller places to be more productive, it would lead larger places to be less productive, and the magnitude of the declines in larger clusters would more than offset the gains in smaller ones.  As a result, Moretti concludes that de-concentrating talent away from clusters and spreading it more evenly geographically–as in the so-called “Rise of the Rest”–would likely decrease total US invention output:

The total number of patents created in the US in Computer Science would be 13.18% lower in 2007 if computer scientists were uniformly distributed across cities. The corresponding losses in biology and chemistry, semiconductors, other engineering, and other sciences would be -9.92%, -14.68%, -7.62%, and -9.64%, respectively. The change in the total number of patents in the US would be -11.07%. Thus while the spatial clustering of high-tech industries may exacerbate earning inequality across US communities, it is also important for overall production of innovation in the US.

Enrico Moretti, The effect of high-tech clusters on the productivity of top inventors.  NBER Working Paper No. 26270.  September 2019

In the news

Next City quotes Daniel Kay Hertz’s essay pointing out the inherent contradiction between promoting homeownership as a wealth-building strategy and our desire to improve housing affordability.

2019: The Year Observed

What City Observatory did in 2019

We spent a lot of time this year addressing Portland’s proposed half-billion dollar Rose Quarter freeway widening project. You may have thought Portland put its freeway fights behind it in the 1970s, when it killed the Mt. Hood freeway and used the money saved to start a light rail transit system. But the freeway-builders haven’t gone away; they’ve just gotten a good deal more glib–and in our view, less honest–about the implications and impacts of their projects. While this particular battle is specific to Portland, as Alissa Walker of Curbed pointed out in December, a distressingly large number of self-proclaimed climate Mayors are also supporting freeway projects around the country. If we’re serious about climate change, building wider freeways is taking us in the wrong direction.  We summarized our extensive analysis of the Rose Quarter freeway widening project in the post “25 reasons not to widen Portland Freeways.”

Our most read post of the year was: “Ten things more inequitable than road pricing.” Road pricing seems automatically to generate pushback that it’s somehow unfair to lower income people; what that argument misses is the larger context of a transportation system that is systematically inequitable. Those without cars are second-class citizens at best, and our current systems of transportation finance force people to pay for cars and car travel whether they own cars or not.  Moreover, these existing subsidies are overwhelmingly regressive.

Everyone has a pet explanation of gentrification, and we’ve compiled all the suggested suspects into an A to Z list.  Everything from Arts to Zoning is on the list, and while lots of things are blamed, it’s mostly guilt by association. The underlying problem, in our view, is the shortage of cities and our chronic failure to allow the construction of housing in the great urban neighborhoods we have. The growing demand for urban living has run headlong into a slowly growing stock of housing in dense, walkable places. Perversely, efforts to block new development in the name of preventing gentrification typically have the opposite effect, making housing even more expensive.

A solution for displacement? Frequently, the tactics proposed to deal with gentrification–by trying to block new development–backfire, by limiting the growth of the housing supply and pitting new residents against existing ones in bidding up the price of a fixed or slowly expanding housing stock. If we want to avoid or minimize displacement in urban areas, we need to build more housing, both market rate and subsidized. One tactic for capturing the value from property appreciation in gentrifying neighborhoods is to dedicate a portion of tax increment revenues to building affordable housing. Portland has done just that for more than a decade, and amassed nearly half a billion dollars to support affordable housing, and built thousands of affordable units in rapidly redeveloping neighborhoods.

New Knowledge

Two of the most important studies of the year come from a single author:  the Upjohn Institute’s Evan Mast.  If you care about housing, you should read both of these papers.

The first study looks at the process of filtering, the way in which the construction of new housing sets off a chain of household moves that makes more housing available for people of all incomes.  We summarized it in our commentary, Kevin Bacon and Musical Chairs teach us housing economics. It’s an article of faith among economists that more housing, even higher end housing, will help ease rising rents. But to lay-people, that seems counterintuitive. A new paper from the Upjohn Institute shows that the construction of new housing creates a kind of chain-reaction of moves by households that propagates to housing in low income neighborhoods. When a household moves to a newly built, market-rate unit, they move out of the home they previously occupied–and the household moving into that unit frees up another unit, and so on.  The Upjohn paper uses a detailed private database tracking changes of address to see exactly how moves into one unit create vacancies elsewhere. And like the famous game “Six Degrees of Kevin Bacon,” it turns out that just a few moves connect widely disparate neighborhoods. The paper estimates that building 100 units of new market rate housing generates 60 household moves into housing in low income neighborhoods.
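The musical-chairs logic can be illustrated with a toy Monte Carlo sketch. This is not Mast’s actual method (the paper follows observed address-change data); the tier count, step-down probability, and chain-absorption probability below are all hypothetical parameters, tuned so the toy model lands near the paper’s headline figure of roughly 60 low-income vacancies per 100 new units.

```python
import random

random.seed(42)

# Toy sketch of housing "musical chairs" (vacancy chains).
# All parameters are illustrative assumptions, not estimates from the paper:
TIERS = 5          # neighborhood income tiers; 0 = new market-rate, 4 = lowest-income
STEP_DOWN = 0.47   # assumed chance each link in the chain moves down one tier
CHAIN_END = 0.06   # assumed chance the chain is absorbed at each link
                   # (new household formation, in-migration, demolition, etc.)

def simulate_chain() -> bool:
    """Follow one vacancy chain; True if a vacancy opens in the lowest tier."""
    tier = 0
    while True:
        if random.random() < CHAIN_END:
            return False              # chain absorbed before reaching the bottom
        if random.random() < STEP_DOWN:
            tier += 1                 # vacated unit is one income tier down
        if tier >= TIERS - 1:
            return True               # vacancy opened in a low-income neighborhood

def share_reaching_low_income(new_units: int) -> float:
    """Fraction of chains started by new units that reach low-income housing."""
    hits = sum(simulate_chain() for _ in range(new_units))
    return hits / new_units

# With these assumed parameters, roughly 0.60 of chains reach the bottom tier,
# echoing the paper's ~60 moves per 100 new market-rate units.
print(share_reaching_low_income(100_000))
```

The point of the sketch is only that short chains of moves connect the top of the market to the bottom: even with a meaningful chance of a chain being absorbed at each link, most chains still percolate down.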

Mast’s second paper, with two co-authors looks at a closely related question, whether the construction of new high priced housing causes the prices of nearby homes to rise or fall.  We summarized this research in our essay: Myth-busting:  Building new market rate housing doesn’t drive up nearby rents. A favorite assertion of some housing supply-side skeptics is the theory that building new market rate housing in a neighborhood drives up rents in the immediate area. It’s a mistaken analogy to the idea of “induced demand” in transportation. The idea is that expensive new housing makes an area more desirable, and rents rise nearby. A new study uses fine-grained data on changes in rents around newly constructed market-rate apartment buildings in eleven strong-market cities around the country to test this theory.  It finds that new buildings tend to depress the level of rents and rent increases in their immediate vicinity.  The myth of “induced demand” for housing driving up rents is busted.

In the News

Some of our most prominent media mentions during 2019 included the following:

Joe Cortright’s Op-Ed, “Portland’s phony, failing climate policy,” was published in the December 14, 2019 Oregonian.

Willamette Week cited Joe Cortright’s work in fighting the Columbia River Crossing in its story on “The 20 moments that made Portland famous in the past 10 years.”

The Vancouver Columbian included in its Top Ten Stories for 2019 our debunking of state Department of Transportation claims that Oregon and Washington would have to repay the federal government $140 million if they didn’t build a $3 billion Columbia River Crossing.

Slate reported on our comparative analysis of pedestrian death rates in the US and Europe in their article “Don’t count on US regulators to make self-driving cars safe for pedestrians.”

In June, Governing Magazine covered our A to Z list of the supposed causes of gentrification in “The Fables of Gentrification.”

In November, The New York Times quoted City Observatory’s report Less in Common in their article “Are my neighbors spying on me?”

City Journal cited research from our City Observatory report Lost in Place in its article on “The Bifurcated City.”

Happy 2020 Everyone!

 

The Week Observed, January 10, 2020

What City Observatory this week

1. 2019:  The Year Observed.  We take a look back at 2019 and review some of the most important City Observatory commentaries, interesting stories and valued research.  Our most read post of 2019 was “Ten things more inequitable than road pricing.” Other highlights were our “A to Z” list of all the things people claim cause gentrification (a list that expands weekly, and now includes “little libraries”). We were also proud to offer up our own policy prescription for pre-empting displacement (dedicating a portion of TIF funds in gentrifying neighborhoods to affordable housing construction).

2. Why helping people move out of low income neighborhoods is a good idea. We now know that the neighborhood you grow up in has a huge impact on your future success, especially for kids from low income households. Building on this insight, Seattle has created a pilot program to provide vouchers, along with some more intensive home-finding services, to enable low income households to find and rent housing in “high opportunity” neighborhoods. The preliminary results of the experiment are promising: many households in the experimental group have chosen to move to these high opportunity neighborhoods–where their children are expected to have lifetime earnings about $200,000 higher than in low income neighborhoods; they also report significantly greater satisfaction with their neighborhoods. But a critique published by Next City claims that the experiment is bad for low income neighborhoods, bad for the households that move, and likely to cause white flight.  In our commentary, we debunk each of these claims.

3. Why TOPA isn’t necessarily the tops.  Tenant opportunity to purchase is one idea for promoting homeownership.  The idea is to give renters a right of first refusal if their landlord decides to sell their apartment. In theory, such a requirement gives tenants a chance to buy their homes if they come up for sale, and thereby insulate themselves from the possibility of future rent increases. In practice, especially in Washington, DC, the only large city with this requirement, the program has some serious challenges. Few renters have the down payment or credit to buy their homes, and putting together a co-op or non-profit to buy an entire apartment building is difficult. Moreover, it’s not clear that owning rather than renting is better financially for many households, especially those who may need or want to move before they could recoup the expense of purchasing. Also, as we’ve noted at City Observatory, there’s no guarantee that any particular home’s price will appreciate, and many buyers may find that they lose money on their purchase. Finally, in practice, the creation of a “right of first refusal” which can be sold to a third party has led to financial scams.

Must read

1. Habitrails for Teslas? Elon Musk is a master of media manipulation, regularly baiting “tech” reporters into providing free marketing exposure for Tesla and assorted other ventures. Curbed’s Alissa Walker does some real journalism taking a look at the steadily shrinking promises around Musk’s proposed Vegas people mover.  While it was billed as “mass transit,” what actually seems to be under construction is more like glorified test rides.  Top speeds, billed originally at xxx miles per hour, now seem likely to peak at 45; cool illustrations of 12 passenger vehicles have given way to off-the-shelf 5 passenger Tesla model 3s.  Most of the media types are hopeless suckers for whatever technology Musk is touting this week. While that might be regarded simply as the usual puffery, it comes at a cost to serious discussions of our urban transportation problems. As Alissa Walker puts it:

But the bigger problem is this: Each time a city (or a reporter) shows interest in Musk’s tunnel-boring scheme, it helps him sell more cars. And each time city leaders promote one of his fantastical ideas—tiny tunnels! autonomous vehicles! platooning!—it does serious damage to the real-life solutions being proposed by experts that will actually make life better for their residents.

The Habitrail: The latest technology for hamsters.

2. Road building is outstripping transit investment.  You frequently hear calls to have a “balanced” transportation system.  But the evidence shows that we’re still spending overwhelmingly more on car infrastructure than on transit. The indispensable Yonah Freemark of the Transport Politic has tabulated collective spending on car versus transit infrastructure over the past decade in the United States, and come up with the disturbing conclusion that we’ve invested vastly more in subsidizing road construction and car use than we have in transit.  We’ve built about 1,200 miles of additional transit service (counting rail and bus rapid transit) and during the same time, built almost 22,000 additional lane miles of freeways and other high capacity arterials.

3. Building places, not banning cars. The Brookings Institution’s Joe Kane has a commentary arguing that we need to spend much more time and attention thinking about how to build great urban places, rather than just demonizing cars. His essay, “Banning cars won’t solve America’s biggest transportation problem:  long trips,” draws a close connection between urban form and car dependence.  The way we’ve been building our cities and suburbs for the past half century (or more) has accentuated car dependence.  Owning a car is a necessity in much of this landscape.  Overall, we have more than 1.8 cars per household, although some denser, more compact metros manage with far fewer.  This map shows car ownership per household in the nation’s largest metro areas, with darker dots representing the most car intensive metros, and lighter ones representing the least.

The key to reducing car dependence is to build more great urban spaces, and to build more housing in the walkable places we already have.  A combination of strategies that promote greater density, and more mixing of different land uses, plus transit, cycling and walking will be needed if we’re going to reduce car dependence.

In the News

In its look back at the teens, “The Twenty Moments that made Portland Famous in the past ten years,” Willamette Week credited freeway opponents, including Joe Cortright with successfully challenging plans for a multi-billion dollar freeway between Portland and Vancouver Washington. They wrote:

On the Oregon side, a ragtag guerrilla squad of skeptics led by economist Joe Cortright poked holes in the rationale for the new bridge: The highway departments’ traffic and financial projections were wrong, they argued, and creating new vehicle capacity would not relieve congestion.

The Oregonian quotes City Observatory’s Joe Cortright in its story examining the Oregon Department of Transportation’s plans to form a new office to guide highway megaprojects:  It’s a hopeful sign that the state is charging the office with also developing congestion pricing, and a chance to bring ODOT into the 21st Century.

 

The Week Observed, January 24, 2020

What City Observatory did this week

Remembering Dr. King. We were reminded of Dr. Martin Luther King’s speech about the pronounced tendency in public policy to prescribe socialism for the rich and rugged free-market capitalism for the poor.  Much has changed in the half century since that remark, but sadly, it’s still the case that many federal policies subsidize the rich while providing little for the poor.  For example, the federal government provides nearly a quarter of a trillion dollars in subsidies for housing each year through the tax code, but the bulk of it goes to high income households. Similarly, despite the pretense that roads pay for themselves, the federal government has pumped nearly $140 billion out of the general fund to provide welfare for road users. The dream is still yet to be realized.

Must read

1. Brookings Institution’s Jenny Schuetz has a gentle corrective on who or what is to blame for housing affordability.  Too often, the debate about housing costs is a search for villains to blame. Senator Bernie Sanders recently singled out corrupt developers as the cause of gentrification. In a commentary, “Who’s to blame for high housing costs? It’s more complicated than you think,” Schuetz contrasts two different theories about housing affordability.  The first theory is the YIMBY analysis that constraints (like zoning limits and parking requirements) that make it hard to build more housing in high demand cities drive up housing costs and create displacement.  The opposing view is expressed in Sanders’ position pinning the blame on some combination of greedy landlords and developers.

To come up with her answer to this debate, Schuetz diagrams all the players in what she describes as the “housing development ecosystem” and focuses on the critical role that the regulatory process plays in influencing development decisions. Development is inherently uncertain and risky, and regulations that prolong the development process and inject uncertainty as to its outcome tend inherently to drive up costs and reduce investment.  Schuetz diplomatically concludes that “It’s complicated,” which is true, but also sounds a lot like the policy wonk’s version of the Southern expression, “Bless his heart.”

2. Bloomberg’s Noah Smith: “The Rent Crisis won’t go away without more housing.”  It’s fair to say that job growth, especially in city centers, is a key factor in housing demand.  Activists in San Francisco have made the connection between new office construction and housing demand, and have put forward a ballot measure that would restrict office construction in order to reduce demand for housing, and thereby decrease the pressure on rents.  Noah Smith points out that while that may work in a narrow sense, it undercuts the economic value that comes from concentrating activity in cities, and is a diversion from the more sensible approach, increasing housing supply.  He writes:

The quickest solution is widespread upzoning. Oregon and the city of Minneapolis have recently banned single-family zoning, allowing duplexes and other forms of moderately dense housing everywhere. . . . If measures like these become common nationwide, the country’s suffering renters will finally get a break and the economy will get a boost.

3. Greenhouse gas emissions: Coal is fading, which is good news, but progress elsewhere is insufficient.  It will be a while before the US government comes up with its official estimates of greenhouse gases for the year just ended, but the Rhodium Group has its preliminary estimates of the nation’s carbon footprint for 2019. The good news: overall carbon emissions dropped about 2 percent–chiefly due to a decline in coal burning to produce electricity. Outside of that bright spot, the news isn’t so good. Transportation is now the largest single source of greenhouse gases, and its emissions were basically flat over the past year.

 A much faster decline in greenhouse gas emissions is needed to meet globally agreed upon targets, as Rhodium concludes:

If our preliminary emissions estimates prove correct, hitting the Copenhagen Accord’s 17% target exactly will require a 5.3% reduction in net GHG emissions this year—a bigger annual drop than the US has experienced during the post-war period, with the exception of 2009 due to the Great Recession.
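The "5.3% in a single year" figure is straightforward arithmetic against the 2005 baseline. A quick back-of-the-envelope check, assuming (as an illustrative figure, roughly in line with Rhodium's preliminary estimate) that 2019 net emissions stood about 12.4 percent below 2005:

```python
# Back-of-the-envelope check of Rhodium's "5.3% in one year" figure.
# Assumption (illustrative, not Rhodium's exact number): 2019 net GHG
# emissions were about 12.4% below the 2005 baseline.
baseline_2005 = 1.0
emissions_2019 = baseline_2005 * (1 - 0.124)   # ~0.876 of the 2005 level
target_2020 = baseline_2005 * (1 - 0.17)       # Copenhagen: 17% below 2005

# Reduction needed between 2019 and 2020 to land exactly on the target:
required_drop = 1 - target_2020 / emissions_2019
print(f"Required one-year reduction: {required_drop:.1%}")  # → about 5.3%
```

Note that the required drop is measured against the 2019 level, not the 2005 baseline, which is why closing a remaining 4.6-point gap to the target demands a larger-than-4.6-percent one-year cut.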

 

In the News

Oregon Public Broadcasting quoted City Observatory director Joe Cortright on the problems facing the proposed $800 million I-5 Rose Quarter Freeway widening project. “Planned I-5 Freeway Widening Project in Portland Keeps Taking Hits.”


The Week Observed, January 12, 2024

What City Observatory did this week

The pernicious myth of “Naturally Occurring” Affordable Housing.  One of the most dangerous and misleading concepts in housing reared its ugly head in the form of a new publication from, of all places, the American Planning Association.  The publication “Zoning Practice:  Preserving Naturally Occurring Affordable Housing” purports to offer advice on how to maintain affordability by preventing new development in neighborhoods with smaller, older single family homes.  The publication observes that because of their size and age, such homes sell or rent for less than new housing, and mistakenly asserts that preventing them from being enlarged, improved, or replaced—especially with apartments—would somehow preserve affordability.  The reality is, of course, the opposite.

The myth at hand is the idea of “naturally occurring affordable housing.”  As we’ve long pointed out at City Observatory, that’s a wrong-headed metaphor.  Housing doesn’t “occur naturally”; it is the product of a complex and interactive legal and economic system.  If older housing becomes cheaper, it’s only because we allow more new housing to be built.  When zoning preserves smaller older homes, it restricts the supply of housing and drives up rents and home prices.  The reason 1,000 square foot, 1950s ranch houses in Silicon Valley sell for over $1 million, and never became “naturally occurring affordable housing” the way they did in other cities, has everything to do with the zoning and other restrictions that “preserved” these homes and prevented enough new housing from getting built.  The pursuit of “NOAH” is a leeches-and-bleeding prescription for the nation’s housing affordability problem.

Must Read

How to make room for a million more New Yorkers.  Architect and planner Vishaan Chakrabarti shows that New York could build another half million or more apartments–enough to accommodate a million more residents–in areas with great transit service in New York City.  A mix of high rise, mid-rise, and low rise apartment buildings, and office conversions would create additional housing opportunities in all five boroughs. Chakrabarti says:

“We found a way to add 520,245 homes, enough to house more than 1.3 million New Yorkers, near transit, and away from flood zones, all while maintaining the look and feel of the city.”

 

The increased housing supply would moderate rents, and a million more people would stimulate the economy.  Chakrabarti’s analysis is important because it provides a detailed and concrete description of what more housing would look like across the city.  It emphasizes that a “Yes in my backyard” solution to housing affordability can be accomplished with a city-wide strategy that adds more homes while retaining a city’s character.  This is the kind of planning exercise more cities need to undertake.

To fight climate change, reduce car travel.  A new report from the Rocky Mountain Institute makes the case that improving active transportation, public transit and building more walkable communities would not only reduce carbon emissions, but would save people’s lives and save households money.

Too often, fighting climate change gets treated as a purely technological challenge of converting vehicles to less polluting energy sources.  The RMI report points out that vehicle electrification isn’t enough by itself, and isn’t happening fast enough to reduce greenhouse gases from transportation, and that steps to reduce driving will also be needed.  But, as the report points out, less driving comes with significant additional benefits in the form of fewer crashes, injuries and deaths, less air pollution, improved health from greater walking and biking, and, on top of all this, financial savings for households from lower transportation expenses.  Our car dependent transportation system has imposed economic as well as ecological costs:

The high price of car ownership is personal to many Americans. With stagnant real incomes and no choice but to drive, gas money can come at the expense of rent, healthcare, nutrition, education, and recreation. The US transportation system wasn’t built with these tradeoffs in mind. From relentless road expansions to zoning and land use policies that encourage sprawling, car-dependent communities, policymakers have long viewed car ownership as a benefit in-and-of itself, rather than the burden many Americans know it to be.

The report’s title sounds a hopeful tune:  “Drive Less and Live More.”  It’s the kind of upbeat approach that is needed to tackle climate change.

In the News

StreetsblogUSA republished our commentary on USDOT’s publication of flawed traffic data purporting to show a big decline in trip-making in the US.

Clark County Today republished City Observatory’s analysis of the rising cost of the proposed Interstate Bridge Replacement project, which is now likely to cost as much as $9 billion.

City Observatory Director Joe Cortright is quoted in the Capital Chronicle’s story on ODOT’s chronic short-changing of road maintenance as it spends profligately on highway expansions.

The Week Observed, January 26, 2024

What City Observatory did this week

Robert Moses strikes again:  One of the most infamous decisions of “The Power Broker” was to build the overpasses on the Long Island Expressway too low to allow city buses to use the roadway, cementing auto-dependency and blocking easy and economical transit access to many suburbs.  And eight decades later, state highway departments are still doing essentially the same thing.  Part of the likely $9 billion Interstate Bridge Replacement Project connecting Portland and Vancouver, Washington, is a short light rail extension across the Columbia River.  (So far, so good).  But the highway engineers are planning to build the transitway portion of the project with rails elevated on concrete blocks, so that the transitway cannot be used by buses.  Using “direct fixation” rather than flush-mounted embedded rails is a bit cheaper, but forever makes the roadway impassable to buses (and impassable as an alternate route for emergency vehicles).

Instead, the highway engineers are planning to build an extra wide highway structure to theoretically allow for “bus on shoulder” service.  Building the transitway (which runs on the lower level of a proposed double-decker bridge) so that it can’t also be used by buses condemns all future bus service to mix with highway traffic (which requires climbing a steeper grade to the top level of the bridge, and merging with through car traffic).  It’s a calculated decision by highway agencies to block better transit service, and create an excuse for a wider bridge (that can then be converted to more car lanes later).  Somewhere, Robert Moses is smiling.

Must Read

Pushback on New York Times traffic safety story.  The good news over the last month is that The New York Times gave prominent coverage to the dramatic surge in US road fatalities. It’s long overdue, as road safety pundits regularly note, particularly given the contrast between widespread publicity and quick action on even modest aviation safety issues (where there’s been great progress) and the steadily increasing death toll on the nation’s roads.  The Times article clearly lays out the tragic statistics, but offers some questionable speculation about the underlying causes of the problem.  In particular, the Times notes the increase in the number of deaths at night, and also correlates the surge in deaths with the advent of smart phones.  As Stephen Coleman Kennedy, writing at Greater Greater Washington, argues, there are good reasons to question this speculation:

Focusing on whether someone is using their phone while walking, perhaps while wearing dark-colored outerwear at night, is an argument that rests on the concept that Americans are somehow both a) more technologically advanced than other countries and b) too stupid to use that technology safely. But looking at the data, peer countries with much lower pedestrian fatality rates use smartphones at similar rates.

He also points out that the cross-sectional evidence about crash deaths shows that some places (particularly sprawling, sunbelt metros) have much worse traffic safety records, which is a strong indicator that road design, rather than pedestrian behavior or smartphone distraction, is a key factor in our road safety problems.

Time for single-stair multi-family buildings.  While most of the conversation about promoting affordable housing tends to focus on zoning, there’s a strong argument to be made that we need to re-think some key aspects of our building codes.  One of the most widespread—and least examined—issues is the general requirement that most multi-story buildings provide two separate routes of egress.  This seemingly innocuous provision of the building code profoundly shapes what can be built, and requires nearly every new multi-story apartment building to have long central corridors connecting two separate stairwells, usually at opposite ends of the building.  This requirement effectively prohibits “dual aspect” apartments (where a living unit has rooms fronting on two or more sides of a building).  If we allowed single stair buildings (which are common in most of the rest of the world), architects would have much more flexibility to design structures with a wide range of unit sizes, more creative layouts, and with less space dedicated to blank hallways.   Stephen Smith makes a strong case for allowing single stair buildings for many 3- to 5-story apartment buildings.

Without the requirement for a second stair, buildings can be laid out in a fundamentally more efficient way. With less vertical circulation, the circulation core can simply be repeated a few different times, with apartments of different sizes arrayed off of each core, potentially stretching from the front of the building to the back. If planners redraw zoning envelopes to accommodate thinner buildings, more bedrooms can be packed in less square footage, offering more affordable and competitive family-sized designs.

Building codes are squarely in the control of state and local governments, and Smith has founded the Center for Building in North America to help do the thorough research that’s needed to show how with today’s technology, single-stair buildings can promote fire safety, and much greater housing affordability.

New Knowledge

The lethality of taller vehicles.  For some time, it’s been obvious that the growing size of trucks and sport utility vehicles is strongly correlated with increased road death rates, especially for people on foot.  A new study uses some very detailed crash data to validate the statistical connection between vehicle hood height and traffic fatalities.  It concludes that increasing the height of the front of a car or truck by 10 centimeters—about four inches—increases the probability that someone will be killed in a crash by more than 20 percent.

Justin Tyndall of the University of Hawaii assembled a unique database of vehicle-pedestrian crashes that looks at the height and weight of vehicles and other aspects of crashes, and shows that the larger the vehicle, the greater probability that a pedestrian will die.

Tyndall concludes that front-end height is the most significant variable affecting death rates.  He finds that height is more important than vehicle weight, and also that the changing composition of the vehicle fleet (more tall, heavy vehicles) likely means that pedestrian deaths may increase further in the future:

. . . high-front-end vehicle designs are particularly culpable for the higher pedestrian death rate attributable to large vehicles. A 10 cm increase in the front-end height of a vehicle increases the risk of pedestrian death by 22%. Conditional on multiple measures of vehicle size, front-end height displays the most significant effect. The shift towards electric vehicles is projected to make vehicles heavier still, as the batteries needed to power the vehicles add significant weight (Shaffer et al., 2021). If a strong relationship between pedestrian fatalities and vehicle weight exists, the number of fatalities attributable to vehicle size will likely continue to rise in the coming years. However, I find that once front-end height is controlled for, the impact of vehicle weight is small, suggesting the regulation of body design may be more important for pedestrian safety than the regulation of vehicle weight per se
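Tyndall’s headline estimate—a 22 percent increase in pedestrian death risk per 10 centimeters of added front-end height—can be turned into a back-of-the-envelope risk multiplier.  The sketch below is our own illustration, and it assumes the effect compounds across successive 10 cm increments, which is an extrapolation beyond what the paper itself claims:

```python
# Illustrative only: treat Tyndall's estimate (+22% pedestrian death
# risk per 10 cm of front-end height) as compounding per increment.
# The compounding assumption is ours, not the paper's.

def relative_risk(extra_height_cm, risk_per_10cm=0.22):
    """Relative risk of pedestrian death for a vehicle whose front end
    is `extra_height_cm` taller than a baseline vehicle."""
    return (1 + risk_per_10cm) ** (extra_height_cm / 10)

# A pickup with a hood 20 cm (about 8 inches) taller than a sedan:
print(round(relative_risk(20), 2))  # ~1.49, i.e. roughly 49% higher risk
```

On this reading, the jump from sedans to today’s tall pickups and SUVs implies not a marginal but a large increase in the lethality of any given pedestrian crash.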

Tyndall’s study should be read in combination with calculations from the Rocky Mountain Institute that show that fuel consumption and greenhouse gas emissions would be 30 percent lower today if personal vehicles were the same weight that they were a dozen years ago.  The takeaway for policy is that the growing height and weight of cars, SUVs and light trucks is making the roads more dangerous, burning more fuel, and accelerating climate change.

Justin Tyndall, “The effect of front-end vehicle height on pedestrian death risk,” Economics of Transportation, Volume 37, 2024, 100342, ISSN 2212-0122,
https://doi.org/10.1016/j.ecotra.2024.100342.

The Week Observed, January 19, 2024

What City Observatory this week

Why does it take four years and $200 million for consultants to serve up a warmed-over version of the Columbia River Crossing?  The Interstate Bridge Replacement Project’s director admitted that he’s just pushing “basically the same” project that failed a decade ago, but in the process, he’s spent $192 million on consultants, with the largest single chunk of that money ($75 million) going to his former employer, WSP.

And WSP has returned the favor, providing Johnson with questionable reports that purport to rule out examining an immersed tunnel option and which exaggerate project benefits to qualify for federal funding. Between the failed CRC (which cost nearly $200 million a decade ago) and its rebranded clone, the IBR, the two states will have spent nearly $400 million, mostly on consultants, and without turning a shovel. More than 10 percent of IBR’s recent consulting largesse—more than $20 million—has gone to public relations firms to help it sell the project.

Must read

US DOT puts Fresno highway expansion on hold over environmental concerns.  California had been planning to add new capacity and build new interchanges on Highway 99 in Fresno.  Fresnoland reports that, under pressure from environmental groups, the Federal Highway Administration is reviewing this decision. CalTrans claimed that the project would have no significant environmental impact and would not increase air pollution, but local advocates—and now possibly, FHWA—disagree.

The project would serve a massive proposed freight distribution facility, and would increase truck trips and air pollution in an area that is not in compliance with state and federal air pollution laws.  Left to their own devices, state highway departments have often played fast and loose with such environmental reviews; maybe FHWA will hold them accountable for obeying the law.

Seattle area puts road expansion ahead of climate goals.  Writing at the Urbanist, Ryan Packer reports that when it comes to putting their money where their climate pledge is, Puget Sound leaders will just keep on widening highways, thank you.  Two local government officials proposed prioritizing federal funds for safety, transit, and active transportation projects, and ruling out using these flexible funds for highway expansion.  As Packer relates:

. . . members of the Puget Sound Regional Council’s Transportation Policy Board, a group of local elected officials from Pierce, Kitsap, King, and Snohomish Counties tasked with regional planning issues, voted against a proposal to bar transportation projects that would add general purpose vehicle capacity to the region’s limited access highways from competing against other public transit and traffic safety projects for a specific pool of federal dollars, . . .

The unwillingness of regional leaders to take this relatively small step illustrates the degree to which a region that prides itself on being climate-forward is not ready to take the hard steps to actually make its emissions reduction goals a reality.

Sadly, it wasn’t even close:  the proposal lost with 15 votes against and just 3 in favor.

California forever:  Urban fantasy or exurban nightmare? There’s a brash, bold proposal to build what amounts to a new city for 400,000 people in Northern California.  Proponents are pursuing a ballot measure in Solano County to authorize the proposal, and they’re pushing plans for a dense, walkable gridded community that ticks many urbanist boxes.  While it seems like an appealing way to address the Golden State’s chronic housing shortage, Benjamin Schneider raises some important qualms at Substack.

While at the local level the plan exhibits an admirable urban form, the new city’s location—and most critically, its connections to the rest of the region—are sketchy and car-dependent.  The city would be located between the San Francisco Bay Area and Sacramento.  The city’s new residents would have plenty of non-car alternatives for local travel, but would need cars to travel pretty much anywhere else.  The project anticipates widening local freeways, and in the process adding more traffic to the already congested Interstate 80.  And, as Schneider reports, proponents have given little thought to connecting to regional rail, despite the fact that an Amtrak corridor runs nearby and a BART terminus is a bit further away.  It’s tempting to think that a new city in a greenfield can avoid the gnarly political problems that block redevelopment, but adding more population on the urban fringe exacerbates car dependence and increases pollution.

In the News

The Portland Mercury reported on our analysis of the rising cost of the Interstate Bridge Replacement boondoggle.  The price tag rose from $4.8 billion to as much as $7.5 billion just a year ago and project staff are now letting on that the costs will go even higher.

Salem Breakfast on Bikes also pointed its readers to our analysis that the new price tag for the IBR is likely to reach $9 billion.

The Week Observed, January 15, 2024

What City Observatory this week

1. The Urban Institute gets inclusion backwards. The Urban Institute has released an updated set of estimates that purport to measure which US cities are the most inclusive.  The report is conceptually flawed, and actually gets its conclusions backwards, classifying some of the nation’s most exclusive places as “inclusive.” Highly equal cities are almost always either exclusive suburban enclaves (that achieve homogeneity by rigid zoning limits that exclude the poor) or impoverished cities that have been abandoned by upper and middle income households, leaving them homogeneous but poor. For example, many of the Urban Institute’s “most inclusive cities” are suburbs like Bellevue, Naperville and Santa Clara, among the wealthiest 20 cities in the US; Detroit and Cleveland are also highly ranked for inclusiveness. At small geographies, neighborhoods and cities that have high levels of measured income inequality (90/10 ratio, Gini index) are generally much more inclusive than comparable geographies with lower levels of measured inequality.  Rich and poor living close together produces more measured inequality, but also means greater inclusion.
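The point about measured inequality can be made concrete with a toy calculation.  Using hypothetical incomes (the numbers below are invented purely for illustration), a mixed-income neighborhood scores a much higher Gini coefficient than a uniformly wealthy enclave, even though the enclave is by far the more exclusive place:

```python
def gini(incomes):
    """Gini coefficient: mean absolute difference across all pairs of
    incomes, divided by twice the mean income."""
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * n * mean)

# Hypothetical household incomes (thousands of dollars):
mixed_neighborhood = [20, 30, 150, 200]    # rich and poor together
wealthy_enclave    = [180, 190, 200, 210]  # homogeneous and exclusive

print(round(gini(mixed_neighborhood), 2))  # ~0.41: "unequal," but inclusive
print(round(gini(wealthy_enclave), 2))     # ~0.03: "equal," but exclusive
```

An index that rewards low within-city inequality will rank the enclave as “inclusive,” which is exactly the inversion described above.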

2. Parking should pay its way:  Hartford’s land value tax lite.  One of the perverse effects of the US system of property taxation is that improving urban property, by say building apartments, offices or stores, triggers an increase in tax liability.  That tends to discourage development and reward low value uses of valuable urban land. Effectively the biggest subsidies from this arrangement go to surface parking lots, which usually pay taxes only on the value of land. While in theory a land value tax would fix this incentive problem, it’s politically hard to change the whole tax system.  Hartford, Connecticut has a proposal to raise fees on parking lots by a modest amount (about 50 cents per space per day) to fund transportation.  The fee would work like a kind of land value tax “lite.”  This is definitely a step toward making the tax system fairer, as parking lots don’t pay anything close to their full freight in local finance. Take for example stormwater:  Hartford is building a multi-billion dollar subway for sewage, largely to deal with the runoff from roads and parking lots.  City water users pay the full cost of this project; cars and parking lots pay nothing for a problem they largely create.  More cities should look into asking parking to pay its way.

Must read

1.  The geography of sedition:  Geolocation of Parler Videos, 6 January 2021.

2.  A roundup of what’s been written about Covid-19 and cities.  At City Observatory, we’ve spent a good deal of time tracking data on the Coronavirus pandemic as it spread, paying particular attention to a couple of themes, one linking density to the virus, something which appeared as a popular theory early on, but which must now deal with higher rates of death in rural areas.  A second theme looks at the future of urban areas in an era of work-at-home for many professionals.  James Brasuell at Planetizen has compiled a comprehensive list of news articles and blog posts on these themes; it’s a useful resource for following the debate.

3. Todd Litman:  Housing first, cars last.  Leave it to one-man transportation think tank Todd Litman to succinctly lay out the definitive strategy for advancing equitable urbanism.  In an essay at Planetizen, Litman writes:

Our current laws do not mandate housing for people, but virtually all jurisdictions do mandate an abundant and costly supply of housing for motor vehicles. Our zoning codes require that most buildings include numerous parking spaces that are generally unpriced, which is a huge and unfair subsidy for automobile use. This increases housing costs, encourages driving, and forces car-free households to pay for expensive parking facilities they don’t need.

There’s a shortage of affordable housing nearly everywhere, particularly in some of our most livable cities. But at the same time, we’re awash in “free” (meaning un-priced) parking, with three or four or more parking spaces for every car in most metro areas.  Indeed, parking requirements essentially tax housing to subsidize parking.  Reversing this fundamentally wrong-headed prioritization is the key to building better and more equitable cities.

4.  How Covid-19 debunked one old gentrification myth.  Writing in The Atlantic, Jake Ambinder takes on one of the most resilient folk narratives of gentrification: the idea that by building more housing, greedy developers drive up rents and cause displacement.  That theory never made much sense, and it has been disproven by repeated studies; moreover, the deflation of demand in some previously super-heated urban markets shows that when demand and supply are in closer balance, rents actually come down.  The real cause of the affordability crisis in thriving metro areas isn’t the developers, but the constraints on development fomented and defended by NIMBY homeowners.  The prevalence of single-family zoning and widespread apartment bans wrought by nominally pro-environment homeowner groups constrict housing supply, drive up rents, and trigger displacement.  As Ambinder concludes:

The homeowner-friendly slow-growth activism that marked American cities in the late twentieth century is thus best understood not as the predecessor of today’s anti-gentrification politics but as the progenitor of the gentrification crisis itself. In wealthy coastal cities today, one need not develop skyscrapers or shopping malls to be a speculator in urban property. With widespread housing scarcity, simply owning a modest home in Berkeley or Brooklyn will suffice.

New Knowledge

Tragically slow progress in improving vehicle fuel efficiency.  For decades, the big technical fix to our energy problems was the idea that imposing progressively tougher fuel economy standards on the sale of new vehicles would lower energy consumption.  That same idea has been bolted on to many climate action plans (don’t worry, new cars will consume less fuel, and therefore generate fewer greenhouse gases).  A new tabulation of real world fuel consumption and driving data by Michael Sivak shows that what progress we made in this regard largely petered out more than a decade ago, and alarmingly, in the latest year for which data are available, fuel economy actually declined.

Early on, there was a lot of low-hanging fruit, and fuel economy improved dramatically between the late 1970s and 1990.  But from 1990 through 2004, there was basically no change in average fuel economy.  Between 2004 and 2008, fuel economy improved appreciably.  Overall, there’s been a startling slowdown in the improvement in fuel economy.  Sivak calculates that from 1973 to 1991, fuel efficiency improved by 2.3 percent per year; since 2008, fuel efficiency has improved by 0.2 percent per year.  In other words, it now takes us more than ten times as long to get a given improvement in fuel efficiency as it did 40 years ago.

As economists, we’d be remiss if we didn’t note that a lot of these changes had much to do with fuel prices.  Fuel prices were flat (declining in real terms) during the 1990s, and there was a big jump in oil prices in the early 2000s.  Car buyers responded both times, buying less fuel efficient vehicles when gas was cheap in the 1990s, and more efficient vehicles as fuel prices rose after 2004.

Since 2010, progress has been very slow.  More and more Americans are buying sport utility vehicles, and the decline in gas prices after 2014 has prompted purchases of more large vehicles, with the predictable result that fuel economy fleetwide actually declined in the most recent year.  As Sivak notes in his analysis, vehicle purchase decisions cast a long shadow in terms of fuel consumption and pollution.  Only about 6 percent of the vehicle fleet turns over in any given year, and the average vehicle lasts for almost 12 years.  This means that much of the fleet that will be on the road in 2030 consists of the large, inefficient vehicles people are buying today.

Finally, it’s worth noting that if anything, Sivak’s numbers actually understate the failure of efficiency standards, because they only consider the per-vehicle and per-mile energy consumption.  The combination of somewhat more efficient vehicles and low energy prices has prompted people (at least prior to Covid-19) to drive more miles, producing even more emissions in the aggregate.  Simply making individual cars more efficient is easily overwhelmed by more driving, with all its external costs.

There’s a naive, carrotist belief that we can painlessly achieve our energy efficiency and climate goals simply by mandating progressively more efficient cars.  The data shows this strategy isn’t working.
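The arithmetic behind Sivak’s slowdown is easy to check.  The sketch below is our own calculation, using the 2.3 percent and 0.2 percent annual improvement rates and the roughly 6 percent annual fleet turnover cited above; the 25 percent improvement target is an arbitrary benchmark for comparison:

```python
import math

def years_to_improve(annual_rate, target=0.25):
    """Years of compounding at `annual_rate` needed to improve
    fleet fuel economy by `target` (e.g. 0.25 = a 25% gain)."""
    return math.log(1 + target) / math.log(1 + annual_rate)

# A 25% fleet-wide fuel economy gain:
print(round(years_to_improve(0.023)))  # ~10 years at the 1973-1991 pace
print(round(years_to_improve(0.002)))  # ~112 years at the post-2008 pace

# Fleet turnover: with ~6% of vehicles replaced each year, the share of
# today's fleet still on the road nine years from now:
print(round(0.94 ** 9, 2))  # ~0.57
```

A gain that once took a decade now takes more than a century at recent rates, and because well over half of today’s fleet will still be on the road in 2030, even a sudden return to rapid improvement in new vehicles would move the fleet average only slowly.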
Michael Sivak, “Actual fuel economy of cars and light trucks: 1966-2019,” Green Car Congress, January 7, 2021.

The Week Observed, January 22, 2021

What City Observatory this week

Institutionalized housing discrimination. A recent study of housing discrimination in Detroit came to a seemingly surprising conclusion:  Fair housing complaints were less likely to be filed in higher income, higher priced predominantly white neighborhoods than in lower income neighborhoods that were predominantly Black.  The study’s authors were puzzled by the finding, but we think there’s a pretty clear explanation. In expensive mostly white suburbs, there’s no need to practice the overt discrimination of turning down applicants of color, or telling them the house is rented (or under offer):  The high price of housing, dictated by restrictive local zoning codes, automatically performs that chore.  Large lot single family zoning and apartment bans reshape the housing market to make it simply unaffordable for low income people to move into some communities, and given the correlation between poverty and race, this has much the same effect as in-your-face discrimination.

In 2021, the most widespread and pernicious form of housing discrimination is the subtle, institutionalized racism embedded in systems like zoning.

Must read

1. Four reasons why public housing isn’t going to solve our affordability problems. Some housing advocates are hoping that the new Biden Administration will embrace the construction of more public housing as one cornerstone of its urban policy.  The Brookings Institution’s Jenny Schuetz makes the case that public housing is unlikely to be an effective solution.  A central problem is that the same land use restrictions that block market rate housing (apartment bans, parking requirements, arbitrary review processes) apply with equal or greater force to public housing. Moreover, the institutional capacity of public housing authorities to actually plan and build new housing has atrophied dramatically; and most authorities are overwhelmed with managing the maintenance burden of the legacy public housing stock.  Finally, for any given amount of resources, other types of subsidies—like expanding housing vouchers, or acquiring and redeveloping existing housing—may provide more benefits more quickly to more people.

2. Why the linear city is a straight line leading nowhere.  With great fanfare, and fancy animation, developers have announced plans to build a futuristic 100-mile long linear city in Saudi Arabia.

While it’s billed as a startling new idea, it isn’t.  The linear city was proposed, tried, and failed in the 19th Century, as Vice’s Aaron Gordon points out.

This new iteration is wrapped in all the buzzwords of contemporary urbanism:  there won’t be any cars; all transport will be underground, and people will walk everywhere.

If techno-futurism has perfected anything, it is the art of unwittingly re-inventing old ideas, inflating them to a scale so epic that it accentuates all of the idea’s flaws, and presenting it in a slick hype video as the Only Way Forward.

Grandiose plans advanced by high priced consultants are common in this part of the world.  Plans for Masdar, a smart green new city in Abu Dhabi, are largely stillborn after a decade.  While only a few very wealthy places can really indulge these futurist follies, it’s all too common for urban leaders to hope for a simple technical fix, rather than tackling the more fundamental problems close at hand.  As Gordon explains:

. . . one of the many problems city governments have is they all too often dig deep into the well of techno-futurist ideas like “smart cities” and “artificial intelligence” when much more realistic solutions like zoning reform and elimination of parking minimums and making certain streets car-free are right there for the taking. They may be hard to do and will piss some people off in the process, but at least they will ultimately solve problems and make people’s lives better.

3. Why are US transit construction costs so damn high?  A recent report from the Eno Foundation argues that it’s a misperception that transit construction costs are higher in the US than in the rest of the world.  But Alon Levy, writing at Pedestrian Observations, begs to differ.  Levy, who maintains an extensive global database on construction costs, argues that the Eno study is biased for several reasons:  it looks mostly at light rail, not full rapid transit; it leaves out cost estimates for high income Asian countries; and it also leaves out lesser known (and lower cost) light rail and tram systems in smaller European countries and cities.  The point-counterpoint between Levy and Eno provides a useful framework for answering this complex question.

New Knowledge

Can policy reduce inequality?  For a long time, we’ve known that housing segregation and school segregation have been instrumental in creating and perpetuating racial disparities in opportunity and economic prosperity.  Due to our heavily localized systems of school finance and administration, the quality of a child’s education depends in large part on how nice a neighborhood his or her parents can afford. Since at least Brown v. Board of Education more than 65 years ago, public policy has aimed at trying to reduce disparities by decreasing segregation in schools, and equalizing resources.

In a thoughtful essay that draws on some of the best of recent research, Harvard economist Richard Murnane makes a strong case that we have to do more than desegregate schools and raise test scores if we are to redress persistent racial disparities in economic outcomes. While schools are central to life chances, it’s not just the school that influences outcomes. Schools are embedded in the communities in which they operate, and it’s a combination of the richer resources, fewer obstacles, stronger peers and more abundant social capital in mixed income neighborhoods that gives rise to big improvements in economic opportunity.

Murnane highlights the findings from the Moving to Opportunity studies, summarized by Lawrence Katz and others, Eric Chyn’s analysis of the movement of families out of Chicago public housing, and data on the effects of Charlotte’s school lottery program.  In every case, these studies showed that while moving kids from low income families to desegregated schools in middle and upper income neighborhoods had modest effects on their test scores, they were associated with big improvements in life outcomes, measured by higher earnings and lower probability of incarceration. The Chicago housing study found that former public housing tenants who were relocated to middle income neighborhoods had a higher probability of employment and higher earnings than otherwise similar students who stayed in the neighborhood. The Charlotte study found that low income students who won the lottery to attend high-demand schools had lower rates of incarceration for boys and higher rates of college attendance for girls.

Murnane argues that the combination of better soft skills, different experiences and expectations, and different peer groups plays a key role in enabling kids from low income households to succeed in these places.

 . . . it is not choice per se that matters. It is growing up in neighborhoods with lower crime rates and more economically advantaged neighbors and attending better schools with children from more economically advantaged families.

Our focus on schools and education needs to move beyond raising test scores to a more comprehensive understanding of the role of social skills (like reliability; persistence in the face of challenges; listening, negotiating, and communicating effectively; and the ability to work productively in groups with people of different backgrounds) as well as the role of context. Segregated schools, particularly in high poverty neighborhoods, present students with more challenges, fewer social resources, and thinner networks, all of which make it harder to succeed in life.  As Murnane concludes:

Given the history of housing segregation in America and the funding structure of American schooling, a great many Black children grow up in high-poverty, unsafe neighborhoods and attend underfunded schools that do not prepare them for success in post-secondary education and the labor market. But the studies described above make a strong case that reducing the social isolation in which a great many low-income Black families live and their children attend school are powerful strategies for reducing race-based intergenerational inequalities.

Richard Murnane, “Can Policy Interventions Reduce Inequality? Looking Beyond Test Scores for Evidence,” The Digest (William T. Grant Foundation), No. 6, January 2021.

In the News

StreetsblogUSA republished our analysis of Hartford’s proposed parking fees under the title, “How to stop giving parking developers a free ride.”

The Hartford Courant also wrote about our analysis of the proposed increase in parking fees.

 

The Week Observed, January 29, 2021

What City Observatory this week

1. Why Portland’s Rose Quarter Freeway widening will increase greenhouse gas emissions.  The Oregon Department of Transportation has falsely claimed its $800 million freeway widening project has no impact on greenhouse gas emissions.  We examine traffic data produced by ODOT which shows that the widening will increase average daily traffic above today’s levels and produce tens of thousands of tons of additional greenhouse gases. The ODOT numbers are certainly an underestimate because they completely discount the certainty of induced demand:  the scientifically demonstrated increase in vehicle travel that results from expanding freeway capacity in dense urban areas.

2. More performative pedestrian infrastructure.  News reports hailed a new “protected” intersection in Houston, which prompted us to take a close look at the project.  To be sure, as depicted in the artist’s conception, there are wider sidewalks, better lighting, nicer plantings, and even bollards (which we take to be the “protected” part).  But if you step back, the project is just another failed attempt to remediate a fundamentally pedestrian hostile environment. The project actually widens and lengthens two “slip” lanes, which accelerates turning traffic and increases danger for crossing pedestrians.

Pedestrian “infrastructure”- but no pedestrians, because it’s not a walkable place.

Pedestrians must still cross two busy, multi-lane arterial streets that carry about 60,000 cars per day.  And the local neighborhood has almost no walkable destinations nearby.  As we’ve said, much of what gets labeled pedestrian infrastructure really serves the convenience of car travelers, and does nothing to fix the fundamental problems of pedestrian-hostile development patterns.

Must read

1. Will economics support post-Covid cities?  Economist and Bloomberg columnist Noah Smith muses on whether economic forces will enable smart, flexible workers to “zoom it in” from remote locations instead of living and working in superstar cities.  He discounts some of the key agglomeration economies that have driven city growth in the past several decades, including the productivity of workers in dense urban offices, and the knowledge spillovers that come from proximity and competition.  The most enduring economic factor attracting talented workers to cities, Smith thinks, is urban amenities:

For young people, this means bars, music venues, fun social events, lots of potential friends in their age group, and — probably the most important piece — opportunities to meet romantic partners. For older people this means a large variety of good restaurants and cultural events like musicals.  For this reason, distributed workforces might just mean that knowledge workers live in different superstar cities.

Thus even if some workers are no longer bound to one large city, their most favored alternative may still be yet another superstar city.  Smith relates that his migration has taken him from New York to San Francisco, and his next move, if there is one, is likely to be to Tokyo.  When it comes to urban amenities—and the diverse crowds of smart, discerning and innovative customers that support them—there’s still no substitute for a great city.

2. Americans broadly favor public housing . . . just not in their back yard (or neighborhood).  A new survey from Data for Progress shows remarkably strong public support for the idea of putting more federal tax dollars into building or subsidizing the construction of affordable housing.

That’s encouraging news, given that today’s principal low income housing program, housing choice vouchers, reaches only about one in five eligible households.  The Biden Administration is proposing ramping up funding for vouchers, but whether we get more affordable housing, especially in desirable opportunity neighborhoods, still hinges on whether local land use controls allow its economical construction.  Apartment bans and minimum lot sizes make affordable housing uneconomical in many US neighborhoods, and abstract support quickly gives way to predictable and vociferous opposition any time a specific project would change this pattern.

3.  Rights of Way.  Houston Architect Christof Spieler has a thoughtful essay challenging the car-centric practices that dominate transportation policy in his (and other) cities.  Texas metro areas, from Houston, to Dallas, to Austin, continue to be caught in an endless cycle of roadbuilding: projecting that traffic will increase in the future, building more roads to accommodate it, and then finding that traffic increases still further.

The widening of I-35 in Austin is an assertion that traffic will inevitably continue to increase, and that the only way to handle that is to add traffic lanes. This is a direct projection from the past, a continuation of a decades-long strategy used as Austin grew from 130,000 people in the 1950s to 978,000 today. I-35 was built in the 1960s along the old East Avenue, a six-lane road. The original highway had four lanes, two in each direction. . . . The distance the average resident of the Austin region drives increased for decades. TxDOT sees a freeway where traffic travels at 15 mph at rush hour and concludes that the only reasonable response is to add lanes. The question this workshop poses is what the best way is to add those four new lanes, and how to minimize the impacts those lanes will inevitably have on the city.  However, those additional highways have been as much a cause of additional driving as they have been a result of it.

Ultimately, Spieler argues, the answers to the questions about what Texas cities (and others) should do don’t emerge from travel demand models that simply call for repeating past mistakes; instead, they require us to think about what kind of places we want to inhabit in the future.

New Knowledge

How close is “close” in local labor markets?  There’s a widespread, but largely naive belief in economic development circles that in order to help the unemployed, you’ve got to create jobs in or very near the neighborhoods in which they live. On some level, the notion is appealing:  it ought to be easier to find a job if its just down the street or at least a short commute away.
But a new study from one of the economic development field’s most respected scholars, Upjohn Institute’s Tim Bartik, takes a very close look at that question and comes up with an answer that, on its face, seems to debunk the “hyperlocal”  view of job accessibility.  The scholarly variant of the “local jobs” notion is called the “job mismatch” hypothesis, the idea that low income populations or dis-favored groups live in neighborhoods that are far from job centers, and that this lack of propinquity causes their higher levels of unemployment.
At the other end of the spectrum from the  “hyper-local” view is the “regional labor market” view–the notion that people’s employment prospects are most heavily influenced by the overall level of job creation in an entire labor market.  Labor markets are usually defined as multi-county metropolitan areas, and a key factor in drawing metro boundaries is data on worker commuting patterns.  The key idea here is that a healthy regional labor market produces, both directly and indirectly, more job opportunities for everyone, regardless of whether they happen to be in or near one’s own neighborhood.
Bartik’s research finds strong support for the “regional labor market” view, concluding that the health of the regional labor market has a lot more to do with the unemployed getting jobs than with the growth (or lack of growth) of jobs nearby.  Bartik uses a geography called “commuting zones”—slightly larger than metro areas—and finds that job growth in these areas is more important in explaining increased employment than neighborhood level changes.  Within a metropolitan area (or commuting zone), job creation tends to benefit residents of more-distressed counties; targeted job creation in distressed counties has only modestly more benefits than job creation throughout the metro area (or commuting zone).

Bartik’s paper complements other recent urban economics research that challenges the geographically reductionist view of labor markets.  Michael Lens and his co-authors found no correlation between proximity to jobs and earnings gains for low income households.  Raj Chetty and his colleagues found little evidence of a link between proximity to nearby jobs and lifetime earnings.

Major policy initiatives, like Opportunity Zones, have a kind of “propinquity” model of opportunity: the notion that having jobs nearby, ideally in the same neighborhood, is the key to helping people move out of poverty.  These studies cast serious doubt on this simplistic notion.

Timothy J. Bartik, “How Long-Run Effects of Local Demand Shocks on Employment Rates Vary with Local Labor Market Distress,” W.E. Upjohn Institute for Employment Research, Upjohn Institute Working Paper 21-339, January 21, 2021, DOI: 10.17848/wp21-339

In the News

Willamette Week quotes City Observatory’s Joe Cortright on Portland’s proposed carbon tax for a handful of large polluters—one that unfortunately exempts the biggest sources of greenhouse gases.

 

The Week Observed, February 5, 2021

What City Observatory this week

1. Calculating induced travel. Widening freeways to reduce traffic congestion in dense urban areas inevitably fails because of the scientifically demonstrated problem of induced demand; something so common and well-documented it’s called the “fundamental law of road congestion.” Experts at the UC Davis National Center for Sustainable Transportation have developed an “induced demand calculator” for California metro areas, and with their assistance, we’ve calibrated it for computing increased travel from capacity expansion in the Portland metro area.

The calculator shows that widening I-5 at Portland’s Rose Quarter would add between 17.8 and 34.6 million additional miles of vehicle travel and 7.8 to 15.6 thousand tons of greenhouse gases per year.  This calculation, based on an independent, peer-reviewed methodology, disproves false claims made in the Oregon Department of Transportation’s environmental assessment that the project will have no effect on greenhouse gases.
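The basic arithmetic behind an estimate like this is straightforward to reproduce. Here's a minimal sketch that converts the calculator's induced-VMT range into tons of greenhouse gases; the emissions factor of 0.45 kg of CO2 per vehicle mile is an illustrative assumption on our part, not the calculator's actual parameter:

```python
# Rough sketch: convert induced vehicle miles traveled (VMT) into metric
# tons of CO2 per year. The VMT figures come from the calculator's output
# range; the per-mile emissions factor is an illustrative assumption.

def induced_ghg_tons(added_vmt_per_year: float, kg_co2_per_mile: float = 0.45) -> float:
    """Metric tons of CO2 per year implied by added annual vehicle miles."""
    return added_vmt_per_year * kg_co2_per_mile / 1000.0  # kg -> metric tons

low = induced_ghg_tons(17.8e6)   # low end of the induced-VMT estimate
high = induced_ghg_tons(34.6e6)  # high end
print(f"{low:,.0f} to {high:,.0f} tons CO2 per year")  # 8,010 to 15,570
```

With that assumed factor, the result lands in the same ballpark as the 7.8 to 15.6 thousand tons cited above, which is the point: the GHG impact scales directly with the induced driving.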

2. America’s K-shaped housing market.  By now, you’ve probably heard our current economic downturn described as a K-shaped recession. The “K” refers to the diverging fortunes of low-wage and high-wage workers. Layoffs have been disproportionately concentrated among the lowest paid quarter of the workforce, where employment is down more than 20 percent; while employment among high income workers has actually increased. That’s reflected in the housing market, where rents are falling, while home prices are soaring.

The same dynamic is at work:  two-thirds of low-wage workers rent; while nearly 80 percent of high wage workers are homeowners (or homebuyers). The peculiar, and very different shape of the Covid-19 recession explains much of what’s happening in US housing markets.

3. Again, it’s Groundhog Day, again.  If it seems like we’re stuck in a loop when it comes to climate change, it’s because we are.  The latest report of the Oregon Global Warming Commission is out, and it shows just what similar reports showed 2 and 4 years ago:  despite our stated goals of reducing greenhouse gases, we’re not anywhere close to being on track.  The main culprit:  increased emissions from driving, which according to the independent DARTE database have increased by about 1,000 pounds per person in the past five years.

We’re sure that this year’s report will stimulate another round of solemn declarations about the gravity of the climate emergency, but at this rate, we’ll be doing the Groundhog doom-loop for years to come as the planet burns around us.

Must read

1. Induced Travel from Parking.  The idea that added freeway capacity generates additional travel is well-established.  A new study shows that building more parking also induces more car travel.  Sightline’s Michael Andersen highlights the study, which used a San Francisco housing lottery to filter out what researchers call “selection effects”—the idea that maybe people who tend to drive less anyhow gravitate toward places with fewer parking spots.  Because people were randomly assigned to different housing, the observed differences in driving can’t be attributed to this sorting effect.  And the study shows that indeed, people who lived in places with fewer parking spaces tended to own fewer cars and to drive less.

To economists, this makes perfect sense:  anything that lowers the cost of driving or car ownership tends to encourage more driving.  More space on roadways, more plentiful (and almost always free) parking spaces, low gasoline prices, and free use of the atmosphere as a dump for carbon and other pollutants, all subsidize car use and result in more miles driven.  The study is powerful evidence for the climate benefits of policies that eliminate parking requirements, and which more generally reduce the supply of parking.

2. Don’t Block Gentrification:  Use it to improve the neighborhood for all.  Pete Saunders, who blogs at Corner Side Yard, has a provocative essay at Bloomberg Opinion.  As Saunders notes, gentrification is actually surprisingly rare; for the most part, in large US metro areas, rich neighborhoods are getting richer and poor neighborhoods are getting poorer.  While the few places that are transitioning between the two are the flashpoints for discontent, we should view them as finally providing the kind of investment many urban neighborhoods need to overcome poverty; as Saunders says:

Gentrification should be a chance to expand opportunity, not diminish it.

The key to this is having a process for encouraging engagement between long-time residents and newcomers, identifying and bolstering a community’s distinctive assets, and connecting long-time residents to expanding housing and job opportunities.  And, as he warns, just trying to block change wastes an incredible opportunity to utilize the investment in cities to provide wider benefits to those who’ve been cut off:

Plenty of cities have been starved for decades of revitalizing investment; they would benefit just as much from a concerted effort to increase opportunity and affordable housing, while preserving key institutions. One thing is certain: If we transition away from the back-to-the-city movement of the last 30 years without making more of our cities better, we’ll be worse for it.

3. The dominance of the SUV.  One of the most important indicators for the future trajectory of greenhouse gas emissions is the size and weight of private vehicles.  Between 2004 and 2014, when gas prices were rising or high, consumers responded by buying more passenger vehicles, which are generally smaller, more fuel efficient and less polluting than light trucks or sport utility vehicles (SUVs).  But since gas prices plunged in 2014, the market share of these light trucks and SUVs has steadily increased.  The latest data, tabulated by Calculated Risk’s Bill McBride, makes this clear:

Back in 2008 and 2009, most of the vehicles sold were passenger cars.  Now the market share of SUVs and light trucks is quickly approaching 80 percent.  Many state greenhouse gas reduction plans were founded on the assumption that the passenger car share would increase, which it hasn’t. Moreover, because vehicles last 15 years or more, the heavy, polluting vehicles being purchased today will still be on the road in the 2030s, as the climate crisis becomes increasingly dire.

New Knowledge

Who really pays property taxes?  More than 100 million Americans rent their homes, and their landlords use a portion of the rent they pay to pay property taxes.  One of the key questions in thinking about local public finance is how much of the cost of local property taxes is borne by renters (i.e. simply passed on to renters in the form of higher rents) as opposed to being borne by landlords (in the form of lower profits).  The answer to this question has a lot to say about whether the property tax is regressive or not:  If renters bear all or most of the cost of property taxes, the tax tends to be regressive, because renters are generally lower income than the rest of the population.  Landlords are, on average, better off, so if they bear the cost of property taxes, that suggests that the tax is less regressive (and might even be proportional to income or progressive).
A new study from David J. Schwegman of American University and John Yinger of Syracuse University aims to answer that question using detailed micro-data from rental housing in Buffalo, New York City and Rochester.  The study’s key finding is that it is landlords, rather than tenants, who bear most of the cost of property taxes.  While many studies assume that the tax load is split equally between renters and landlords, the Schwegman/Yinger study suggests most of the burden of the property tax is borne by landlords:
. . . our results suggest that the owners of rental units, who are more likely to be higher-income individuals, bear the majority of a property tax increase. Thus, the property tax is, in fact, more likely to be a proportional tax over the income distribution than previously evaluated.
This in turn suggests that the portion of rent that goes to pay property taxes is less than generally assumed.  On average, most states assume that about 18 percent of rent paid by renters goes to property taxes.  The Schwegman/Yinger estimates suggest that the real figure is much lower, around 9 percent.  This is important because several states have renter tax relief programs that are based on the higher estimates.
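To see what's at stake for those relief programs, consider a hypothetical renter paying $1,000 a month. The sketch below simply applies the two property-tax shares discussed above; the rent figure is made up for illustration:

```python
# Hypothetical illustration: how the assumed property-tax share of rent
# changes the imputed tax a renter relief program would use.

def property_tax_paid(annual_rent: float, tax_share: float) -> float:
    """Portion of annual rent imputed as property tax under a share assumption."""
    return annual_rent * tax_share

annual_rent = 12 * 1_000  # $1,000/month, a hypothetical renter
conventional = property_tax_paid(annual_rent, 0.18)  # common state assumption
study_based = property_tax_paid(annual_rent, 0.09)   # Schwegman/Yinger estimate

print(f"{conventional:.2f}")  # 2160.00
print(f"{study_based:.2f}")   # 1080.00
```

Halving the assumed share halves the imputed tax, which is why the estimate matters so much for programs keyed to it.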
David J. Schwegman and John Yinger, “The Shifting of the Property Tax on Urban Renters: Evidence from New York State’s Homestead Tax Option,” CES 20-43, December 2020

In the News

Strong Towns republished our critique of self-styled “pedestrian infrastructure” in Houston which mostly prioritizes car travel and effectively perpetuates and worsens a  hazardous pedestrian hostile environment.

 

The Week Observed, February 12, 2021

What City Observatory this week

1.  How housing segregation reduces Black wealth.  Black-owned homes are valued at a discount to all housing, but the disparity is worst in highly segregated metro areas.  There’s a strong correlation between metropolitan segregation and black-white housing wealth disparities. Black-owned homes in less segregated metro areas suffer a much smaller value reduction than Black-owned homes in highly segregated metro areas.  The value penalty suffered by Black homeowners is greatest in hyper-segregated metro areas, shown in the lower right hand corner of this chart.

Half a century after the passage of federal fair housing legislation, the persistent segregation of many US metro areas is still imposing a financial hardship on Black households. More progress in racial integration is likely a key to reducing Black-white wealth disparities.

2. Disobeying the Governor on congestion pricing.  The Oregon Department of Transportation has specifically disobeyed an order from Governor Kate Brown to take a hard look at congestion pricing before deciding on the course of its environmental review for the proposed $800 million I-5 Rose Quarter freeway widening project. In December 2019, the Governor directed her transportation agency to carefully consider how congestion pricing on I-5—mandated by the Legislature in 2017—would affect the need for the freeway widening project. Separate ODOT studies showed that pricing could eliminate the need to widen the freeway, but ODOT decided to move ahead without including any analysis of pricing in the project’s Environmental Assessment.

Must read

1. Zillow prediction for 2021:  A city rebound.  In the early days of the Covid-19 pandemic, we heard dire predictions that cities were doomed, first out of a mistaken fear that urban density spread the virus, and then based on the assumption that work-at-home would undercut downtown employment.  As we’ve pointed out at City Observatory, rents have softened relative to home prices, and some of the biggest declines have been in superstar cities, like San Francisco and New York.  But as we begin to turn the corner on the pandemic, there are signs that this process will reverse.  The analysts at Zillow are bullish about a city rental rebound in 2021, fueled by young adults coming to cities.

In 2021, those that may have left cities temporarily during the pandemic will likely return as a vaccine becomes more widely available and local economies begin to open up again. Young adults moved back in with their parents at much higher rates this year than last, with nearly 2 million 18 to 25 year olds still living at home in August. The majority of this age cohort tend to be renters and 46% of Gen Z renters tend to rent in urban areas, suggesting that when young people are ready to strike out again they will return to amenity-rich cities.

2. Where New York City is gaining and losing housing.  A new report from the New York City Planning Department provides a clear picture of where the city is gaining–and losing–housing units.  Since 2010, New York has added a net total of about 200,000 housing units, but growth has been concentrated in just a few areas in the city.  And, strikingly, some neighborhoods have actually seen their housing stock shrink.

Overall, the city’s growth has occurred primarily on the west side of Manhattan and in relatively close-in neighborhoods in Brooklyn and Queens.  There’s been very little growth in much of the city.  A third of the city’s growth is in just three areas, all of them former industrial or commercial zones that have been redeveloped for high density residential use, including, conspicuously, Hudson Yards.

Perhaps the most surprising finding in the report is that neighborhoods on the Upper East Side and Upper West Side are actually experiencing a net decline in housing units.  This is mostly due to remodeling projects that join two (or more) previously independent apartments or houses into a single much larger home.  It’s a reminder that when prices are high and it’s difficult to build new homes, pressure spills over into the existing housing stock, making it both scarcer and more expensive.  New York’s ten year retrospective on the change in its housing stock is an invaluable baseline for thinking about urban housing issues.  Does your city have a report like this?

3.  A wonder down under:  Falling rents and home prices in Sydney.  Sydney, Australia is one of the world’s most beautiful and expensive cities.  But in the past couple of years, thanks to local policies that make it somewhat easier to build denser housing, both home prices and rents have come down.  A new essay from Anya Martin describes what happened.  Long concerned about housing affordability problems, the state government of New South Wales set up a regional planning commission to assign housing production targets to local governments in and around Sydney.

As with California’s similar “Regional Housing Needs Allocation” (RHNA) there was a lot of pushback from local governments, especially in higher income suburbs, but in the end, the policy had the effect of facilitating higher densities.  According to Martin, the number of new housing units permitted in the Greater Sydney area rose from an average of 20,000 per year prior to the change, to about 36,000 per year over the past five years.  And, in turn, that’s led to a decline in rents and home prices:

From a peak of A$550 per week in 2015 (£310, US $420), the median unit rent had fallen a substantial 7.3% by December 2019. House prices fell from a median of A$1,075,000 in December 2017 to A$865,000 a year later.

Sydney housing is still expensive, by any standard, but the progress made so far shows that expanding the supply of housing with policies that make it easier to build to higher densities represents movement in the right direction.
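The figures in the quote above are easy to check. A quick computation using only the numbers from the quote (nothing else is assumed):

```python
# Quick check of the Sydney figures quoted above.

def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new (negative means a decline)."""
    return (new - old) / old * 100

# House prices: A$1,075,000 (Dec 2017) to A$865,000 a year later.
house = pct_change(1_075_000, 865_000)
print(round(house, 1))  # -19.5

# Median weekly unit rent implied by a 7.3% fall from the A$550 peak:
rent = 550 * (1 - 0.073)
print(f"{rent:.2f}")  # 509.85
```

A nearly 20 percent drop in median house prices in a single year is a substantial move for a market as expensive as Sydney's, which is why the supply-side story draws so much attention.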

New Knowledge

How fighting climate change can improve human health. It turns out that reducing greenhouse gas emissions isn’t just good for the planet, it’s good for your health, too.  The kinds of measures we need to take to reduce carbon pollution (driving less, walking and biking more, eating less meat and more plant-based food) also produce significant health benefits.  How significant?  A new study published in the Lancet evaluates the health benefits of alternate greenhouse gas reduction strategies compared to the current trajectory of emissions in several major countries.
The authors conclude that pursuing the goals set in the Paris accords would produce substantial health benefits, and that these could be amplified if climate policies explicitly embraced health-related objectives.  The study compares a “business as usual” current pathways scenario with two alternatives:  a “sustainability” scenario (SPS) aimed at emissions reductions alone, and a health-focused scenario (HPS) that emphasizes interventions with the greatest health effects.  As the authors conclude:
Compared with the current pathways scenario, the sustainable pathways scenario resulted in an annual reduction of 1.18 million air pollution-related deaths, 5.86 million diet-related deaths, and 1.15 million deaths due to physical inactivity, across the nine countries, by 2040. Adopting the more ambitious health in all climate policies scenario would result in a further reduction of 462 000 annual deaths attributable to air pollution, 572 000 annual deaths attributable to diet, and 943 000 annual deaths attributable to physical inactivity. These benefits were attributable to the mitigation of direct greenhouse gas emissions and the commensurate actions that reduce exposure to harmful pollutants, as well as improved diets and safe physical activity.
The improvements in mortality are a product both of less pollution (including greenhouse gases and other pollutants like particulates, which are reduced in tandem with GHGs) and of the beneficial health effects of better diet and exercise that follow from measures to reduce car travel and carbon-intensive foods.  The report estimates population-adjusted numbers of deaths avoided for nine countries.
Ian Hamilton, Harry Kennard, Alice McGushin, et al, “The public health implications of the Paris Agreement: a modelling study,” The Lancet Planetary Health, February 2021, DOI: https://doi.org/10.1016/S2542-5196(20)30249-7

In the News

WBFO in Buffalo cited our ranking of cities by level of housing segregation in its report, “A history of Redlining: housing group calls for more equitable mortgage lending practices.”

The Week Observed, February 19, 2021

What City Observatory this week

1. Covid migration:  Disproportionately young, economically stressed and people of color.  Data shows the moves prompted by Covid-19 are more reflective of economic distress among the vulnerable than a reordering of the urban location preferences of older professionals.  A new survey from the Pew Research Center shines a bright light on the actual volume and motivation of migration in the pandemic era.  We highlight some of the findings.  Notably, there’s relatively little migration in the wake of Covid-19.  Most Covid-related migration is temporary, involves moving in with friends or relatives, and doesn’t involve leaving a metro area.  It’s also not professionals fleeing cities:  Covid-related movers tend to be young (many are students), and are prompted by economic distress.

 

Pew’s data show that fewer than one in seven moves attributable to Covid-19 are “permanent” and that movers tend to be disproportionately persons of color.

2.  Guest contributor Garlynn Woodsong returns with a follow-on commentary suggesting that an equitable carbon fee and dividend should be set at the price level necessary to achieve GHG reduction goals, with the kicker payment set so that 70 percent of people come out ahead, or at least break even, after paying the carbon tax.  There’s an important lesson from our response to the pandemic:  In the face of a shared crisis, it makes sense to provide generous financial support for those who have to adjust their lives and livelihoods in order to tackle a big problem.  A carbon dividend, financed by the proceeds of a carbon emissions fee, would provide the wherewithal to help most people make needed changes to their carbon footprint, and would offset the financial cost of the transition for those who couldn’t easily or quickly change.

Must read

1.  To meet climate goals, think outside the electric car.  Writing at Bloomberg CityLab, Transportation for America’s Beth Osborne and Rocky Mountain Institute’s Ben Holland point out the limits of assuming that the problems of our auto-dominated transportation system can be solved solely by electrifying cars.  Cleaner cars will help, but it will take decades to replace the existing fleet, and the larger the number of cars, the harder it is to achieve our shared climate goals.

The truth is very simple: If we continue to design our communities and transportation systems to require more driving alone, even if it’s in an electric car, it makes decarbonization far harder. According to Rocky Mountain Institute’s analysis, the U.S. transportation sector needs to reduce carbon emissions 43 percent by 2030 in order to align with 1.5° C climate goals — requiring that we put 70 million EVs on the road and reduce per-capita vehicle miles traveled (VMT) by 20 percent in the next nine years. Even under the most ambitious EV adoption scenarios, we must still reduce driving.

2.  People who leave San Francisco don’t go far.  There’s a lot of hand-wringing that the rise of work-at-home means the death of San Francisco.  While anecdotes abound of CEOs and handfuls of professionals leaving the Bay Area for Texas or Florida, there’s been a lack of hard data on the subject.  The San Francisco Chronicle reports that US Postal Service change-of-address filings show that out-migration from the City of San Francisco, while up in 2020, is mostly to other parts of the Bay Area and California.

City economist Ted Egan reports that migration patterns follow a well-worn path, with most movement to nearby suburbs. Few people are moving out of state, and according to the Chronicle, Egan argues that the fact that movers reside nearby could represent a post-pandemic “silver lining” for San Francisco:

“You are not going to have to worry about getting them to move back from Boise.  It looks more like normal pre-COVID migration flows.  People are settling in to nearby Bay Area suburbs.  They are going to Sacramento and L.A. . . . Austin, Texas is way down the list, Portland is way down the list. New York is way down the list.”

3.  How highways make traffic worse.  This one is more of a “must watch”: Vox’s video explainer of why widening urban highways doesn’t reduce congestion.  If you’re looking for a simple, graphic explanation of how induced demand works, you’ll want to watch—and share—this video.  The video recites one of our favorite stories in this vein, the multi-billion dollar widening of Houston’s Katy Freeway, now North America’s widest—which spectacularly failed to reduce congestion.  Within just a few years of the widening project being completed, commute times in this corridor were longer than ever.  The freeway simply served to make the region even more sprawling, car-dependent and climate wrecking than before.

 

New Knowledge

A flowering of research on local housing supply and rents.  In the past year or so, there have been a number of new studies on the very localized effects of new housing construction.  These are important because they shed light on how building more housing affects rents in nearby buildings, and whether and to what extent new construction results in, or reduces, housing displacement.  We’ve profiled several of these studies at City Observatory, and helpfully a new publication from the UCLA Lewis Center has a careful, largely non-technical review that compares and contrasts their findings, and what they mean for housing policy debates.

This synopsis of six studies finds a consensus that the construction of market-rate housing tends to make neighborhood housing more affordable than it would otherwise be.  Five of the six studies conclude that market-rate housing makes nearby housing more affordable at all levels of the rental market; the sixth study finds mixed results, with the affordability benefits confined to lower rents for higher-end apartments.

In their conclusion, Phillips, Manville and Lens note that a principal reason for expanding market rate development in low income neighborhoods is the land use restrictions that effectively preclude more development in higher income areas.  Because single family zoning and tortuous development approval processes make it impossible to build more market rate housing in established high income areas, the demand for these units spills over onto some low income communities, raising the affordability and gentrification concerns that fuel public debate:

. . . this whole discussion — of what happens when new development arrives in a neighborhood where many lower-income people live — could be largely avoided if we built new housing mostly in higher-income, higher-resourced communities. Development in more affluent places, where fewer residents are precariously housed, could allow more people access to opportunities and alleviate demand pressures elsewhere in a region. But such development rarely happens now, because zoning prevents it.

Tragically, it seems, a frequent policy prescription is to make low income neighborhoods as potent in blocking new development as their high income peers, a kind of NIMBY-for-all solution that’s only likely to worsen price pressures and displacement across an entire market.

Shane Phillips, Michael Manville &amp; Michael Lens, Research Roundup: The Effect of Market-Rate Development on Neighborhood Rents, UCLA Lewis Center for Regional Policy Studies, February 17, 2021

In the News

UrbanTurf, a Washington, DC-based real estate information site pointed its readers to our analysis showing the connection between city segregation and the Black-white wealth gap.

 

The Week Observed, February 26, 2021

What City Observatory this week

1. Revealed: Oregon Department of Transportation’s secret plans for a ten-lane I-5 freeway at the Rose Quarter.  For years, ODOT has been claiming that its $800 million freeway widening project is just a minor tweak that will add two so-called “auxiliary” lanes to the I-5 freeway.  City Observatory has obtained three previously undisclosed files showing that ODOT is planning for a 160-foot-wide roadway at Broadway-Weidler, more than enough for a ten-lane freeway with full urban shoulders.

ODOT has failed to analyze the traffic, environmental and health impacts from an expansion to ten lanes; not disclosing these reasonably foreseeable impacts is a violation of the National Environmental Policy Act (NEPA).

2. The cost to Oregon of reviving the failed Columbia River Crossing project just went up by $150 million.  Buried in an Oregon Department of Transportation presentation earlier this month is an acknowledgement that the I-5 bridge replacement “contribution” from Oregon will be as much as $1 billion—up from a maximum of $850 million just two months earlier.

Must read

1. xx

2. Cities, Covid and the Fog of War.  City Observatory friend David Gisburg and his colleague Dave Smith, both Cincinnati-based executive recruiters, have been interviewing city leaders around the country on the lessons to be learned from the Covid-19 pandemic.  They’ve summarized the results of more than 100 interviews in an essay entitled “Fog of War.”  It’s an apt metaphor:  In the midst of the pandemic, we’ve all had to deal with dire and largely unanticipated problems, often with incomplete, misleading or misinterpreted information.  In the face of this danger and uncertainty, how should urban leaders respond?  Their summary, Downtown Leaders – 2020 Lessons Learned: The Pandemic, Civil Unrest and Future Impact, addresses several critical urban topics:  how downtown and center-city leaders responded to the exceptional challenges presented during the year of Covid; the unique value downtown organizations provided in meeting critical needs during a pandemic while driving future visioning; an overview of downtown challenges and opportunities for 2021 and beyond; and the leadership lessons learned and “talent” required to meet future challenges.  The report draws on more than 100 Zoom conversations with downtown CEOs, industry consultants, philanthropists, and market influencers between August 2020 and January 2021, explores 9 key issues, and addresses the “headwinds” and “tailwinds” faced by downtown leaders as they navigated the swirling urban landscape created by Covid-19.

3.  Bellevue thinks another freeway interchange will solve its traffic problems.  Portland isn’t the only city that thinks congestion can be solved by throwing more money at antiquated freeway solutions.  The fast-growing suburb east of Seattle hopes that by adding more interchanges to I-405 near the city’s downtown it can lessen congestion.  But, as The Urbanist points out, this kind of solution invariably makes the problem worse.


 

New Knowledge

Lived Segregation.  Most of our understanding of the nature of urban residential segregation comes from Census data on housing, and looks at the extent to which people from different income or racial/ethnic groups live in different neighborhoods.  That’s clearly important, but it leaves out how much we mix as we move around neighborhoods and cities on a daily basis (or at least the way we used to, in a pre-pandemic world).  As with so many things, the mass of big data created by our electronic connections creates a new source of information about patterns of segregation.  A new study from Brown University sociologist Candipan and her colleagues uses geolocated Twitter data to study patterns of movement in 50 large US cities.
They’ve used this data to measure the extent to which people from predominantly Black neighborhoods travel to, and spend time in, neighborhoods with different racial and ethnic compositions.  As a summary measure, they’ve created a new “mobility segregation index” which identifies the extent to which people of different races tend to travel only in neighborhoods like their own.
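The study's actual index is more sophisticated, but a toy version conveys the basic idea: for each group of home neighborhoods, compute the share of observed trips that stay within neighborhoods dominated by that same group. Everything below, the trip counts and the formula alike, is our illustrative sketch, not the study's method:

```python
# Toy "mobility segregation" measure: for each home-neighborhood group,
# what share of observed trips land in neighborhoods dominated by the
# same group? All data and the formula are illustrative assumptions,
# not taken from the Candipan study.

def own_group_visit_share(visits: dict) -> dict:
    """visits[home_group][destination_group] -> trip count.

    Returns, for each home group, the share of its trips that stay
    within neighborhoods of the same group (higher = more segregated
    daily mobility).
    """
    shares = {}
    for home, dests in visits.items():
        total = sum(dests.values())
        shares[home] = dests.get(home, 0) / total
    return shares

# Hypothetical trip counts between two neighborhood types in one city:
trips = {
    "A": {"A": 800, "B": 200},
    "B": {"A": 300, "B": 700},
}

print(own_group_visit_share(trips))  # {'A': 0.8, 'B': 0.7}
```

Under even mixing these shares would approach each group's overall footprint in the city, so the gap between the observed share and that baseline is one simple way to summarize "lived" segregation.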

In the News

Willamette Week quoted City Observatory’s Joe Cortright in its story revealing the previously undisclosed plans by the Oregon Department of Transportation to widen I-5 to ten lanes at the Rose Quarter in Portland.

Streetsblog California pointed its readers to our customized Portland version of the Induced Demand calculator.

The Niskanen Center’s newsletter, This Week in Land Use Regulation highlighted our commentary on the connection between segregation and racial housing value disparities.

 

 

The Week Observed, April 2, 2021

What City Observatory this week

1. How the Oregon Department of Transportation destroyed a Portland neighborhood, Part 2:  The Moses Meat Axe.  We continue our historical look at the role that freeway construction (and the traffic it brought) played in destroying Portland’s Albina neighborhood.  Our story began in the early 1950s with the construction of a waterfront highway, and reached a crescendo in 1962, when the construction of Interstate 5 wiped out much of the neighborhood’s housing stock, leveled dozens of blocks, and forever transformed the area from a primarily residential neighborhood to a car-dominated landscape.

We’ve got a detailed before-and-after look at how the construction of I-5 wiped out a large part of this historically Black neighborhood. It wasn’t just the road’s right of way, but the flood of car traffic that the freeway brought to the area.  As is often said, past is prologue.  Today the Oregon Department of Transportation is proposing to spend $800 million to further widen the gash that I-5 cut through this neighborhood.  It’s a reasonable question to ask why this historic error is being repeated.

2.  The Cappuccino Congestion Index.  Media reports regularly regurgitate the largely phony claims about how traffic congestion costs travelers untold billions of dollars in wasted time. To illustrate how misleading these fictitious numbers are, we’ve used the same methodology and actual data to compute the value of time lost standing in line waiting to get coffee from your local barista. Just like roadways, your coffee shop is subject to peak demand, and when everyone else wants their caffeine fix at the same time, you can expect to queue up for yours.

Just as Starbucks and its local competitors don’t find it economical to expand their retail footprint and hire enough staff so that wait times go to zero (your coffee would be too expensive or their business would be unprofitable) it makes no sense to try to build enough roads so that there’s no delay. Ponder that the next time you’re waiting for your doppio macchiato.
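For the curious, the “cost of delay” arithmetic behind both the congestion studies and our coffee-line parody is straightforward: multiply hours spent waiting by a dollar value of time, then scale up. Here’s a minimal sketch; every number in it (customers per shop, average wait, value of time, shop count) is an illustrative assumption, not a figure from our original analysis.

```python
# Back-of-envelope "congestion cost" of coffee queues, using the same
# hours-of-delay x value-of-time method as roadway congestion studies.
# All inputs below are assumed, illustrative values.

DAILY_CUSTOMERS = 500          # assumed customers per shop per day
AVG_WAIT_MINUTES = 2.5         # assumed average time spent in line
VALUE_OF_TIME_PER_HOUR = 17.0  # $/hour, a typical travel-time valuation
SHOPS = 30_000                 # assumed number of coffee shops nationwide
DAYS_PER_YEAR = 365

def annual_queue_cost(customers, wait_min, value_of_time, shops, days):
    """Annual dollar 'cost' of time spent waiting in line."""
    hours_per_shop_per_day = customers * wait_min / 60
    return hours_per_shop_per_day * value_of_time * shops * days

cost = annual_queue_cost(DAILY_CUSTOMERS, AVG_WAIT_MINUTES,
                         VALUE_OF_TIME_PER_HOUR, SHOPS, DAYS_PER_YEAR)
print(f"Estimated annual cost of coffee queues: ${cost / 1e9:.1f} billion")
```

With these made-up inputs the method spits out a multi-billion-dollar annual “loss,” which is exactly the point: the alarming headline number is an artifact of the methodology, not evidence of a crisis.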

Must read

1. What’s really needed in an infrastructure bill:  Better policy.  Transportation for America’s Beth Osborne has some succinct advice about what needs to be in the vaunted trillion-dollar-plus infrastructure package:  Fixing America’s badly broken transportation policy, which squanders resources while making environmental, social and transportation problems worse.  Simply doing (and building) more of the same has led to more driving, more pollution, and more deaths of vulnerable road users.  Before we throw more money at this flawed system, we need to fix it:

This time around, Americans want an infrastructure package that addresses economic recovery through job creation; rebuilds crumbling roads, bridges, and transit systems; and reduces climate emissions and racial inequities. But our existing federal transportation programs aren’t built to achieve these outcomes—no matter how much more money is pumped into them. In fact, they often produce the opposite result: building new infrastructure we can’t afford to maintain, driving up emissions and creating barriers to people of color trying to get to work and essential services.

2. Patrick Sharkey on the causes of the 2020 Crime Wave.  After decades of long-term decline in urban crime rates, many cities experienced a wave of increased crime and violence in 2020.  The Atlantic’s Derek Thompson interviewed Princeton’s Patrick Sharkey, one of the nation’s pre-eminent experts on the subject.  Sharkey points out that any explanation for shifting crime rates is always complex, but in the past year the disruptions to social life wrought by the pandemic (unemployment, and the closure of businesses, civic institutions and schools) all contributed to greater crime and violence in cities across the country, especially in neighborhoods of concentrated poverty.  As Sharkey explains, a pandemic is a kind of perfect storm for crime:

Last year, everyday patterns of life broke down. Schools shut down. Young people were on their own. There was a widespread sense of a crisis and a surge in gun ownership. People stopped making their way to institutions that they know and where they spend their time. That type of destabilization is what creates the conditions for violence to emerge.

Sharkey acknowledges that an increase in policing is one of the factors behind the statistical decline in crime over the past couple of decades, but in his view more stringent policing isn’t a sustainable strategy. The excesses of policing may have reduced crime, but they also planted the seeds of unrest.  In his view, we can’t rely on a model of brute force, punishment and imprisonment in the long run.

3. Todd Litman’s review of Patrick Condon’s polemic, “Sick City.”  Patrick Condon’s new book, Sick City: Disease, Race, Inequality and Urban Land, is, at its heart, a denial that supply and demand have any utility in helping us understand problems of housing affordability and availability.  The estimable Todd Litman of the Victoria Transport Policy Institute has a scathing review of Condon’s book, effectively dismantling most of his key claims.  In his usual methodical and deeply footnoted way, Litman challenges each of Condon’s key propositions, patiently laying out his case. For example, Condon claims that upzoning simply leads to added developer profits rather than augmenting supply and holding down or reducing rents. Litman walks through a series of studies showing the reverse is true.  In the end, he concludes:

Condon does planners, housing advocates, and their clients a disservice by rejecting one of the most effective policies for increasing affordability: upzoning in walkable urban neighborhoods.

New Knowledge

Our new knowledge feature will return next week.

In the News

Todd Litman’s Planetizen review included references to a pair of City Observatory commentaries, one looking at the negative effects of Portland’s inclusionary zoning requirement and another assessing the literature on the role of housing supply increases in reducing displacement.

Strongtowns re-published our analysis of the Massachusetts Department of Transportation’s proposal to build a “diverging diamond” interchange.  While gussied up with gallons of greenwash, the project represents yet another auto-centric re-making of the suburban landscape.

The Week Observed, April 9, 2021

What City Observatory this week

1. How ODOT destroyed Albina:  Part 3, the Phantom Freeway.  Even a freeway that never got built played a key role in demolishing part of Portland’s Albina neighborhood.  In parts 1 and 2 of this series, we showed how construction of state highway 99W in 1951 and Interstate 5 in 1962 destroyed a substantial part of this predominantly Black neighborhood’s housing stock, and propelled its steady population and economic decline.  In Part 3, we look at the portion of this neighborhood flattened to accommodate a freeway that was never completed.  The planned Prescott Freeway would have cut across North and Northeast Portland.  Before it was cancelled, the Oregon Department of Transportation built a third-of-a-mile stretch connected to the east end of the Fremont Bridge in Portland.  The footprint of that freeway demolished most of the houses shown in the red-bounded area on this 1962 aerial photo.

The neighborhood-killing effects of the freeway didn’t end with housing demolition and road construction; the flood of cars created by the freeway fundamentally altered the character of the neighborhood, making the surviving housing less desirable, and promoting a range of car-dependent uses, notably parking lots, that further eroded the neighborhood’s residential viability.

2.  Wholly Moses.  A bill in the Oregon Legislature aims to revive two of the tricks power broker Robert Moses used to ram freeways through New York City decades ago. The bill, HB 3065, would authorize the Oregon Department of Transportation to start a series of freeway widening mega-projects, and also let it pledge tolls, and a wide array of other federal and state transportation revenues to repay bonds used to finance the projects.

The combination of driving stakes and selling bonds could lead to a situation where there’s never any future opportunity for elected legislators to question or rein in these projects. Moses discovered that bond covenants could be used to permanently encumber critical sources of revenue, like tolls.  The provisions of HB 3065, as it is proposed to be amended, would essentially allow the Oregon Department of Transportation to similarly encumber all of Oregon’s future toll revenues to repay a series of vaguely defined freeway expansion projects.  It’s an old stratagem that Oregon legislators should be wise to.

Must read

1. Cities will come back.  Enrico Moretti, author of The New Geography of Jobs, has heard the claims that the Covid pandemic’s proof of concept of work-at-home means the end of cities as we know them.  He isn’t having any of it.  In an interview with Jerusalem Demsas of Vox, Moretti argues that neither the virus nor technology has permanently undone the key reason we gather together in cities:  together, we’re more productive.  Moretti’s work and that of other economists shows that agglomeration economies (economic gains from being close to other people with similar and complementary skills and knowledge) produce greater innovation, productivity and economic growth.  These gains draw people and businesses to cities, and are the driving force powering superstar cities like San Francisco, but they operate in cities of all sizes.  As Moretti concludes:

. . . the economic geography of employment after Covid will look a lot like before Covid.  If you have to show up at the office three or four days a week, you still need to live in the metro area where your office is. The link between place of work and place of residence will be restored and people will flock back to places like the Bay Area or Seattle or New York or Boston for the same reason that they were flocking to these places before Covid.

Also, as we’ve noted at City Observatory, people are drawn to cities not just because of job opportunities, but because of the rich networks of social and consumption opportunities that abound in dense, successful cities. These agglomeration economies in production and consumption are likely to reassert their importance as the pandemic fades.

2. Climate Qualms about Biden’s Infrastructure Bill:  Farhad Manjoo, writing at The New York Times, points out that highway departments are likely to use money targeted for road repairs to expand highway capacity, leading to more greenhouse gas emissions, rather than fewer:

. . . one of every five miles of roadway in America is rated in poor condition — but when given federal money for roads, states often spend a lot of it on expansion rather than repair.  This is counterproductive. New roads are often justified as a way to reduce traffic, but that’s not how traffic works — new and expanded roads tend to encourage more driving, just making congestion worse. New roads also make for more maintenance, adding to the backlog of repairs.

There’s plenty of vitally important work to upgrade America’s transportation system, especially its sparse public transit networks, and deadly pedestrian and bike infrastructure.  The Biden Administration’s policy statements are pointed in the right direction, calling generally for fixing things first, and addressing the historical damage caused to many neighborhoods, but the devil will be in thousands of administrative details.

3.  The freeway battle is TransAtlantic.  At City Observatory, we spend a lot of time looking at the climate effects of freeway expansion proposals in the US, but the issue is global.  In the UK, where the nation has a long-standing official commitment to reduce its greenhouse gas emissions, environmental activists are pushing back against the apparent hypocrisy of the national government’s £27 billion proposal to expand “motorways” in England and Scotland.  The Guardian reports on a legal challenge brought by the group Transport Action Network, arguing that the Ministry for Transport’s environmental assessment understates greenhouse gas emissions from the building spree by a factor of 100, by among other things, failing to account for the embodied emissions resulting from construction, and from induced demand.  As the Guardian points out, two of the UK’s prominent transport experts, have testified against the government’s position.

Jillian Anable, professor of transport and energy at the University of Leeds, testified: “We are not aware of any calculation by the DfT of a cumulative figure for carbon emissions through to 2050, arising from all the road schemes funded by the RIS2 … and covering construction emissions as well as ongoing increases in emissions due to higher vehicle speeds and induced traffic.

Freeway opponents in Portland and in cities around the US are making similar challenges to the environmental analyses developed by state highway departments.  It will be interesting to see how courts on both sides of the Atlantic weigh these arguments.

New Knowledge

The unfolding urban rebound. The analysts at ApartmentList.com have their finger on the pulse of the nation’s rental marketplace, and it’s one of the most important places to look for signs of urban life in a post-pandemic world.  They track rental rates, and importantly, apartment search activity, across and within metro areas throughout the United States.  While it was popular, especially in the early days of the Covid-19 pandemic, to predict an unending urban exodus driven by fears of density, the team at Apartment List was always a bit chary of that hypothesis.

Their latest take on the nation’s rental markets shows an unfolding urban rebound.  Not only were the predictions of urban exodus overblown, but it appears that interest in big cities, and particularly in the denser portions of those cities, never really waned.  In their view, what we’ve seen is really “urban churn” rather than urban exodus.  To be sure, it’s possible that the pandemic (and low interest rates) accelerated the movement of some people out of cities; but people are always moving in and out of cities, and in Apartment List’s view, there are lots more people looking to take the place of those who may have left.

That’s what they’re seeing in the data.  In market after market, interest in apartments in dense central locations (primary cities) is outpacing interest in their surrounding suburbs, with search activity in city neighborhoods up consistently more than a year ago.  They also note that the decline in market rents in some cities, particularly superstars like San Francisco and Seattle, is likely to stimulate even more interest, as some who might not have looked in those cities for affordability reasons are now searching there.  In that way, ApartmentList reasons, the rental rate decline is at least partly self-correcting.  They conclude:

. . . there has been significant speculation about what the pandemic would mean for the future of cities. Predictions of urban exodus felt intuitive, given that social distancing requirements put a pause on many of the activities that make city life vibrant, while at the same time, remote work had freed many Americans from needing to live close to their jobs. Despite this narrative, our search data show that big, dense cities are maintaining their allure even as some residents move away. . . . It is important to keep in mind that these data reflect search activity rather than complete moves, but we see clear evidence that interest in big cities has not waned . . . All in all, the “urban exodus” narrative seems to be one that we can close the door on.

In the News

Bloomberg CityLab included City Observatory’s analysis of the flaws in planning for Portland’s $800 million Rose Quarter Freeway widening project in its story about freeway fights in Houston, Milwaukee and Portland.

Streetsblog cited City Observatory’s reporting on the damage done to the historically Black Albina neighborhood by the construction of multiple state highways; damage that would be aggravated by widening I-5 at the Rose Quarter.

 

The Week Observed, April 16, 2021

What City Observatory this week

1. Taking Tubman:  The Oregon Department of Transportation is planning to widen the Interstate 5 freeway in Portland into the backyard of Harriet Tubman Middle School.  The $800 million widening project doubles down on the historical damage that ODOT highway construction has done to this neighborhood, and literally moves the freeway onto school grounds.  The project also includes erecting two 1,000-foot-long, 22-foot-high “noise walls” (roughly twice the height of the Berlin Wall) between the school and the freeway.  Already, the air at the school is unhealthy–so much so that the Portland school district had to spend $12 million on filtration equipment to make air inside the building clean enough to breathe.

A wider freeway will increase traffic, air pollution, and noise pollution at Tubman and in the surrounding neighborhood.  At an April 9 demonstration, students and neighbors spoke out against the project.

2.  ODOT’s peer review panel backs away from Rose Quarter traffic forecasts.  A key dispute in the battle over the proposed I-5 Rose Quarter freeway widening project is whether it will increase air and noise pollution.  Critics–including City Observatory–have pointed to flaws in the project’s traffic modeling that cause it to overstate traffic and pollution in the “no-build” scenario and to understate induced demand in the build scenario.  These errors, coupled with previously hidden plans showing that ODOT intends a 10-lane freeway, rather than the 6-lane facility it modeled, mean that the project’s environmental assessment is wrong. In attempting to defend its flawed modeling, ODOT has repeatedly pointed to a “peer review” panel it convened in 2020.  But last week, the leader of that peer review panel specifically said that the panel did not look at the accuracy of ODOT’s traffic projections, and can’t vouch for them.  The peer review panel has made it clear that claims that its review “supports” ODOT’s work and “confirms” its findings are simply false.  The panel members were instructed to look only at a small, and as it turns out, derivative question, and simply ignored whether the freeway widening increases traffic.

3. Our real climate and transportation policy is cheap gas.  Mayors and governors like to tout bold climate goals (typically decades in the future), but our real climate policy is expressed by what we’re doing right now.  And across the nation, regardless of the rhetoric, the real climate policy is expressed at the gas pump.  The very low prices that Americans pay for gasoline are the dominant policy decision for both climate and urban transportation.  A quick look back at the past seven years shows that the 40 percent decline in gas prices in 2014 caused a whole range of key climate and transportation policy measures to move rapidly in the wrong direction.  Public transit ridership, which had been increasing, fell abruptly.  Miles driven, which had plateaued, suddenly grew.  Greenhouse gas emissions from transportation and road deaths increased sharply.  And Americans, who had been buying equal numbers of cars and SUVs, decisively shifted to buying about 70 percent SUVs.

As long as we persist in keeping fuel prices low, both by global standards, and relative to the damage done by all this driving, we’ll continue to have frustratingly poor transportation systems and ever worsening climate problems.

Must read

1. Ed Glaeser on tying infrastructure funding to zoning reform.  Part of the Biden Administration’s $1.9 trillion infrastructure package is a grant program that would give local governments money in recognition of efforts to eliminate exclusionary zoning policies, like large lot sizes, apartment bans and parking requirements.  But, according to Glaeser, dean of US urban economists, this carrot is too small to motivate policy changes, especially among the jurisdictions who practice the most egregious and persistent forms of exclusion.  He argues:

. . . a competitive grant program is too weak to overcome the entrenched interests — like the homeowners who control local zoning boards and the wealthy residents of cooperatives who oppose all neighborhood change — that limit building in productive places.  If the president wants to break the country out of its zoning straitjacket, the infrastructure plan should ensure that no benefits go to states that fail to make verifiable progress enabling housing construction in their high-wage, high-opportunity areas.

As we’ve pointed out, there’s a kind of prisoner’s dilemma dynamic to local exclusion:  individual suburbs and neighborhoods have strong incentives not to collaborate, and to shift the burdens to other places.  The trouble with a program of limited incentives is that just those communities that most value their exclusive nature, and least need federal funds (i.e., the wealthiest places), are the ones who will be least motivated to change by this kind of “small carrot” policy.

2.  Community Benefits Agreement in Sacramento.  Big new investments in distressed urban neighborhoods often raise concerns about displacement, and one of the ways to assure that the benefits of new investment are widely shared is a community benefits agreement.  Darrene Hackler lays out the process and details that produced a community benefits agreement for Sacramento’s proposed Innovation District. The University of California Davis is partnering to build a $1 billion research and development center, and the city and developers have agreed to a $50 million fund to provide a range of benefits including affordable housing in the area.  Hackler’s report, published at Smart Incentives, also includes an analysis of community benefit agreements in other cities.

3.  What good is transit if people can’t live nearby?  Sightline’s Michael Andersen reports on legislation pending before the Oregon Legislature to automatically legalize higher density housing near transit service. As Andersen explains,

On land zoned for housing within one-eighth of a mile of a high-capacity transit station, the bill would strike down local bans on buildings that create up to 45 homes per net acre. In other words, it would legalize three-story attached housing within about three blocks of stations.  . . . there’s [also] a provision for greater flexibility: jurisdictions can also opt to keep lower density at some stops and legalize five to six story buildings at other stops, if they prefer, as long as they achieve the same overall density within the 1/8-mile station areas.

It is, as Andersen argues, a modest ask.  But nonetheless, it’s drawn the opposition of NIMBYs and, surprisingly, some transit agencies.  The NIMBY opposition is predictable.  Transit agencies fear that if transit service (like frequent buses or new light rail stops) brings with it a guarantee of higher allowable densities, it will be harder for them to expand service. But as Andersen points out, does it make any sense to spend scarce transit resources improving service to an area that doesn’t have the kind of densities needed to make transit work?

4.  How to repair the damage done by urban freeways.  Usually our Week Observed must read feature highlights insightful views that have already been published, but this week, we’re alerting you to an upcoming event that’s something you might want to attend.

Sponsored by the Third Way and Transportation for America, the webinar is on April 20.  Registration is free.

New Knowledge

Subsidizing single family houses aggravated climate change.  A recent study published in Environmental Science and Technology considers how our national policy of subsidizing single family housing has led to an increase in the nation’s carbon emissions.  A series of federal (and local) policies have tilted the housing market toward single family homes:  the construction of the interstate highway system, subsidies to homeownership, redlining of many urban neighborhoods, and local zoning that greatly restricts multifamily development.  The authors of this paper construct a counterfactual estimate of what US residential greenhouse gas emissions would be if these policies had been more neutral between single- and multi-family housing.

The following chart summarizes their analysis of a series of counterfactual estimates of US residential energy demand based on varying mixes of housing types and sizes.  The two left-hand columns show the mix of housing units (actual and counterfactual); the net effect is changing about 14 million houses built between 1970 and 2015 from single-family to multifamily, about a 14 percent shift in the share of multi-family homes in the housing stock in 2015. The five right-hand columns show the total residential energy consumption associated with a series of alternatives in which housing sizes vary.  The greatest reductions in energy use are associated with a switch to 30 percent or 50 percent smaller units.  Just moving the average household to this mix of multi- and single-family housing, without changing housing unit size (CF 2), reduces per-household residential energy consumption by more than one-fourth from the levels actually recorded.

 

Shifting to more multi-family housing would lower energy consumption, decrease household energy bills and lower greenhouse gas emissions.  As the authors conclude:

Federal housing policy changes have encouraged construction of single-family housing and suppressed multifamily housing, increasing residential energy demand and GHG emissions. Increasing the multifamily share of housing can be expected to produce large energy savings, even with no change of household income or floor area. Policies that suppress demand and restrict supply of multifamily housing thereby directly obstruct a large potential for residential GHG emission reductions. Housing policy can support climate policy by removing barriers and disincentives to multifamily housing.

Finally, if anything, it’s likely that these estimates understate the energy savings and greenhouse gas reductions from increased multi-family housing.  Multi-family housing has, by definition, higher density, and higher density tends to be associated with lower car ownership and fewer vehicle miles traveled per person, so it is likely that there would be savings in transportation-related energy and pollution as well, factors not estimated in this paper.

Peter Berrill, Kenneth T. Gillingham & Edgar G. Hertwich, “Linking Housing Policy, Housing Typology, and Residential Energy Demand in the United States,” Environ. Sci. Technol. 2021, 55, 4, 2224–2233.  Publication Date: January 28, 2021.
https://doi.org/10.1021/acs.est.0c05696

In the News

Bike Portland featured Joe Cortright’s remarks at the No More Freeways rally at Harriet Tubman Middle School.

 

The Week Observed, April 30, 2021

What City Observatory this week

1. Restorative justice without funding is a sham.  Portland’s Albina neighborhood was decimated by the construction of three Oregon Department of Transportation highway projects in the 1950s, 1960s and 1970s, causing the neighborhood’s population to drop by more than 60 percent.  Part of the marketing pitch for the current effort to spend $800 million to widen the Interstate 5 freeway through the neighborhood is the claim that it will contribute to “restorative justice.”  That’s a dubious claim, to be sure.  Work prepared by project consultants shows that the proposed freeway covers are too small and too swamped by traffic to be of any value.  These consultants have developed some other development alternatives which show what a more restorative approach would look like.  Unsurprisingly, they hinge on building lots of housing to replace that demolished by the freeway.  On this diagram, the yellow buildings are imagined new housing.

And “imagined” is really the correct description, because nothing in the I-5 freeway widening project actually includes funding for this housing.  Just building the five yellow boxes on this chart would cost between $160 million and $260 million.  That’s a noble goal, but without funding, it’s meaningless.  True restorative justice has to be more than colorful illustrations showing imaginary buildings; it has to be a tangible commitment of real resources.

2. Pricing would fix traffic congestion in Portland’s Rose Quarter without widening the freeway.  The I-5 Rose Quarter project would spend $800 million to widen the freeway to as many as ten lanes.  But according to ODOT’s own consultants you could achieve all of the congestion reduction benefits of ODOT’s proposed widening without spending a dime on construction.  The 2017 Oregon Legislature directed ODOT to implement congestion pricing on Portland area freeways.

3. Why “new towns” aren’t the way for Cascadia to promote sustainability. We’re pleased to publish a guest commentary from Ethan Seltzer taking a close look at a proposal to direct the Pacific Northwest’s future population growth into a series of satellite cities 40 to 100 miles from the established population centers of Vancouver, Seattle and Portland.  This “new cities” proposal for regional growth comes from a tech group called the “Cascadia Innovation Corridor.”

As Seltzer points out, the idea of shifting growth out of established cities is a reprise of a historical theme from a century ago.  The idea that one can minimize or avoid the perceived problems of existing cities by building new ones hasn’t panned out in practice.

Must read

1. Let’s not give the streets back to cars.  Janette Sadik-Khan and Seth Solomonow, co-authors of Streetfight, have an Atlantic article arguing that cities ought to capitalize on, rather than retreat from, the reclamation of street space that occurred during the pandemic.  Across the country, cities have taken space that was used for car storage or movement, and re-dedicated it to cycling, walking, dining and other, more human-scaled uses.

To serve their residents well, U.S. cities can’t just return to the pre-pandemic norm. They need to come back more resilient, more sustainable, more economically connected, and more equitable. Reclaiming city streets from the domination of cars is never easy, but it will never be easier than it is right now.

While cities used the pandemic to re-think many long-held assumptions about how to design urban spaces, there’s powerful auto-centric inertia coded into the legal and regulatory framework.  For example, the federal government’s manual for street design makes it hard for cities to enable more active uses of local streets.

2. Has the hot housing market convinced people we need to increase supply?  Writing at Slate, Henry Grabar recites tales of the frothiness of the current housing market, with prospective homebuyers in many cities being repeatedly outbid as housing prices spike.  The palpable frustration with short supplies and rising prices, Grabar speculates, may finally convince people that we need housing policies that increase housing supply. The housing policy debate has been characterized by denial and urban myths (for example, that vacant condos owned by speculators are the cause of the shortage).  Grabar hopes that those who are getting outbid for housing will be radicalized and become active:

But for every winner there will be many losers, and maybe the process can radicalize these would-be buyers, and their friends, and their parents, and the people they talk to. There really aren’t enough places to live. Those people can channel their frustration with bidding wars into political activism aimed at housing suppressants like parking requirements, restrictive zoning, and density limits. If appeals to neither historical wrongs nor economic growth get the job done, a strong dose of self-interest can’t hurt.

 

New Knowledge

How SUVs made our climate problems much worse.  In the past two decades, Americans have bought fewer and fewer cars. The bad news, of course, is that we’re buying and driving far more trucks and sport utility vehicles (SUVs).  Our energy and air quality policies have been counting on a “technical fix” to cars, in the form of fuel economy standards, with the idea that progressively more efficient vehicles would reduce energy consumption, pollution and greenhouse gas emissions.

But there’s been a kind of “rebound” effect, where rather than continuing to buy the same mix of vehicles, consumers have been buying larger, heavier and less fuel efficient trucks and SUVs.

The problem with SUVs is not just their fuel economy, however. They stay in the vehicle fleet longer, with 32.4% of them still on the road after 20 years, compared to just 15.7% for cars (US EPA, NHTSA, and California Air Resources Board, 2016). Drivers also use them more; average lifetime VMT is 18.3% higher for SUVs (US Department of Transportation, 2006).

The net effect of the additional truck and SUV purchases has been to increase fuel consumption and greenhouse gases, compared to a world where the passenger car market share stayed stable.  The report estimates that the shift to SUVs and light trucks will produce 867-3,519 million short tons of greenhouse gases across their lifetimes, compared to alternative scenarios. Collectively, this emissions increase is enough to offset 19-75% of the projected savings from the model year 2011-2025 CAFE standards.
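To see how the longevity and mileage differences compound, here's a hedged back-of-envelope sketch.  The survival and VMT figures come from the report as cited above; the lifetime mileage and fuel economy numbers are illustrative assumptions of ours, not figures from the Kovach study.

```python
# Illustrative comparison of lifetime gasoline use for a car vs. an SUV.
# The 18.3% VMT premium is the figure cited above; the baseline lifetime
# mileage (180,000) and fuel economies (30 mpg car, 22 mpg SUV) are
# assumptions for illustration only.

CAR_LIFETIME_VMT = 180_000        # assumed lifetime vehicle-miles for a car
SUV_VMT_PREMIUM = 0.183           # SUVs are driven ~18.3% more over their lives
CAR_MPG, SUV_MPG = 30.0, 22.0     # assumed fleet-average fuel economies

suv_lifetime_vmt = CAR_LIFETIME_VMT * (1 + SUV_VMT_PREMIUM)
car_gallons = CAR_LIFETIME_VMT / CAR_MPG
suv_gallons = suv_lifetime_vmt / SUV_MPG

print(f"Car lifetime fuel:  {car_gallons:,.0f} gallons")
print(f"SUV lifetime fuel:  {suv_gallons:,.0f} gallons")
print(f"Extra fuel per SUV: {suv_gallons - car_gallons:,.0f} gallons "
      f"({suv_gallons / car_gallons - 1:.0%} more)")
```

Under these (hypothetical) assumptions, each SUV burns roughly 60 percent more gasoline over its life than a car would have, which is why a shift in market share can swamp per-vehicle efficiency gains.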

Tim Kovach, "The rise of SUVs in the US and its impact on greenhouse gas emissions from 2000-2017," April 2021.  DOI:10.13140/RG.2.2.10190.5920

The Week Observed, April 23, 2021

What City Observatory this week

1. Fighting climate change is inherently equitable. While there's a growing recognition of the existential threat posed by climate change, it's increasingly common to pit equity concerns against decisive action to reduce greenhouse gas emissions.  It shouldn't and doesn't have to be this way.  Climate change disproportionately affects those with the fewest resources and least power, and consequently reducing climate change is itself intrinsically equitable.  Measures to reduce greenhouse gases will require some big changes, and those with the least resources and power will also be the least well equipped to adapt to change.  That should not be an argument against reducing greenhouse gases, but rather an argument for mitigating the negative effects of change.  As a practical matter, that's exactly the approach we've taken in dealing with Covid-19:  we insisted on social distancing (policies that disproportionately affected the poor and front-line workers), but then also put literally trillions of dollars into direct payments to individuals, loans to businesses, and expanded unemployment insurance to mitigate the difficulty of coping with that change.  That approach, of simultaneously, but separately, taking aggressive action and also buffering those who bear the brunt of change, should guide our climate policy.

2. Portland’s freight follies.  It’s a shopworn myth that the success of urban economies depends on freight movement.  To be sure, if there were no roads, it would be hard to operate a modern economy, but the argument is made that more roads (and hopefully less traffic congestion) will somehow generate economic growth.  Highway advocates maintain that failure to build more roads to accommodate an ever-increasing number of trucks will cause economic growth to slow, halt or reverse.  Portland’s experience over the past two decades shows that’s simply untrue.

Truck volumes across the two interstate bridges over the Columbia River have declined almost 20 percent since 2006, even as the economy has grown by more than 30 percent.  In a very real sense, economic activity and prosperity aren’t determined by how much freight we move or how fast.

Must read

1. An induced demand calculator for Colorado.  The Rocky Mountain Institute and NRDC released an induced travel calculator showing that each additional lane mile of highway built in the state results in 2 to 8 million additional miles of vehicle travel and leads to burning an additional 200,000 gallons of gasoline, with attendant air pollution and greenhouse gas emissions.  The calculator is based on the latest scientific research on induced demand, and follows a template developed by Susan Handy and Jamey Volker at the University of California, Davis.

As Handy and Volker’s research has shown, the standard travel models used by state highway departments to justify new capacity consistently underestimate the additional travel and pollution caused by such projects. A customized calculator, calibrated to local road and driving data is a valuable tool for freeway fighters to challenge expansion projects.
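The elasticity logic at the heart of such calculators is simple enough to sketch.  This minimal illustration follows the general form of the UC Davis approach (a long-run elasticity applied to the percentage increase in lane-miles); the elasticity value and all the example inputs below are hypothetical assumptions of ours, not the RMI/NRDC Colorado figures.

```python
# A minimal sketch of the elasticity logic behind induced-travel calculators.
# All numbers here are illustrative assumptions, not calibrated state data.

def induced_vmt(existing_lane_miles: float,
                existing_annual_vmt: float,
                added_lane_miles: float,
                elasticity: float = 1.0) -> float:
    """Estimate annual VMT induced by newly built lane-miles.

    A long-run elasticity near 1.0 means a 1% increase in lane-miles
    eventually produces roughly a 1% increase in vehicle-miles traveled.
    """
    pct_increase = added_lane_miles / existing_lane_miles
    return elasticity * pct_increase * existing_annual_vmt

# Hypothetical example: a network of 4,000 lane-miles carrying
# 10 billion annual VMT, expanded by 10 lane-miles.
extra_vmt = induced_vmt(4_000, 10_000_000_000, 10)
extra_gallons = extra_vmt / 25.0   # assumed 25 mpg fleet average
print(f"Induced travel: {extra_vmt:,.0f} VMT/year")
print(f"Extra fuel:     {extra_gallons:,.0f} gallons/year")
```

Under these assumed inputs the sketch yields on the order of 2.5 million additional VMT per added lane-mile, in the same ballpark as the 2 to 8 million mile range the Colorado calculator reports.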

2. The decline of the corner store.  For decades, the decline of the independent, local retail store has been a lament in small towns and big cities.  The Covid-19 pandemic has been another blow to small retailers.  Governing‘s Alan Ehrenhalt reminds us of many of the key virtues of these small, locally owned main street stores.  Harking back to Jane Jacobs, small shops provide the interesting walkable destinations that enliven cities and put the “eyes on the street” that add to a sense of security.  It also turns out that many of the “third places” that enliven urban living are commercial establishments of one sort or another (restaurants, coffee shops, bookstores, and the like) and their decline undercuts urban livability. The decline of retail has many causes, but one of them is that Americans, particularly the middle class, have shifted their shopping to national chains, big boxes and online retail. In turn, much of the decline has fallen on the middle-class entrepreneurs who started and ran those businesses.  As Ehrenhalt explores, there aren’t any easy answers for reversing these trends.

3. Chuck Marohn on Infrastructure.  Writing at Strong Towns, Chuck Marohn contributes some important insight to the national debate about infrastructure.  Too much focus has been put on net infrastructure investment, Marohn argues.  Like other capital assets, infrastructure depreciates over time, and our net investment in infrastructure is the difference between new spending and the deterioration of existing infrastructure.  If our net is too low, it may not be because we’re not building enough new infrastructure, but rather because the old, over-built infrastructure we have is depreciating so fast.  On the sprawling suburban fringe, it’s often the case that we’ve built more infrastructure than we can afford to maintain.  Building even more infrastructure doesn’t solve that problem, but actually makes it worse, by increasing the maintenance deficit and depreciation burden. Marohn explains:

The problem we’re struggling with today is not how little we’re spending, it is how much we built and now have to maintain, and how little we got for those investments. We’re meant to read these shortcomings [the backlog of deferred maintenance] as a consequence of disinvestment. They’re not. They are a consequence of overinvestment, of decades of unproductive infrastructure spending accumulating to crowd out good investments today. It’s true that you don’t have to pay for road maintenance if you don’t build the road, but once you build the road, you are committed. Forever.

Before we go throwing a lot more money at our broken infrastructure system, we should ask why it’s broken, and figure out how we right-size infrastructure so that it isn’t a financial black hole.

New Knowledge

The divergent paths of urban and rural America. Bob Cushing, a frequent contributor to Daily Yonder, has a comprehensive new analysis of the diverging patterns of population growth in the nation’s counties. Cushing uses a rubric for classifying counties by population size and density, a topic we’ve explored in depth at City Observatory.

There’s much to this analysis, and it merits a close look.  We’ll highlight one particularly striking finding:  the very different trajectories of the most urban and most rural parts of the nation.  Cushing uses counties to trace out urban/rural differences, and this chart compares the most urban counties (the central county of each of the 50 largest US metro areas) with the most rural places (counties that aren’t adjacent to any metro area, even small ones).  Back in the 1970s, these very rural areas were, collectively, growing almost as much as urban centers.  Since then, and especially in the past decade, it’s mostly been accelerating decline in these very rural areas (red), and strong growth in very urban ones (blue).

There are other interesting findings.  More than a third of US counties lost population over the past 50 years, with most of the declining places being rural.  It’s also important to keep in mind that in every size or urbanness category, there are both gainers and losers.  Suburban counties in the largest metro areas have been the largest net gainers over the past half century.  In the aggregate, rural areas have gained population, but the pattern of change is much more mixed than for other categories.  Collectively, rural counties adjacent to metropolitan areas have fared better than more remote rural counties.

Robert Cushing, “Population Gains and Losses Create Two Americas, and the Difference Is Mostly along Rural-Urban Lines,” Daily Yonder, April 15, 2021.

In the News

Nick Kristof’s column “Lessons for America from a Weird Portland,” offering a generally upbeat outlook for Portland’s future in The New York Times cites City Observatory’s analysis of the decline in new apartment construction due to the city’s inclusionary zoning ordinance.


The Week Observed, May 7, 2021

What City Observatory this week

1. It’s not a bridge replacement, it’s a 5-mile long, 12-lane wide freeway that just happens to cross a river.  The Oregon and Washington highway departments are trying to revive the failed Columbia River Crossing project, peddling it as the “I-5 bridge replacement” project.  That’s an incredibly misleading moniker.  While it sounds modest and innocuous, it’s anything but:  It’s really a 5-5-5 plan:  Spend upwards of $5 billion widening five miles of the I-5 freeway, and pay for a fraction of the total costs by charging each vehicle $5 per round trip.  The images of the project—which the two DOTs have carefully excised from their public presentations—show a monster freeway that Representative Peter DeFazio called “gold-plated.”

It’s a sly and deceptive bit of marketing to dub this giant freeway widening plan a “replacement,” but creating the illusion that this is just fixing a dilapidated and outdated structure is exactly the way state highway departments divert money that could be used for repairs to underwrite oversized boondoggles.

2.  We found the speculators who reaped $2 trillion in profits from the housing market last year.  The public discussion about housing affordability often devolves into a search for villains:  greedy developers, foreign investors, Wall Street banks buying up single family homes.  But new data from the Federal Reserve shows who pulled in $2 trillion in capital gains from housing in 2020:  older, whiter, richer homeowners. Last year’s gains from housing appreciation were three times greater than the total pre-tax household income of the bottom twenty percent of the population.  And under the federal tax code—which generally exempts $500,000 in housing capital gains from taxation—most of that income is tax-free.

As this chart shows, more than half of all the nation’s residential real estate, about $20 trillion worth, is in the hands of the wealthiest one-fifth of all households.  They are the ones who are the big winners from the increases in home values attributable, in significant part, to the policies designed to encourage home ownership.  And while these policies build wealth for mostly older, whiter homeowners, there’s no comparable wealth-building program for the one-third of the nation’s households who rent and who are disproportionately younger, lower income people of color.

Must read

1. Housing lessons from the old school masters.  The Urban Institute’s Yonah Freemark contrasts the generous and geographically well-distributed pattern of social housing in European cities with the much smaller and segregated public housing in the US.  As much as one-third of the housing in large metro areas in the United Kingdom and France is publicly owned or subsidized, compared with less than 10 percent in comparable US cities.  Here Freemark compares the shares of housing in each nation’s three most populous cities that are either social or subsidized housing:

In the US, housing subsidies reach perhaps a fifth of those who are technically eligible; in Europe, housing benefits are much closer to universally available to those who qualify.  And Freemark points out there’s an important spatial difference in social housing on the two sides of the Atlantic.  In the US, most public or subsidized housing is concentrated in city centers, usually in low income neighborhoods or a disfavored urban quarter.  In Europe, it’s far more common to have social housing spread throughout a metro area, including both cities and suburbs.

2. Why artificially cheap gasoline is the opposite of just.  The reason we drive—and pollute—so much in the US is that car ownership and use are heavily subsidized.  One of the deepest subsidies in the US is the very low price of fuel.  Road taxes don’t even cover the cost of building and operating roads, much less the social and environmental damage of driving.  US fuel taxes are roughly one-fifth the average of other industrialized countries.  Charging a price that reflected these costs would lead people to drive less, buy more efficient cars, and choose safer and more environmentally sustainable modes of transport.  But we’re told that any increase in fuel prices would be somehow unfair to low income households.  The Frontier Group’s Tony Dutzik challenges that thinking, pointing out that low income families bear the brunt of climate change, and are disproportionately carless, meaning they also suffer from being second class citizens in a car-dependent world. Most of the economic benefit of artificially low fuel and road prices flows to higher income households, which is inequitable. And in a global context, delaying effective action to reduce US greenhouse gas emissions actually does more to hurt the poor around the world, who are much more vulnerable than even low income Americans.

3. The great ICE hangover.  It’s easy to get enthralled with the idea that electrifying the vehicle fleet will be a simple means of reducing greenhouse gas emissions. But the Transportist’s David Levinson has some sobering thoughts about the fossil fuel hangover implied by the huge existing inventory of internal combustion engine vehicles.  It’s still the case that more than 95 percent of new vehicles have ICE motors, and electric car production is ramping up only slowly.  In 2030, the vast majority of the nearly 300 million cars on US roads will still be ICE vehicles.  Given the huge sunk costs in both the vehicles, and the fossil fuel industry (from wells to filling stations), there will be strong incentives for both producers and consumers to keep using ICE engines.  As Levinson explains:

One of the undiscussed features of transport electrification is the large number of internal combustion engine (ICE) vehicles that will remain on the road in the absence of prohibition.  There are many stranded incumbents like service stations and their upstream suppliers who will continue to provide fuel for the remaining vehicles, and that fuel will have a lower and lower market price (sans taxes), as demand will have dropped and the supply will not, and existing producers will have huge incentives to pump fuel while it still has some market value. Consumers with older cars will be reluctant to replace their working vehicle when low fuel prices abound. Many just like their cars, and the smell of gasoline is an attractant for some.

It’s a problem that could be solved, Levinson argues, by throwing money at it—about $750 billion, by his calculations—to buy back the existing ICE-engine cars.  There’s a certain internal logic to the idea, but if we’re serious about dealing with climate change, should we be spending three-quarters of a trillion dollars here, rather than on more compact urban development, or solar panels?  And why financially compensate those who (a decade or more from now) have decided to buy vehicles we know are contributing to the destruction of the planet?
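The rough arithmetic behind a buy-back figure of that magnitude is straightforward; the fleet size and per-vehicle payment below are illustrative assumptions of ours, not Levinson's actual inputs, chosen only to show how a number on the order of $750 billion can arise.

```python
# Back-of-envelope: what a national ICE buy-back might cost.
# Both inputs are hypothetical round numbers for illustration.

ice_fleet = 250_000_000     # assumed remaining ICE vehicles at buy-back time
avg_buyback = 3_000         # assumed average payment per vehicle, in dollars

total_cost = ice_fleet * avg_buyback
print(f"Total buy-back cost: ${total_cost / 1e9:,.0f} billion")
```

Even a modest per-vehicle payment, multiplied across a fleet of a couple hundred million vehicles, quickly reaches the three-quarters-of-a-trillion-dollar scale Levinson describes.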

New Knowledge

The pandemic hasn’t dethroned tech centers. The surge of “work at home” activity during the Covid-19 pandemic has given rise to the idea that tech centers, like Silicon Valley, will lose their edge.  If workers can work remotely, from anywhere, why would they pay the high costs of living in generally more expensive superstar cities?  This is especially true for tech workers, who spend much of their time glued to their computers in any case.

New data from economist Jed Kolko of Indeed.com shows that the geographic concentration of hiring activity by major tech companies hasn’t waned at all during the pandemic.  If anything, the eight leading tech centers have cemented their dominance of tech-related hiring.  While overall hiring growth has been somewhat slower in tech hubs than elsewhere, it’s entirely due to a much weaker performance in non-tech industries in those hubs. The decline in consumer spending by tech workers working remotely has apparently had a vastly larger impact on these tech hub economies than any relocation of tech workers to other metropolitan areas.

Indeed tracks data on hiring announcements by firms in a wide range of industries.  Their data compare the change in job postings in 8 major tech hubs with other large US metros between February 2020 (just prior to the pandemic) and one year later.  Job postings in tech sectors, like IT and software, declined by just about the same amount in tech hubs as other metros.  The big difference in hiring patterns is in other, local-serving industries, like retail, child care and restaurants, where tech hubs had notably larger declines in job postings than other metros.  This suggests that the big impact of work-at-home on local economies has not been to shift the location of tech employment, but rather to reduce local spending in places that have lots of tech workers (who can work remotely, and who have reduced their out-of-home spending as a result).  If or when these tech hub workers return to the office in a post-Covid world, we might reasonably expect those local-serving jobs to rebound as well.

Jed Kolko, “Tech hubs held on to technology jobs during the pandemic,” Indeed Hiring Lab, May 6, 2021. https://www.hiringlab.org/2021/05/06/tech-hubs-held-on-during-pandemic/

In the News

Oregon Public Broadcasting cited City Observatory’s analysis of the destructive effects of freeway building on Portland’s Albina neighborhood in its story “A freeway once tore a Black Portland neighborhood apart. Can new infrastructure spending begin to repair the damage?”

Sightline Institute pointed its readers to our story on the true dimensions of the so-called “I-5 Bridge Replacement Project.”

Correction

The originally published version of our commentary, “How ODOT destroyed Albina:  The Untold Story,”  incorrectly claimed that ODOT’s Rose Quarter analysis had not acknowledged the destruction of housing by the construction of Interstate Avenue in the early 1950s.  In fact, one table contained in the project’s environmental justice section concedes that this project demolished at least 80 homes.  We have corrected this story to incorporate this information.  Thanks to a regular reader who pointed this out.  City Observatory regrets this error.


The Week Observed, May 14, 2021

What City Observatory this week

Don’t be fooled again.  The Oregon and Washington state highway departments are up to their old tricks in trying to push a multi-billion dollar highway building boondoggle in the Portland area.  A guest editorial from David Bragdon, former president of Portland’s regional government, recounts the lies and deceptive tactics the agencies used in their failed push for the original Columbia River Crossing a decade ago, which did nothing to improve the region’s transportation system, but squandered roughly $200 million on consultants and botched plans.

As Bragdon relates, the two state DOTs are pushing the same high pressure sales tactics of fictitious deadlines and requirements to get the region’s leaders to go along with what now looks like a $5 billion project.

Must read

1. Why the hybrid office won’t last.  There’s immense speculation that our pandemic-induced experience with work-at-home will lead firms and workers to abandon the idea of getting everyone together in offices or other workplaces.  Others maintain we’ll get more of a hybrid model, working in the office a few days a week, and the rest of the time at home.  Not so fast, argues behavioral scientist Jon Levy.  Writing in the Boston Globe, he concludes that the productivity advantages of close contact, and the disadvantages of being “out of the loop” when working remotely, will have most (or all) of us headed back to the office, pretty much as soon as vaccine deployment allows.

2.   Brookings:  Reducing greenhouse gases requires changing land use.  Adie Tomer, Jenny Schuetz and two co-authors from the Brookings Institution’s Metropolitan Policy Program take a hard look at what it will take to make meaningful progress on fighting climate change.  We’ll certainly need to electrify transportation as much as we can, and clean up electric generation, but that won’t be enough:  we’ve got to fundamentally change American land use patterns.

Simply put, the United States cannot reach its GHG reduction targets if our urban areas continue to grow as they have in the past. After decades of sprawl, the U.S. has the dubious honor of being a world leader in both building-related energy consumption and vehicle miles traveled per capita. Making matters worse, lower-density development also pollutes our water and requires higher relative emissions during the initial construction. That leaves the country with no choice: We must prioritize development in the kinds of neighborhoods that permanently reduce total driving and consume less energy.

There’s little doubt the Brookings analysis is right; if anything though, they might be even more pointed in their policy prescriptions.  For example, liberalizing zoning to ease “missing middle” housing like duplexes and triplexes is important, but the big gains for climate are likely to come from widely legalizing apartments, particularly in transit-served locations, as State Senator Scott Wiener has proposed in California.

3.  Maryland dials back a big freeway expansion project.  For several years, Maryland Governor Larry Hogan has been pushing a public-private partnership to widen more than 30 miles of the Capital Beltway around Washington DC.  It appears that the Governor is now backing off from a big part of that plan, at least for the time being.  Local officials opposing the project say it signals a shift in policy from the Biden Administration.  Maryland Matters reports that Montgomery County Council President Tom Hucker says:

. . . the state’s decision to drop a large piece of the original plan appeared to be an acknowledgement that President Biden and U.S. Transportation Secretary Pete Buttigieg have a vastly different philosophy than the prior administration.  The state’s approach “abjectly fails the test for transportation projects that the federal government now has — that they move people and not vehicles, and that they pass a climate test and a racial equity test,” Hucker said. “This 1970s-style project fails those 21st century standards.”

Freeway opponents around the country will want to take a close look at this decision.

Must attend

Freeway Fighters Social Hour (Thursday, May 20 @ 7pm ET)

Are you fighting to stop a highway expansion or engaged in a campaign to take down a highway in your community? Come meet others from across the country grappling with similar issues and share what you’re working on. The Congress for the New Urbanism is hosting a virtual Freeway Fighters Social Hour on Thursday, May 20th at 7pm ET. Contact Ben Crowther at bcrowther@cnu.org for a Zoom invitation!

New Knowledge

Does high speed rail benefit intermediate communities?  One of the explicit objectives of many high speed rail projects is to decentralize economic activity and relieve the perceived pressure on urban centers. But in practice, does the construction of high speed rail to smaller towns actually result in job or population decentralization?

Japan has long had the world’s most extensive high speed train network, and it provides a good laboratory for observing the long term effects on development patterns.  A new study by Hans Koster, Takatoshi Tabuchi, and Jacques-François Thisse concludes that high speed rail has the unexpected effect of diminishing smaller intermediate communities.

They explain this seemingly paradoxical finding by noting that improved transportation is subject to distance effects:  faster travel matters more for longer trips than for shorter ones.  As a result, time savings at closer, intermediate destinations don’t make much difference to location economies.  In fact, better transport creates greater competition for those locations, meaning businesses there face more competition from larger hubs.  As they explain:

. . .  there is a trade-off between a ‘hub effect’ and a ‘market size effect’. In presence of long-haul economies, the hub effect implies that a connection to the new infrastructure makes it easier to reach other places through lower transport costs. This in turn attracts more firms and employment. By contrast, the market size effect implies that if a region is small, it is easier for firms to set up a business in a core region and to transport goods or people to the small, connected region instead.

High speed rail may bring a nation or a region closer together, but it doesn’t necessarily have positive benefits for every community connected to the network.  In particular, this study suggests that the assumption that high speed rail would bolster new or expanded satellite cities—an idea proposed for the Pacific Northwest and challenged here by our colleague Ethan Seltzer—has the effect almost exactly backward.

Koster, H., T. Tabuchi and J. F. Thisse (2021), “To be connected or not to be connected? The role of long-haul economies”, CEPR Discussion Paper 15905.

In the News

The unlikely couple of Richard Florida and Joel Kotkin quoted City Observatory’s analysis of the migration patterns of the young and restless in their commentary on the likely trajectory of post-Covid cities.

The Oregonian cited City Observatory Director Joe Cortright’s testimony to the Oregon Legislature in its coverage of proposed toll-bond legislation.

Writing in The Oregon Catalyst, Lars Larson highlighted our description of the so-called I-5 bridge replacement project, which is really a $5 billion, 5-mile long 12 lane freeway that will be paid for, in part, by $5 round trip tolls.

The Week Observed, May 21, 2021

What City Observatory this week

1. Needed:  A bolder, better building back. In response to an invitation from its authors, we take a look at a  “grand bargain” proposed by Patrick Doherty and Chris Leinberger for breaking the political log jam around infrastructure.   If there is something to be gleaned from Eisenhower and Lincoln (in addition to FDR), it was that each of them proclaimed and implemented an entirely new and much larger federal responsibility for a key aspect of the nation’s development.  Arguably transportation was timely in 1860 and 1956, but the nation’s needs today are different.  Viewing our problems through the lens of transportation, and constructing a grand bargain that provides more (or somewhat re-jiggered) subsidies for transportation misses the opportunity to make the principal focus placemaking and fixing the shortage and high cost of housing in walkable locations.

2. Why can’t DOT’s do environmental mitigation for people and neighborhoods?  Across the nation, freeways have taken a terrible toll on many of the nation’s urban neighborhoods.  In Portland, we’ve chronicled how three different highway projects over a period of two decades wiped out hundreds of homes and led to a decline of the historically Black Albina neighborhood.  The Oregon Department of Transportation is back, on the one hand apologizing for what it did then, but also promising to double down by widening the I-5 freeway to ten lanes. While ODOT says it supports restorative justice, it’s clear that the neighborhood needs, most of all, more affordable housing.  Highway agencies like ODOT routinely spend funds on all kinds of mitigation to offset the negative effects of highways (including habitat restoration, noise walls, and even building jails).  We ask why the principle of mitigation that’s used to justify these expenditures shouldn’t be extended to spending to restore the neighborhoods devastated by past (and present) highways.

Must read

1. Parking is the scourge of cities.  UCLA Professor Michael Manville has an accessible and compelling essay explaining how catering to car storage has damaged the nation’s cities. In particular, parking mandates that block and drive up the cost of housing and commercial development, and which transform the landscape into car-dependent sprawl, have, over the course of decades, made it impossible or illegal to build the kind of interesting, walkable spaces Americans value most.

Because parking requirements make driving less expensive and development more so, cities get more driving, less housing, and less of everything that makes urbanity worthwhile. This process is subtle. . . . A commercial requirement of one parking space per 300 square feet means developers will put new retail in a car-friendly, pedestrian-hostile strip mall. And a requirement of one parking space per 100 square feet for restaurants means the typical eating establishment will devote three times as much space to parking as it will to dining. America did not become a country of strip malls and office parks because we collectively lost aesthetic ambition. These developments are ubiquitous because they are the cheapest way to comply with regulations.

2. Recapping the housing affordability debate.  Writing at Planetizen, Todd Litman has a comprehensive and opinionated take on the arguments and factions in the nation’s housing affordability debate.  Like Caesar’s Gaul, the housing world is divided into three ideological parts:  the free market proponents, the housing experts and the housing supply skeptics.  The “free market” proponents excoriate regulation, particularly environmental regulations, which they blame for high housing prices; but as Litman notes, their analyses entirely omit consideration of transportation costs, which means that low priced housing in sprawling cities comes with higher total living costs for most households.  The labels give away the conclusion:  Litman argues that the housing experts—with references to 14 peer-reviewed studies—show that adding housing supply, by eliminating parking requirements, ending apartment bans, and allowing missing middle housing, all done at scale, can moderate or reduce housing costs.

. . . both Free Market Advocates and Supply Skeptics rely on poor quality evidence, and that neither ideological extreme offers comprehensive solutions to unaffordability problems; their prescriptions are only appropriate in specific, limited situations. Housing experts have solid research that can help guide planners to develop the combination of policies that can achieve affordability and opportunity goals.  . . . If we follow the science we can identify excellent solutions to unaffordability problems. There is abundant credible evidence that large-scale upzoning to allow more affordable housing types in walkable urban neighborhoods can significantly increase housing and transportation affordability for low- and moderate-income households.

All this is presented in Litman’s usual clear and methodical way.  Whether you agree or disagree with his conclusions (we’re in broad agreement), Litman has clearly sketched out the basic positions of the different camps.  If you’re looking to get a useful orientation to the multiple and conflicting views on housing and affordability, you’ll find this an invaluable resource.

3. Testing the assumptions behind placemaking. What's your theory of change?  Around the country, local community development efforts invest considerable resources in a range of placemaking activities.  Brett Theodos of the Urban Institute reflects on the success factors, and in particular the often unstated assumptions that underpin these efforts.  He's got a concise and well-argued list of questions that should be top of mind when thinking about the complex processes at work in distressed neighborhoods.

Not all conditions can be solved or addressed locally or even regionally—at least not without a mobilization of public resources unlike what we have seen before. Relatively few place-based efforts have thought carefully about which factors can and cannot be addressed locally. Even a robust place-based development strategy may fail in the face of a weakening regional economy or shrinking population. Neighborhoods are not islands. That said, neighborhood redevelopment is possible in declining regions, but doing so requires considerable resources.

In the news

Willamette Week highlighted former Metro President David Bragdon’s critique of the Oregon Department of Transportation’s deceptive sales tactics for its proposed big freeway expansion projects.

The Portland Mercury cited City Observatory’s analysis of the greenhouse gas pollution that would be created by the Oregon Department of Transportation’s $800 million I-5 Rose Quarter freeway widening project.

 

The Week Observed, July 16, 2021

What City Observatory did this week

An open letter to Secretary Pete Buttigieg on his visit to Oregon.  Transportation Secretary Pete Buttigieg came to Oregon this week to look at some local transportation innovations.  The group No More Freeways, which opposes an Oregon Department of Transportation plan to widen I-5 through Portland's Rose Quarter, wanted to enrich his travelogue with some additional information.

In light of the Biden Administration’s commitment to promoting equity, restoring the damage done by the construction of freeways through urban neighborhoods and fighting climate change, the letter suggests that Buttigieg might want to direct the Federal Highway Administration to do a full Environmental Impact Statement on the ten-lane $800 million freeway-widening project, rather than claiming it has “no significant environmental impact.”

Must read

1. Covid didn’t kill cities.  When the coronavirus shut down New York City in the spring of 2020, fears about the future of urban life ran wild. Were we ever going to see cities thrive again? Was it unsafe to live in highly dense areas? People prophesied that the pandemic would bring the End of Cities. This doomsday never arrived though. Cities survived the catastrophic event, as they always do. In this New York Times article, Emily Badger ponders the question, “What is so alluring about the perpetually imminent End of Cities?” The author explores the roots of this narrative and its connection to personal experience as well as aspirations. Anti-urban sentiment has been present since early America as cities have been intrinsically tied to corruption and the stereotypes surrounding people of color and immigrants. Badger speculates that there is an anti-urban element within all of us, which can be triggered by uncomfortable moments. It makes us feel as though a city can be punished for its adverse qualities. The pandemic provided a reason to be alarmed by the nature of city life and provided a form of “retribution,” making this doomsday narrative attractive. However, this sentiment is predominantly one of White professionals and fails to capture the reality of most residents. At the end of the day, the catastrophic end of cities has not arrived. Cities will persevere through this pandemic and the next.

2. How bad is rent inflation?  Alarming says the WaPo.  Not so fast says Kevin Drum.

“Rent prices are up 7.5 percent so far this year,” cried The Washington Post this week. Using Apartments.com and Zillow data, Heather Long makes her case for this alarming spike in rental prices across the country. The article highlights rent increases in cities like Phoenix, Boise, and Stockton (deemed the “Inland West”), and Long quotes prospering landlords and struggling residents to bolster her argument. The article concludes with an ominous warning about inflation: if rents continue to increase at these exorbitant rates, we should be prepared for other prices to rise as well.

Kevin Drum disagrees. In his blog post, Drum refutes Long’s claims of stark rises in rental prices using data from the Bureau of Labor Statistics. He argues:

“Not only have rents not risen 7.5% so far this year, BLS says they’ve risen a paltry 0.5% in big cities and less than 1% in midsize cities since January. And rent growth has been falling steadily for over a year. Compared to 12 months ago, rent is up a very modest 1.2% in big cities and 2.6% in midsize cities.”

As it turns out, overall rents are not increasing at the alarming 7.5 percent rate. The Washington Post article essentially cherry-picks booming rents in selected cities. It ignores the declining rental prices in bigger metropolitan areas like Seattle and New York. Rents will always be surging in some cities and lagging in others, but that does not justify sounding an alarm across the entire nation. Don't fret: today's boom in Phoenix's rental prices is unlikely to trigger national inflation.

3. Single-family zoning and the right:  False populism of the NIMBYs.  Writing at New York Magazine, Eric Levitz underscores the contradictions and faux-populism of right-wing commentators like Tucker Carlson, who fail to acknowledge the critical role that single family zoning has in worsening housing affordability for low and middle income households, while simultaneously decrying policies that would make housing more affordable.  Levitz writes:

‘What makes his [Tucker Carlson’s] defense of single-family zoning so instructive is that it’s both anti-free-market and anti-working-class…. For the downtrodden “forgotten men and women” whom right-wing populists claim to champion, single-family zoning is actually a scourge. It prevents homeowners in economically depressed regions from affordably relocating to thriving metros. And it forces renters to forfeit an ever-higher share of their income to landlords…. Viewed from the perspective of society as a whole, single-family zoning is ruinous….

Carlson and others may be addressing economic anxieties, but they are those of homeowners looking to guard their wealth, and to maintain financial, geographic and social distance from the lower class.

Carlsonism is a politics of downward-looking class resentment disguised as its opposite…. This era’s self-styled populist conservatives have consistently demonstrated their fealty to small-time capitalists—and contempt for the median laborer—in just about every major policy fight of Biden’s tenure.

As we’ve shown at City Observatory, housing wealth (and appreciation) accrues largely to wealthier, whiter, older households.  If we’re talking about working class Americans, especially younger generations and people of color, single family zoning isn’t helping them prosper.

New Knowledge

Black Entrepreneurs, Job Creation, and Financial Constraints.  According to the Federal Reserve's Survey of Consumer Finances, the racial wealth gap remains much as it was in the 1960s. The median and mean wealth of Black families is less than 15 percent of that of white families. What impact does this disparity have on Black entrepreneurs and their businesses? 

Research by Mee Jung Kim, Kyung Min Lee, J. David Brown, and John S. Earle analyzes these Black-white differentials in wealth, financial constraints, and employment. Black-owned businesses tend to have fewer resources, creating a greater dependence on outside funding; at the same time, they are less likely to receive a bank loan. Overall, Black owners face greater financial constraints than white owners, a disparity likely amplified by information asymmetries and racial discrimination. The research provides an in-depth look at the differences in characteristics between Black- and white-owned businesses. The authors find that Black owners, on average, are younger, have more education and stronger motivations for entrepreneurship, and manage younger, smaller firms than white owners. Black-owned businesses have about 12 percent fewer employees than white-owned businesses. However, the researchers' empirical analysis implies that with the same financial access, Black-owned firms would actually be significantly larger than white-owned firms. 

The research also explores the impact of an intervention, the Community Reinvestment Act, finding that increased access to credit enables Black-owned firms to grow faster.  Expanded credit access enabled by the CRA raises employment by 5-7 percent more at Black-owned businesses than at white-owned firms in the treated neighborhoods. The researchers offer a thoughtful discussion of the impact financial constraints have on Black businesses. Their comparison of firm characteristics and employment, coupled with the analysis of the CRA, underscores a key roadblock Black businesses must overcome: greater financial constraints that limit their growth, employment, and success. Looking forward, increasing financial access through policies like the CRA can help alleviate the disparities we see today.

In the News

D Magazine asks "What's a convention center worth to Dallas?"  They question whether the city should put hundreds of millions into upgrading its convention center, given the dim prospects for this business and the ruinous competition among cities.  They cite a City Observatory op-ed co-authored by former Seattle Mayor Mike McGinn and Joe Cortright, outlining the grim economics of the convention business.

The Week Observed, July 30, 2021

What City Observatory did this week

Oregon Department of Transportation's Climate Fig-Leaf.  Transportation is the largest source of greenhouse gases in Oregon, and the state's Department of Transportation is—yet again—advancing PR-heavy strategy documents that contain no measurable objectives or accountability.  The latest plan, a so-called "Climate Action Plan," repeats disproven climate myths (that idling in traffic is a key source of greenhouse gas emissions, or that electronic freeway signs will reduce carbon emissions).

A cynical fig-leaf from Oregon DOT

Instead of real action items, the document offers a string of mostly meaningless busy-work tasks, none of which have any demonstrable effect in reducing greenhouse gas emissions. Strikingly, the document doesn’t acknowledge that transportation accounts for 20 million tons of greenhouse gases in Oregon a year, and that amount has gone up since ODOT first advanced its “Sustainable Transportation Strategy” eight years ago.  When you read the details of the plan, the agency’s real priorities are apparent:  getting more money to build roads.  The Oregon Department of Transportation is complicit in concealing and worsening climate change, just as the state is being plunged into record heatwaves and wildfires.

Must read

1. Why solving climate change requires tackling land use.  The transportation sector is the largest source of greenhouse gas emissions in the United States, emitting 1.6 billion metric tons of CO2 equivalents annually. Writing for the Center for American Progress, Kevin DeGood argues:

"Climate change cannot be addressed without reforming land use, and land use cannot be changed without reforming transportation."

So, how can we reform transportation and land use? DeGood highlights the solutions necessary to fix unsustainable transportation systems by comparing the transportation systems of Washington, DC and Ohio, and by advocating for the INVEST Act. Urban density promotes more economical transportation: each mile of roadway in the District of Columbia supports 4 1/2 times as many residents as a mile in Ohio, and over 50 percent of the District's residents commute by transit, bike, or on foot. The capital's system is more efficient, productive, and environmentally friendly than Ohio's, showcasing the potential for a sustainable transportation and land use system at a human scale. The INVEST Act, already passed by the House, revises national transportation goals and creates new environmental benchmarks for states to receive federal funding. DeGood argues that these steps need to be taken to combat climate change and create a healthier, more sustainable nation.

2. How we saved 3 hours a week by not commuting. The Bureau of Labor Statistics recently released the American Time Use Survey results from May through December of 2019 and 2020. The survey tracks time spent on work, leisure, travel, childcare, and more, and its comparison of 2019 and 2020 presents some compelling takeaways. As many of us experienced, the prevalence of remote work increased in 2020: the rate nearly doubled, rising 20 percentage points from 2019. For many, the commute to work transformed into a short walk to the desk. Time spent traveling decreased by 26 minutes per day, from 1.2 hours in 2019 to 47 minutes in 2020. On average, we saved about 3 hours a week by not commuting this past year. How'd you take advantage of that time?
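The weekly figure follows directly from the daily one. A quick arithmetic sketch using the numbers quoted above (note that the rounded 72- and 47-minute daily totals give 25 minutes, close to the quoted 26):

```python
# Checking the commute-time arithmetic from the American Time Use Survey figures.
minutes_2019 = 1.2 * 60   # 1.2 hours per day = 72 minutes
minutes_2020 = 47
daily_savings = minutes_2019 - minutes_2020   # 25 minutes with these rounded figures
weekly_savings_hours = 26 * 7 / 60            # using the article's 26-minute figure
print(round(weekly_savings_hours, 1))  # 3.0 hours per week
```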

3. The electric car obsession is getting in the way of reducing transportation greenhouse gases.  More and more electric cars are on the road today, but electrification can't be the only means of reducing transportation greenhouse gases. Focusing solely on an electric fleet of automobiles is hindering progress toward net zero emissions: it would take decades to turn over the global fleet of fossil fuel cars. Christian Brand wants you to put down your Tesla catalogue and consider a better solution: active travel. Brand and his associates at Oxford present their research on active transportation (walking, cycling, and e-biking) and its ability to reduce global emissions. Emissions from one bike ride can be more than 30 times lower than a drive in a fossil fuel car, and about 10 times lower than a trip in an electric car. Brand advocates for active travel and urges cities to make roads safer and more accessible for bikes and pedestrians. He asserts,

“Active travel can contribute to tackling the climate emergency earlier than electric vehicles, while also providing affordable, reliable, clean, healthy and congestion-busting transportation.”

If we want to get to net zero emissions quickly, we must consider the importance of active travel.
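The two ratios Brand quotes also imply a rough comparison between electric and fossil fuel cars. A back-of-envelope sketch in illustrative units (normalizing a fossil fuel car trip to 1.0; these are per-trip approximations implied by the article's ratios, not measured values):

```python
# Implied per-trip emissions ordering, in illustrative units.
fossil_car = 1.0
bike = fossil_car / 30       # "more than 30 times lower" than a fossil fuel car
electric_car = bike * 10     # a bike ride is "about 10 times lower" than an EV trip
print(round(electric_car, 2))  # 0.33
```

In other words, the quoted ratios imply an electric car trip still emits roughly a third as much as a comparable fossil fuel car trip, which is why Brand argues mode shift beats fleet electrification on speed.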

4.  Cities aren't cesspools. The narrative that cities are hellacious centers of violence, crime, and deadly vices is so 1975. As the Covid pandemic and the culture wars remind us, the anti-urban impulse is a recurring theme in American politics, one that widens the disconnect between rural and urban America. Writing for the New York Times, Paul Krugman criticizes the myth and those who benefit from it. Krugman disputes the narrative of urban doom by exploring the social problems of the 'eastern heartland' as well as the impacts of COVID-19 on the nation. The mythology of urban vice and rural virtue overlooks both the common strengths and problems of the two regions, and neatly elides the fact that powerhouse urban economies generate the bulk of the national revenue that subsidizes red states and rural areas.

 

The Week Observed, September 17, 2021

What City Observatory did this week

The cost of Oregon DOT's Rose Quarter project has nearly tripled to $1.25 billion.  Just four years ago, the Oregon Department of Transportation sold its mile-and-a-half long I-5 freeway widening project through Portland as costing a mere $450 million.  Earlier this month, it revealed new cost estimates showing the project will cost almost triple that amount (as much as $1.25 billion).  And that's before any payment of interest on the borrowing that will be required to move the project forward, much less the cost of building housing and rebuilding schools promised as a way of mitigating the damage the freeway did to the historically Black Albina neighborhood.  Massive cost overruns are a regular feature of ODOT projects, which have routinely posted final costs 2 to 3 times the estimates floated when they were approved.  Just as the agency famously dynamited a whale on an Oregon beach in 1970, this is another thing that's blowing up in the agency's—and taxpayers'—faces.
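The "nearly tripled" characterization is easy to verify from the two figures given:

```python
# Cost escalation on the Rose Quarter project: original estimate
# versus the new high-end estimate.
original = 450_000_000      # 2017 estimate, dollars
revised = 1_250_000_000     # 2021 high-end estimate, dollars
ratio = revised / original
print(round(ratio, 2))  # 2.78
```

A ratio of about 2.8 is indeed "almost triple," and that's before financing costs and mitigation commitments are added in.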

ODOT then: Exploding Whales. ODOT Now: Exploding Budgets

Must read

1. The chain reaction of building luxury apartments. Oftentimes, the sight of new luxury apartments leads to accusations of gentrification and rent increases. Advocates for affordable housing typically challenge this construction because they believe it only helps the wealthy. Full Stack Economics' Timothy Lee refutes that argument, highlighting research from Finland and the US to show the market-wide impact of luxury apartments. Last month, three Finnish economists published a research paper showing that high-end apartment construction sets off a chain reaction felt across all income levels. The economists wrote, "For each 100 new, centrally located market-rate units, roughly 60 units are created in the bottom half of neighborhood income distribution through vacancies." The new construction led to housing options for rich and poor renters alike. In America, research by economist Evan Mast finds similar results: chain reactions occur when new apartments are built. Lee emphasizes the positive impacts of construction, regardless of market segment. When new luxury apartments are constructed, vacancies open up in older buildings; the more new buildings constructed, the faster older buildings "filter down" to lower income tiers. Lee argues that the goals of developers and affordable housing advocates do not conflict as much as it may seem. More housing is good housing – even the luxury high-rises.

2. Half a million public EV chargers is not the answer.   The discourse on electric vehicles has been all the rage recently, particularly with President Biden's goal of 50% EV sales by 2030. Just last week, the New York Times published an article announcing the US was in need of hundreds of thousands more public chargers. It argued that in order to make the adoption of EVs feasible, fast public chargers would need to be placed across the nation, much like gas pumps. However, Vice's Aaron Gordon does not agree. In this article, the author refutes the idea that a lack of public chargers is a major barrier to the EV industry. Instead, he challenges readers to change their conception of an electric vehicle. While a conventional car needs a gas pump, an electric vehicle does not need a public charging station to go where it needs to. Gordon explains,

“Most EV owners charge their cars like their phones, plugging it in overnight and having a full charge that lasts all day or more in the morning. In fact, a car charge will last much longer than your phone, on average. You’ll probably use a public charger much less often than you plug in your phone away from home.”

With the ability to charge at home, an electric vehicle eliminates the constant need for public refueling centers. Gordon offers a thorough explanation of why adding half a million public EV chargers, as the New York Times suggested, would be unsustainable, expensive, and wasteful. The author asserts that "the path to a better, greener future is actually far more achievable than one might think or experts are advising." The future of electric cars is viable. Adding unneeded infrastructure will only make this future harder to achieve.

3. The disappearance of long commute times in the Seattle area.  The rise of remote work as a result of the COVID-19 pandemic has led to drastic changes in people's travel patterns. Which commutes have been altered most? The Seattle Times' columnist Gene Balk uses Nielsen data to explore the changes in the work commutes of Seattleites. Balk found that the number of people who did not commute more than doubled from pre-pandemic life. Breaking down the data by commute time showed a compelling relationship: nearly 250,000 workers who had commutes over 20 minutes no longer traveled to work, while there was a slight increase in workers commuting under 20 minutes. In particular, there was a 34% decline among 30-59 minute commutes. Remote work eliminated longer commutes at a staggering rate. Balk compared these results to occupational status and found a modest correlation: according to the data, a higher proportion of white collar workers lived more than 20 minutes from their office, and these workers were also more than twice as likely to work remotely as those in blue collar occupations. Remote work changed the lives of many throughout the past year, and for nearly a quarter of a million people in the Seattle area, it terminated the long commute.

New Knowledge

France's SRU Law and its Potential in the US.    At the beginning of the century, France took a major leap in combating economic segregation in its housing. In December 2000, the Loi relative à la solidarité et au renouvellement urbains (the SRU law) passed, jumpstarting a more even redistribution of affordable housing across all communities in French metropolitan areas. In 2013, the law was strengthened, mandating that urban metro areas reserve 25 percent of their housing stock for social housing by 2025. (Unlike inclusionary housing requirements in the US, which require private developers to set aside privately built units as affordable housing, the SRU law applies to the construction of social housing, paid for directly with public funds.) Dr. Yonah Freemark recently published an evaluation of the law's effectiveness in France and tested the applicability of this housing initiative in the United States.

In his study, Freemark finds that the law led to a more equal distribution of subsidized housing in French urban areas. In particular, the SRU law was effective in opening up neighborhoods that had historically excluded low-income housing. In the French cities with the lowest initial shares of social housing, the number of these affordable units increased from 626 in 1999 to 4,737 in 2017. Wealthier communities that previously had little affordable housing saw significant increases in their shares of social housing, highlighting the law's success. Segregation by income across the country decreased as a result. While Freemark finds that France's goal of 25 percent social housing by 2025 is unlikely to be met, the SRU law has shown its ability to reshape urban housing markets.

The study then examines the possibility of a similar fair-share housing act being implemented in American urban areas. Freemark compares France's housing distribution to the state of Connecticut. He finds that Connecticut's affordable housing stock is significantly lower in urban areas, and that it is distributed far less evenly across metropolitan areas than France's. Freemark claims that publicly subsidized affordable housing in Connecticut is more inequitably distributed than France's ever was. This highlights the room for improvement that states like Connecticut could realize by implementing programs similar to the SRU law. While there are clear limitations, Freemark asserts that such programs could benefit thousands of American families. France's fair-share SRU law has been effective at creating more equitable housing across the country, and it is a model that could aid low- and moderate-income families in the United States as well.

In the News

Strong Towns has a great article profiling freeway fights across the country, plaintively asking why state DOTs are planning like it's 1961, instead of climate-crisis 2021.  They focus a spotlight on City Observatory's coverage of Portland's proposed Rose Quarter freeway widening project, calling out the Oregon Department of Transportation's cynical woke-washing of the project.

The Week Observed, September 10, 2021

What City Observatory did this week

Talkin’ ’bout my gentrification.  Jerusalem Demsas of Vox has a thoughtful synthesis of what we know about gentrification.  If we’re concerned about poverty and inequality, gentrification is far from the biggest problem we face. Gentrification is surprisingly rare, and while it brings inequality into sharp focus, there’s precious little evidence of widespread harms.

The bright spotlight shining on a relative handful of gentrified neighborhoods hides in the shadows the real and far more pervasive problem of concentrated poverty. And, as we've written at City Observatory, the myopia of gentrification scholarship means that few studies ever consider the counterfactual case of asking what happens to people who live in poor neighborhoods that don't rebound.  The most common anti-gentrification tactics—blocking new market rate housing—actually make housing affordability and displacement worse.

Must read

1. What is the conservative position on zoning?  Libertarianism, local control or "owning the libs"?  Writing at the blog Pocket Track, Alex Schieferdecker has a thoughtful essay digging into conservative politics and political philosophy as applied to land use.  There's a division on the right, according to Schieferdecker.  Philosophically, the libertarian and less-government conservatives ought to be opposed to most land use controls, which ought to put them in league with the YIMBY activists.  After all, what is single family zoning but a constraint on the property rights of a landowner?  Meanwhile—and with a substantial boost from Donald Trump and right-wing media—other conservatives have come out swinging to protect the ability of governments to strictly regulate property (as long as they're local governments and they're regulating out apartments).

The underlying divide between right wing pro-zoning politicians and latent-YIMBY libertarians has everything to do with zoning’s key role in establishing and enforcing class and racial divisions.  As Schieferdecker writes:

. . . the intention of the vast majority real-world zoning laws has been to separate people from each other. These laws were originally written explicitly to separate different races until that overt approach was ruled unconstitutional. Cities then seamlessly pivoted to writing zoning laws that separated different styles of housing as a proxy for social classes which in turn were a proxy for race. Not everyone is willing to say it out loud, but the central purpose of much of American zoning remains the preservation of spatial income segregation. The identical idea is instead smuggled into the public conversation under the guise of “protecting property values” or “preserving community character.”

2. Glaeser & Cutler on the shortage of cities.  In a Wall Street Journal op-ed that previews a key theme from their new book—"Survival of the City"—economists Ed Glaeser and David Cutler voice a familiar theme:  Many thriving, desirable cities, especially in California, have blocked the construction of new housing that would allow more people to participate in their productive economies.  Blocking housing construction in productive cities pushes up home prices, leads to sprawl and lengthened commutes, and contributes to greater energy use and greenhouse gas production.  An essential first step, according to Glaeser and Cutler, is getting state legislatures to mandate that local governments allow more housing, something that is starting to happen, but in their view needs to go considerably further.

With housing, the key actors are state legislatures, because they can rewrite the rules of local zoning on a dime. Last month, the California legislature passed a law that could make the permitting of two-unit projects far more automatic. It’s a good beginning, but states should go further and only allow localities to impose regulations and rules that have gone through rigorous cost-benefit analysis.

Overall, this op-ed echoes a key theme we’ve pursued at City Observatory:  the demand for cities and urban living has far outstripped the supply of great urban places. Improving our cities, and building more housing in the places where demand and opportunity are abundant is a key to addressing many national challenges.

3. Americans don't really hate density.  The latest version of a Pew survey on Americans' residential neighborhood preferences has invited a wide range of punditry.  The survey claims to find that, by a 60 to 39 percent margin, Americans would prefer to live in larger homes more distant from schools, stores, work and other common destinations.  While some see the demise of urbanism, or the lingering effects of the pandemic, we think Alex Pareene has a far more nuanced take.  The question of the ideal community has gotten bound up in the culture wars: the real motivation has less to do with urban form than with the ability to determine who one's neighbors are:

White Americans want an endlessly appreciating asset and the ability to police who their neighbors are and what they do. Housing segregation, suburban sprawl, and planned communities are how they won those things, and how they protect them. Automobile dependence is mainly a necessary side effect.

As Pareene points out, when it comes to vacationing at Disneyland or going on an ocean-cruise, middle and upper class Americans have no resistance to dense, car-free environments–they even pay a premium for the privilege.  What’s really at work are issues of class and control.

New Knowledge

The red-blue dimension of Covid vaccination.  New data on county level vaccination rates underscores how the profound political divide in the United States is affecting public health.  People who live in counties that voted most strongly for Joe Biden have the highest rates of vaccination; those who voted for Donald Trump have the lowest.    The following scattergram, using the red/blue partisanship convention shows vaccination rates of US counties as of August 31, with county population proportional to bubble size.  The fitted regression line implies that on average a county that voted 65 percent for Biden had about a 55 percent vaccination rate, while a county that voted 65 percent for Trump had a 40 percent vaccination rate.

Over the summer, the relationship between county partisanship and vaccination status has grown steadily stronger.  In the Spring, county vote for President explained only about 5 percent of the variation in vaccination rates across counties; today it statistically explains about 45 percent of the variation.  Unlike the previous waves of the pandemic, when there was no available vaccine, the current wave, driven by the Delta variant is plainly a product of our political division.
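The fitted line described above is fully determined by the two quoted points. A small sketch recovering its slope and intercept, assuming the relationship is linear (as the fitted regression line implies; the specific numbers below just restate the two points from the text):

```python
# Recover the implied regression line from two quoted points:
# a 65%-Biden county at ~55% vaccinated, and a 65%-Trump
# (i.e., 35%-Biden) county at ~40% vaccinated.
x1, y1 = 65, 55   # Biden vote share (%) -> vaccination rate (%)
x2, y2 = 35, 40

slope = (y1 - y2) / (x1 - x2)   # points of vaccination per point of Biden vote
intercept = y1 - slope * x1

def predict(biden_share: float) -> float:
    """Predicted county vaccination rate under the implied linear fit."""
    return intercept + slope * biden_share

print(slope, intercept, predict(50))  # 0.5 22.5 47.5
```

Every additional point of Biden vote share is associated with about half a point of additional vaccination, and an evenly split county would sit near 47.5 percent vaccinated under this fit.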

Charles Gaba, Weekly Update: U.S. #COVID19 Vaccination Levels By COUNTY & Partisan Lean, ACAsignups.net, August 31, 2021.

 

The Week Observed, April 1, 2022

What City Observatory did this week

The Cappuccino Congestion Index.  Media reports regularly regurgitate the largely phony claims about how traffic congestion costs travelers untold billions of dollars in wasted time. To illustrate how misleading these fictitious numbers are, we’ve used the same methodology and actual data to compute the value of time lost standing in line waiting to get coffee from your local barista. Just like roadways, your coffee shop is subject to peak demand, and when everyone else wants their caffeine fix at the same time, you can expect to queue up for yours.

Just as Starbucks and its local competitors don’t find it economical to expand their retail footprint and hire enough staff so that wait times go to zero (your coffee would be too expensive or their business would be unprofitable) it makes no sense to try to build enough roads so that there’s no delay. Ponder that the next time you’re waiting for your doppio macchiato.

Must read

San Francisco struggles with rezoning.  UC Davis Law Professor Chris Elmendorf has a thoughtful–and blistering–critique of San Francisco’s proposed up-zoning plan.  Here’s the background:  State law requires cities to zone for a share of their region’s future housing needs as part of a process called RHNA (regional housing needs allocation).  San Francisco’s share for the next decade is about 80,000 new homes.  In the past, RHNA requirements were toothless, and cities could (and did) make plainly unrealistic assumptions about what could be developed based on zoning.  Amendments to state law give the state housing agency the ability to make tougher demands, including forcing cities to use much more realistic assumptions about how much of the land that is zoned for development will actually be developed in the next several years.  Essentially, cities must evaluate the probability that up-zoned parcels actually get built on, and then assure there’s enough zoned capacity that the goal will be met.  But in the process of doing its analysis, Elmendorf argues, the city has dramatically exaggerated how many units will get built, and its policies count on new housing getting built in locations that are simply uneconomic:

Finally, after all the massaging of numbers, San Francisco concludes that it ought to rezone for roughly 22,000 more homes, and that for fair-housing reasons, they should be located on west side of city. Some housing advocates are rejoicing.  But: in connection with its analysis of constraints, San Francisco hired a consultant for pro-forma analysis of different types of housing projects in different areas…and the consultant concluded that *nothing pencils out on the west side*.   On the basis of that study, the city planning department says that with current permitting process, impacts fees, exactions, and construction costs, the *only* kind of project that’s economically feasible is a 24+ story high-rise in city’s highest-demand neighborhoods. Yet San Francisco “plans” to meet its 22,000 unit shortfall (after hand-waving) by rezoning west-side corridors for 55′-85′ projects that per city’s own analysis would have *negative* rate of return. This is a cruel joke. Except it’s no joke.

New York’s Hotels-to-Affordable Housing Stumbles Over Zoning Requirements.  The Covid pandemic caused hotel occupancy to plummet, and created an opportunity to transform under-used hotel rooms into affordable housing.  New York City set aside $100 million to buy and renovate hotels, but so far, precious little has happened, according to Bloomberg.  The key problem:  zoning and building code requirements create delay, uncertainty and higher costs for potential conversions.  One organization looking to convert hotels reports:

Changing a property’s certificate of occupancy to residential use is time-consuming and triggers building code requirements that are costly or impossible to comply with, said Breaking Ground’s Rosen. . . . The HONDA Act also required that every room have a bathroom and kitchen or kitchenette, even though about 80% of the city’s 120,550 hotel rooms are in Manhattan and most are too small to accommodate even a small cooking area. Adding kitchens and complying with other building code requirements for residential buildings would require expensive renovations.

It’s a reminder that even where there is money for affordable housing, zoning and building regulations are still a huge barrier.

No, we didn’t need to destroy our cities with urban freeways.  It’s well known that in the 1950s and 1960s, highway departments (and some cynical city leaders) chose to plow interstate highways through the middle of urban neighborhoods, especially those occupied by low income residents and people of color.  As we reported at City Observatory, even neighborhoods like Greenwood in Tulsa, which were literally bombed and burned by white mobs, managed to be rebuilt, only to be finally done in by the construction of interstate highways.  The Daily Show’s Trevor Noah recounted this story, but then largely excused it, saying “well, the freeways had to go somewhere.”  Tony Dutzik of The Frontier Group begs to disagree:

About halfway through detailing the history of racism in mid-century highway construction, Noah said something that caught me short:

“Now don’t get me wrong: The highways had to go somewhere, right?”

To which I immediately responded in my head: Did they? 

The answer is no. Plowing multi-lane expressways through the middle of American cities was a choice – and a colossally bad one. Those highways did more than just wipe out poor and minority communities or reinforce racial boundaries – they cut the literal heart out of many of our cities, slashed their tax base, accelerated the movement of people and wealth to the suburbs and cemented our dependence on cars, subjecting generations of Americans to health-threatening levels of air pollution and increasingly rapid climate change.

As Dutzik points out, virtually no European cities made this mistake, and the US cities that tore out urban freeways have built thriving neighborhoods in their place.  This tale of the supposed inevitability of urban freeways is evidence of the deep-seated bias in our national conversation, and as Dutzik puts it, a failure of imagination.  If we’re going to make progress on these issues, whether climate, car-dependence, or building affordable, inclusive cities, we have to overcome this kind of blinkered thinking about possibilities.

New Knowledge?

Editor’s note:  Our “New Knowledge” feature generally highlights recent research that we think our readers will find informative.  This week, as we do occasionally, we highlight some recent research about which we have questions.

School Closures and the Gentrification of the Black Metropolis. This paper looks at the correlation between school closures and neighborhood gentrification.  It purports to have found that the closing of schools in predominantly Black neighborhoods is associated with a higher probability of gentrification.

To start with, it’s important to focus on the paper’s definition of gentrification:  if a low income census tract experienced any real increase in housing prices, or an increase in its educational attainment in excess of the citywide average, it was considered to have gentrified.  The study reports that 29 percent of all eligible tracts gentrified.  That’s a high number because it’s a low threshold:  this definition doesn’t necessarily signal a wholesale change in a neighborhood’s demographic composition.

Moreover, the study actually buries the lede about gentrification in Black communities:  consistent with other research, this study shows that predominantly Black and Hispanic neighborhoods are dramatically less likely to gentrify than otherwise similar white neighborhoods.  Hispanic neighborhoods are 83 percent less likely than white neighborhoods to gentrify; Black neighborhoods are 54 percent less likely to gentrify.  The authors’ finding is that Black neighborhoods with school closures are more likely to have experienced gentrification than Black neighborhoods without school closures.  The effect, though statistically significant, is not large:  about 27 percent of Black neighborhoods with school closures gentrify, compared to 19 percent of Black neighborhoods where schools don’t close.  It’s still the case that white neighborhoods, with or without school closures, are more likely to gentrify than Black neighborhoods with school closures.

Francis A. Pearman, II & Danielle Marie Greene, School Closures and the Gentrification of the Black Metropolis, Stanford Center for Education Policy Analysis, CEPA Working Paper No. 21-02.  February 2022.

 

The Week Observed, April 15, 2022

What City Observatory did this week

A universal basic income . . . for cars.  One of the most widely discussed alternatives for tackling poverty and inequality head on is the idea of a “Universal Basic Income”–a payment made to every household to assure it has enough for basic living expenses.  While there have been a few experiments and a lot of political hyperbole, it hasn’t really been tried at scale.  But now, California is on the verge of enacting a universal basic income–except that instead of being for people, it’s for cars.  California Governor Gavin Newsom has proposed giving California households a $400 debit card for each car they own, up to two.  It’s a symptom of our deep car dependence that, faced with somewhat higher gas prices (still lower, in inflation-adjusted terms, than a decade ago), politicians are falling all over themselves to insulate cars and driving from their real costs.

Look who’s going to get a Universal Basic Income (wikipedia)

 

It speaks volumes that we’re so quick to allocate resources to cars, and so reluctant to muster similar energy when it comes to tackling poverty.

Must read

Darrell Owens on exclusionary zoning for poor neighborhoods.  California is wrestling with statewide efforts to end apartment bans and allow more housing to be built where people want to live.  One subtext of this debate is an argument that upzoning ought to be restricted to higher income and predominantly white neighborhoods, and that lower income neighborhoods, especially those with a substantial population of color, ought to be exempted.  There’s a kind of historical symmetry to this argument:  it’s typically been the case that higher income white neighborhoods have used density restrictions to bar development, with the result that housing demand pressures get displaced to lower income neighborhoods.  But as Darrell Owens points out, the claim that we’re doing these “sensitive” neighborhoods any favors by limiting upzoning is a fundamental mistake.  Precluding upzoning in lower income neighborhoods would just tend to intensify gentrification pressures in those areas:

. . .  the only thing single-family zoning does anyhow is preserve a lot of old houses for buyers who want single-family dwellings. Rich people consume a lot of space and energy, a lot more than working class people, so single-family houses will always be the most appealing to them. Keeping only inefficient, single-family homes in poor areas while densifying affluent ones would just focus wealthy demand for that housing in those poorer communities anyhow—likely exacerbating gentrification.

As we’ve pointed out before, there’s an assumption that if a neighborhood doesn’t allow change in its housing stock, that it will somehow remain affordable, when in fact, limiting supply has exactly the opposite effect.

More climate denial on I-5.  As regular City Observatory readers will know, we’ve closely followed the devastating climate impacts that will arise from planned widenings of I-5 in the Portland metro area.  Between Tacoma and Olympia, the Washington Department of Transportation is pursuing another freeway widening project, one which will widen the roadway across the Nisqually Delta at the south end of Puget Sound.  Ryan Packer, writing at The Urbanist, shows that WSDOT is pressing ahead despite studies showing that alternative land use policies that would reduce vehicle travel are a more environmentally sound option:  widening I-5 would raise greenhouse gas emissions by more than 2 percent, while better land use would reduce them by more than 1 percent.  WSDOT is pushing forward with a widening project that would cost billions and increase greenhouse gases, in large part because a strategy focused on land use changes wouldn’t allow it to be the lead agency in the NEPA review.

Another Exploding Whale for the Oregon Department of Transportation.  Highway megaprojects routinely blow through their announced budgets.  A proposal to rebuild and widen the I-205 Abernethy Bridge south of Portland has just seen its budget increase by about a third.  This is nothing new for the Oregon Department of Transportation, which has ultimately ended up recording 100 percent cost overruns on virtually all of the megaprojects it has undertaken in the past two decades.  Just last year, ODOT said the project would cost $375 million.  Bids for construction came in higher than expected, ballooning the project’s cost to $500 million.

The miscalculation is ominous for two reasons:  it suggests that either ODOT low-balled the cost estimate just nine months ago, or that construction cost inflation is soaring out of control.  Either way, that should give leaders reason to doubt the estimates ODOT is making for its other billion-dollar-plus projects.  And how will these cost overruns be paid for?  Either other projects and priorities will be short-changed, or I-205 users will have to pay even higher tolls, or both.

In the news

BikePortland featured a summary of Joe Cortright’s presentation to the April 11 YIMBYtown session on transportation, land use and housing.
The Portland Business Journal quoted Joe Cortright’s comments on the impact of inclusionary zoning requirements on new housing construction in Portland.
An article published by Governing quoted City Observatory’s reporting on the Oregon Department of Transportation’s efforts to “woke-wash” its freeway expansion projects with diversity-themed stock photos.

The Week Observed, April 22, 2022

What City Observatory did this week

How sprawl and tax evasion are driving demands for wider freeways.  The Oregon and Washington Departments of Transportation are proposing to spend roughly $5 billion to widen a 5 mile stretch of I-5 between Portland and Vancouver.  The case for the widening is based on the need to accommodate a supposedly inexorable increase in traffic.  But a closer look at traffic demand shows that car travel between Vancouver and Portland has been fueled by exurban sprawl in Southwest Washington, as well as massive sales tax evasion by Washington residents.  Some 93 percent of expected growth in peak hour traffic to Oregon is expected to come from new development at Vancouver’s urban fringe.

And roughly 10 to 20 percent of all car trips across the Columbia River are made by Washington residents shopping in Oregon, who save roughly $120 million annually by avoiding Washington’s 8 percent sales tax in tax-free Oregon.
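The scale of cross-border shopping implied by those figures is easy to back out. A rough sketch, assuming the $120 million is pure sales-tax savings at a flat 8 percent rate:

```python
# Back out the volume of Oregon purchases implied by the quoted tax savings.
# Assumption: the $120M figure is entirely sales-tax avoidance at a flat 8%.
annual_tax_savings = 120_000_000   # dollars saved per year by WA shoppers
wa_sales_tax_rate = 0.08           # Washington sales tax rate cited above

implied_purchases = annual_tax_savings / wa_sales_tax_rate
print(f"${implied_purchases / 1e9:.1f} billion")  # $1.5 billion per year
```

In other words, the quoted savings imply on the order of $1.5 billion a year in Washington residents’ purchases shifting across the river.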

Must read

Still NIMBY in Minneapolis.  Minneapolis famously gained national attention for its path-breaking rules allowing duplexes and triplexes in most of the city’s single family zones.  That’s real progress, but much of the zoning deck is still stacked in favor of NIMBY interests.  A good example is related by MinnPost’s Bill Lindeke, who tells of local resident Cody Fischer’s plans to build a four story, 26 unit apartment building on an inner city lot.  The developer is keenly attuned to urban interests, with plans for passive, low-energy construction and deluxe bike storage, on a site adjacent to good transit service.  It’s exactly the kind of urban density called for in the city’s 2040 plan, which passed the City Council on a 12-1 vote three years ago.  But construction of Fischer’s building is still subject to approval by the planning commission, which, after hearing local objections about parking, shadows and a lack of neighborhood character, voted 6-3 against approving the project.  The local city councilor made the motion to decline the project, saying it was “just not a good project” and that Fischer hadn’t been “neighborly.”  The case is a reminder that city plans are of little value if they don’t guide the decisions to actually build things.  And in cities with ward or district systems for electing city councilors, it may be easy to get agreement on sweeping generalities at a municipal level, and practically impossible to get permission to move ahead with a specific development when irate neighbors show up at city hall and button-hole “their” council member.

The high cost of parking requirements in Los Angeles.  Writing in the Los Angeles Daily News, Shane Phillips lays out the case against parking requirements.  Los Angeles, he says, allocates ten times the area of Manhattan to car storage, with profound effects on urban form, transportation costs and equity.  Much of this is a product of the city’s zoning code, which requires new commercial and residential development to set aside vast amounts of otherwise productive land for car storage.  As Phillips explains, the overallocation of land to parking comes with a range of detrimental effects:

This massive parking supply feeds our car dependency, and it contributes to the same destructive consequences: environmental degradation and greenhouse gas emissions; long commutes; poor health; dangerous streets; car-oriented architecture; financial burdens for those who own a car; and limited mobility for those who don’t, or can’t.

The hopeful message here is that simply repealing parking mandates can go a long way to addressing this problem. San Diego repealed many of its parking requirements, with the result that developers stepped up and built more housing, which helps ease supply shortages.

Climate Kids v. Biden.  President Joe Biden visited Portland to tout his infrastructure plans, and local youth climate activists had a message for him: Climate leaders don’t widen freeways.  Taylor Griggs writing at Bike Portland described the activists preparations for the presidential visit.  Their position:

“Youth activists will make our message clear yet again: Climate leaders don’t widen freeways. Biden must choose which side he’s on – futures, or freeways. Because he cannot be on both.”

There’s little question that the nation’s infrastructure can use repair and refurbishment.  But as these climate advocates argue, this shouldn’t be an excuse for building vast new roadways that simply encourage more sprawl, car dependence and carbon emissions.

In the news

City Observatory’s Joe Cortright is featured in The New York Times article, “Can Portland Be a Climate Leader Without Reducing Driving?”  The answer to the title headline is decidedly “no.”

Times reporters Brad Plumer and Nadja Popovich call Cortright a “fierce critic” of Portland-area highway expansion projects.

The Week Observed, April 29, 2022

What City Observatory did this week

The folly of the frog ferry.  One bane of transportation policy discussions is the tendency to believe that miracle technical fixes—self-driving cars, personal aircraft, the Segway, or Elon Musk’s car tunnels–are going to overcome the physics, geometry and economics that make transportation a hard problem.  The latest iteration of this fixation is in Portland, where water transport advocates have been pushing for a “Frog Ferry” to connect Vancouver, Washington and Portland, Oregon.  Ferry promoters have a clever video asking people to imagine a faster trip by boat, but the trouble is that ferries aren’t competitive with either private cars or, importantly, existing transit buses.  There is a river route, but it turns out to be roughly twice as long as traveling by the roadway.

And even the fastest water ferries can’t ply the distance between the two cities as quickly as today’s buses running in mixed traffic.  In fact, the ferry’s likely to be 20 minutes slower than the 40 minute bus ride, and leave you at the riverside, rather than near your destination.  A slow boat to nowhere isn’t going to be a competitive mode of transportation in the Rose City.

Must read

The hidden cost of “free” transit.  There’s a decided inequity in the way we finance urban transportation.  Almost everywhere, it’s free to store private vehicles in the public right-of-way.  But in most cases, you can’t set foot on a public bus without paying a fare for each trip.  Little surprise, then, that eliminating transit fares seems like a good way to rectify this inequity, and a great way to entice people out of cars, reducing driving and pollution.  Writing at Bloomberg, David Zipper challenges this notion, saying that the real world evidence for this environmental benefit is weak:

 After more than a decade of transit agencies around the world experimenting with free trips, it’s far from clear that dropping fares delivers an environmental upside. It boils down to this: If fare-free transit doesn’t substantially reduce driving, it’s not mitigating emissions or slowing climate change. And all signs suggest that it doesn’t.

And in a world of limited resources for transit, eliminating fare revenue reduces the money available to pay drivers, and fewer driver hours means less transit service.  When transit service provides value for money (access to important destinations and good frequency), riders are happy to pay.  Free service that comes less often is not a good deal for them, or a benefit to the environment.  And perhaps the best way to promote greater equity and reduce driving would be to ask cars to pay their way.

A YIMBY Victory in San Francisco.  The deep blue bastion of San Francisco may seem like an unlikely place for political trend spotting, but a special election runoff for a vacant State Assembly seat signals an important shift.  KQED reports that the race pitted two longtime San Francisco Supervisors—Matt Haney and David Campos—against each other.  Both are strong progressives, but the issue that separated the two was YIMBY v. NIMBY.  Haney has supported a wide range of efforts to expand housing supply; Campos has spoken out in favor of affordable housing, but voted against proposals to expand market rate housing.  For many San Francisco voters, housing became the dividing line between the two candidates.  In a bid to drive down the city’s sky-high rents, Haney has consistently advocated for an increase in overall housing construction, while Campos has pushed to prioritize affordable developments over everything else.  The city approved more than 2,000 housing units per year in Haney’s district, compared to fewer than 200 per year in Campos’s district.

Krugman: Celebrate Earth Day by making it easier to build housing in cities.  San Francisco’s special election caught the eye of New York Times columnist and economist Paul Krugman, who singled out the housing issue and its national implications.  If the YIMBY victory in San Francisco foreshadows a national shift, he says, the implications could be positive.  Not only are cities greener (people use less energy and create fewer greenhouse gases heating and cooling buildings and driving), but cities are more productive.  And our failure to build enough housing to meet the demand for urban living is a key driver of our national and local housing affordability problems.  As Krugman notes:

. . . cities have become highly desirable places to live and work . . . but they’ve become increasingly unaffordable, largely because of local-level opposition to new construction. . . .

Allowing greater density, Krugman argues:

“. . . would be good for the economy. Some people are willing to pay very high prices for urban housing because they’re more productive in big cities. So limiting density makes America poorer by preventing workers from making the best use of their talents.” 

New Knowledge

The rise in e-commerce during the pandemic.  The Census Bureau released its latest estimates of total US retail sales for 2020, and unsurprisingly, they show a sharp rise in e-commerce during the Covid-19 pandemic.  Total US retail sales surged to about $5.6 trillion in 2020, and e-commerce sales made up about $815 billion, or about 15 percent of the total.

Of course, the pandemic produced wide variations in retail sales trends across different sectors of the economy.  Sales at food stores increased noticeably, while sales at gas stations, clothing stores, and electronics and appliance retailers all fell by double-digit amounts.  E-commerce sales rose 43 percent from 2019 to 2020.

US Census Bureau, Annual Retail Trade Survey (ARTS), 2020.  https://www.census.gov/programs-surveys/arts/data/tables.html

In the news

Streetsblog USA republished our City Observatory commentary on the ways that tax evasion and sprawl fuel the demand for highway widening in Portland.

The Week Observed, May 6, 2022

What City Observatory did this week

Ten questions that deserve answers before making a multi-billion dollar decision.  The Portland metro area is being asked by the Oregon and Washington Departments of Transportation to give the go-ahead to a $5 billion, 5 mile long freeway widening project.  It would be one of the biggest infrastructure investments in the region’s history, but as the region is being rushed to judgment, there are ten as yet unanswered questions that are fundamental to making a good decision on this project.  Despite spending more than two years and tens of millions of dollars on reviving the failed Columbia River Crossing, the two state DOTs have consciously chosen not to provide answers to some of the most basic questions, including:  How much will the project cost?  Who will pay for it?  What tolls will be charged on this bridge (and on parallel routes)?  The two state DOTs have even concealed what the bridge would look like–although City Observatory has obtained, via a public records request, 3D renderings showing that a giant freeway bridge and its elevated approaches would loom over much of downtown Vancouver.

A giant new I-5 bridge would loom over downtown Vancouver and its redeveloping waterfront–views that have been hidden by ODOT and WSDOT.

The time to get answers to these ten very basic questions is before the two states irrevocably commit to spending billions of dollars on a project which, due to flawed traffic projections and a failure to correctly analyze toll impacts on traffic, may be wastefully over-sized.

Must read

Another reason to love carbon pricing.  The Frontier Group’s Tony Dutzik is a long time proponent of carbon pricing (as is City Observatory).  Looking at the energy use of bitcoin mining, Dutzik shows that there’s an important lesson for climate policy.  Bitcoin is “created” by a process of energy intensive computation, which itself generates additional greenhouse gases.  In response to this concern, some miners have shifted to greener power sources, like wind.  But as Dutzik points out, this is a classic example of the Jevons Paradox:  when something becomes cheaper, people use more of it.  So while bitcoin miners may make use of cheaper wind energy, that frees up dirtier energy to be used for other purposes.

. . . when it comes to the climate, the amount of clean energy we use is irrelevant. It’s the amount of dirty fossil fuels we burn that matters. And even if we produce enough clean energy to meet our needs, our progress toward a stable climate will be slower than needed if our consumption of energy expands without limit.

The simple minded idea that more clean energy necessarily means less dirty energy is simply wrong.  The advent of more (and cheaper) green energy just stimulates more overall demand for energy, and as long as fossil fuel doesn’t become progressively more expensive, pollution doesn’t decrease, or doesn’t decrease by much.  The economic solution is to use market forces, via a carbon tax, to push up the prices of fossil fuels, so that people use progressively less of them.

Flying cars will damage cities and democracy.  While The Jetsons billed flying cars as a travel mode for all, the reality of this technology promises to be dramatically less egalitarian and more damaging to the environment, writes Center for American Progress infrastructure expert Kevin DeGood.

General aviation already hinges on substantial subsidies from the public that chiefly benefit a small number of high income users.  Moreover, flying cars will be noisy, energy inefficient and face serious capacity constraints.  But the key problem of flying cars is that they produce negative social and environmental consequences:

Unfortunately, flying cars represent the technological apotheosis of sprawl and an attempt to eradicate distance as a fact of life for elites who are wealthy enough to routinely let slip the bonds of gravity. Proponents offer a utopian vision of seamless convenience and efficiency that delivers broad-based societal benefits. The inevitable reality is that flying cars will confer advantages on direct users while exacerbating the geographic isolation of elites—a spatial manifestation of deepening inequality that undermines the shared experiences that are necessary to sustain democracy.

Boeing moves its headquarters, again.  Before there was Amazon’s HQ2 beauty contest, one of the highest profile corporate headquarters relocations was Boeing’s 2001 move of its corporate headquarters from Seattle to Chicago, ostensibly to enable the company to become smarter and more capable by being enmeshed in that financial and professional service center.  While it was viewed as a debacle for Seattle at the time, the Emerald City economy has flourished (and boomed) since then, and Boeing moved only a few hundred jobs to Chicago.  The move to the Washington, DC metro area puts the company’s headquarters in the same town as Amazon’s HQ2:  Arlington, Virginia.  The federal government is arguably Boeing’s single largest client, and as with so many industries, being close to the source of policy-making may be a critical factor in corporate success.

New Knowledge

Average trips by distance for states and counties.  It’s an oft-repeated maxim that most travel involves only very short trips, and a new data source from the US Department of Transportation confirms that view.  The data, “Distribution of Trips by Distance,” contains estimates for all US states and counties, by month, from 2019 onward, of the number of trips taken by distance traveled.  It’s a huge database.

The estimates are generated from scaling up a sample of anonymized location data from mobile devices to detect travel patterns.  Home locations are imputed based on long duration stays in particular locations; trips are defined as being away from a home location for more than ten minutes.  The data are mode-blind (but probably bottom censor some very short walking trips).  A fuller description of the methodology is provided at the US DOT website.

Here’s an example of how the dataset works.  We’ve focused on Multnomah County, Oregon (which contains Portland).  It has about 820,000 residents, who took about 2.6 million trips per day in February 2022.  The chart below shows the trend in the number of trips by trip distance from 2019 onward; the decline in travel associated with the Covid-19 pandemic, beginning in the second quarter of 2020, is quite evident.

The website shows the distribution of trips by trip distance; a majority of all trips (55 percent) were less than three miles.  That’s powerful evidence that non-auto modes of transportation could be used for many daily trips.  Those very short trips don’t account for much total mileage, however; it’s likely that more than 90 percent of all miles traveled on a given day are accounted for by trips that are more than three miles long.

People in cities tend to take more short trips and fewer long ones than their suburban counterparts.  In Portland, for example, Multnomah County residents take about 2.5 under-one-mile trips for every 10-25 mile trip they take.  Residents in suburban Clackamas County take about 1.6 under-one-mile trips for every 10-25 mile trip; and in further outlying Yamhill County, the ratio is 1.4.
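The pattern of trips versus miles can be illustrated with a toy distribution. The buckets, trip counts, and average distances below are hypothetical numbers chosen for illustration; they are not the actual BTS data:

```python
# Why most trips can be short while most miles come from long trips.
# Hypothetical distribution: {bucket: (trips in millions, assumed avg miles)}.
buckets = {
    "<1 mi":    (0.8, 0.5),
    "1-3 mi":   (0.6, 2.0),
    "3-10 mi":  (0.7, 6.0),
    "10-25 mi": (0.4, 16.0),
    ">25 mi":   (0.1, 40.0),
}
short = ("<1 mi", "1-3 mi")

total_trips = sum(n for n, _ in buckets.values())
total_miles = sum(n * d for n, d in buckets.values())
short_trip_share = sum(buckets[k][0] for k in short) / total_trips
long_mile_share = sum(
    n * d for k, (n, d) in buckets.items() if k not in short
) / total_miles

print(f"{short_trip_share:.0%} of trips are under 3 miles")
print(f"{long_mile_share:.0%} of miles come from trips of 3+ miles")
```

With these made-up numbers, about 54 percent of trips are under three miles, yet roughly 90 percent of miles come from the longer trips, mirroring the pattern the BTS data shows for Multnomah County.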

Historic travel patterns provide some useful insights about urban transportation, but it’s important to recognize that trip lengths are a product of the built environment and the transportation system.  Sprawling, car-dependent development patterns increase average trip distances.  If we chose different travel modes, the distribution of economic activity would likely be different, and the pattern of trip differences would differ as well.

Bureau of Transportation Statistics, Distribution of Trips by Distance:  National, State and County level.  2022.

The Week Observed, May 13, 2022

What City Observatory did this week

Just Say “No” to freeway widening zealots.  George Santayana, meet David Bragdon:  Those who don’t learn from history are doomed to repeat the failures of the past.  A year ago, we published this commentary by David Bragdon, now Director of the Transit Center and, a decade ago, President of Portland’s Metro regional government.  He warned that the region’s leaders were on the brink of repeating exactly the same mistake of trusting two state highway departments pitching a multi-billion dollar freeway widening project.  The project, re-christened the “Interstate Bridge Replacement,” is marching along exactly the path Bragdon foretold:  it claims to be reducing the number of lanes it will build, but is still planning a structure wide enough for ten or even twelve lanes of traffic.

The bridge width lie. Next time you see a story about the “Interstate Bridge Replacement” project, see if you can spot the project’s misleading talking points.  The whopper they’re asking you to repeat now is the false claim that they’re adding only a single so-called auxiliary lane to the existing I-5 crossing.  The reality is that their plans for a “narrower” bridge call for a 164 foot wide structure, enough to easily carry 10 or even 12 lanes of traffic.

This is the same lie ODOT and WSDOT used to try to sell the failed Columbia River Crossing a decade ago.  And this lie sits atop a pile of others, starting with the name of the project:  while it’s called a “bridge replacement,” it’s really a 5-mile long, $5 billion highway widening project that mostly involves rebuilding every interchange for miles north and south of the bridge, and building enormous elevated highways in downtown Vancouver and across Hayden Island–facts concealed because the two DOTs have refused to release any renderings showing what the project would look like to people standing on the ground anywhere.  No one should take at face value claims about the number of lanes to be built:  Demand to look closely at the actual plans, which reveal the agencies are actually going to build a mammoth structure that can easily be re-striped, once built, to carry as many as twelve lanes of traffic.

Must read

How single staircase buildings could make multi-family housing more affordable, livable and interesting.  This story has that “one weird trick” vibe:  A fairly arcane building code regulation is largely responsible for making US multi-family housing more expensive, more boring and less livable than in the rest of the world.  It’s the “dual staircase” rule:  Most multi-story buildings must give each dwelling unit access to two different staircases.  That’s why multi-family buildings in the US are universally dominated by long, sterile, windowless corridors.

Flickr User Oatsy40

The requirement is a huge space waster, and makes it impossible to have apartments with cross-flow ventilation.  It’s especially onerous for small lots, and precludes buildings with a single, central stairway with all apartments abutting a common stair landing.  In theory, dual access provides greater safety, but there’s virtually no evidence showing life safety is any greater in such buildings.  Allowing single staircases in buildings of up to five stories, as is common in the rest of the world, would give architects vastly more flexibility to design interesting and more affordable apartment buildings.

The real housing speculators, BC edition.  There’s a persistent search for villains in the face of high housing prices, and in Canada, it’s been fashionable to blame foreign buyers for home price inflation.  But an insightful analysis from the Sightline Institute shows, to paraphrase Pogo, “we have met the enemy, and he is us” (or maybe our parents).  Even in places like Vancouver, foreign ownership is a vanishingly small fraction of total home ownership.  Most homes are owned by Canadians, especially older, wealthier ones, who benefit from an array of public policies, notably a broad exemption from capital gains taxes on increasing home values.  Likewise, the policy establishment is dominated by homeowners; Sightline estimates that 93 percent of the members of the provincial assembly are homeowners, and a majority of these own more than one property.  As we’ve pointed out at City Observatory, the real burden of rising home prices represents an intergenerational transfer of wealth from the young to the old, and homeownership subsidies serve mostly to magnify the wealth gap between homeowners, who get generous support, and renters, most of whom get little or nothing. Blaming foreigners is politically convenient, but dodges the issue.

Austin:  A swirl of sprawl, housing inflation, gentrification and “democracy.”  Austin, Texas is so hot right now.  The economy is booming, people and companies are moving in, and house prices are exploding.  The famously liberal oasis in the heart of Red Texas aims to have progressive policies, and has been struggling to revamp its zoning code to allow more housing construction to match burgeoning demand.  Writing at Bloomberg, Megan Kimble explains that while well-intended, these efforts have been hung up by a state law that requires individual notifications to property owners in advance of zoning changes.  In one case, plans for a denser, mixed income development in the city were thwarted by neighboring homeowners:

. . . the opponents represented 31% of the land within 200 feet of the property, triggering a state law that would require Austin City Council to pass any rezoning with a three-fourths majority rather than a simple two-thirds majority. Perhaps reading the writing on the wall — namely, that city council did not have the nine votes to pass the rezoning — the development company withdrew its request.

Handing out sweeping veto authority to handfuls of current residents is a sure-fire recipe for making sure that little new housing gets built, especially in more central and walkable locations.  The practical effect of this policy is to further push up prices, create more displacement, and drive housing development to an ever-expanding suburban fringe, locking in more sprawl and car dependency and undercutting efforts to promote transit and reduce greenhouse gases.

New Knowledge

How a loss of face-to-face interaction hurts innovation.  There’s a lot of debate and speculation about the ability of highly innovative firms to flourish in an environment with extensive remote work.  While some people and firms claim they can be just as productive working at a distance, it seems premature to make that claim, as many of the effects may be extremely long-term in nature.  A new study from Japan looks back at the impacts of the influenza pandemic of the early 1920s, and finds that the loss of opportunities for face-to-face interaction had significant long-term impacts on innovation.

As with the Covid pandemic, social distancing to fight influenza dramatically reduced the amount of face-to-face interaction in society and workplaces. The authors of the study used Japanese patent application records to check the growth of patenting in different industries, and found that in the wake of the early 1920s influenza epidemic, patent activity fell fastest in collaboration-intensive industries (those industries in which multiple researchers jointly file for patents).

As with other patent studies that have looked at “star scientists” and network effects in inventive activity, this study also looked at how the influenza pandemic and the associated decline in face-to-face activity influenced the longer-term productivity of researchers.  They found that early career collaboration is one key to later productivity (as measured by patents) and that the decline in collaboration associated with the pandemic depressed the life-long productivity of researchers.  They summarize:

‘. . . the results show that the number of patent applications declined by 19% during the pandemic in the collaboration intensive fields. We further find that the decrease in patent applications in the collaboration intensive fields during the pandemic was mainly driven by the decrease in new entries into patent applications. These findings suggest that face-to-face communication indeed contributed to innovation by collaborative work. In addition, they also reveal that opportunities of technical guidance, communication, and knowledge exchange with seniors and colleagues in the early career of an inventor were especially important.

Hiroyasu Inoue, Kentaro Nakajima, Tetsuji Okazaki and Yukiko U. Saito, The Role of Face-to-face Contact in Innovation: The Evidence from the Spanish Flu Pandemic in Japan, RIETI Discussion Paper Series 22-E-026, March 2022, https://www.rieti.go.jp/jp/publications/dp/22e026.pdf

In the news

Oregon Public Broadcasting cited City Observatory’s Joe Cortright on the battle over the Interstate Bridge freeway widening project in Portland.
As part of his three-part series on solutions for congestion relief at Planetizen, James Brasuell cited our commentary on Louisville’s experience showing that even modest tolls eliminated traffic congestion on I-65.
Bike Portland directed its readers to the ten unanswered questions we raised about the controversial bridge project.

The Week Observed, May 20, 2022

What City Observatory did this week

Another exploding whale:  The cost of the I-205 bridge project doubles in four years.  Famously, in the 1960s, the Oregon State Highway Department tried to dispose of the carcass of a whale that had washed up on an Oregon beach with several cases of dynamite.  They predicted that the whale would be atomized, but in one of the most-watched YouTube videos in history, the result was a disaster, with onlookers fleeing to avoid being crushed in a rain of blubber.  While it no longer blows up whales, the Oregon Department of Transportation has repeatedly blown up project budgets, with similarly embarrassing and even more devastating results.

In four years, the price of widening this bridge doubled from $248 million to nearly $500 million

Virtually every major ODOT highway project has ended up costing twice as much as its initial estimates.  The latest example is the proposed seismic upgrade and widening of the I-205 Abernethy Bridge over the Willamette River.  In 2018, ODOT told the Legislature the project would cost $248 million.  Bids opened earlier this spring showed the actual cost of the project was going to be nearly $500 million.  No one should trust ODOT to make reliable project cost estimates, any more than they should let them use dynamite for whale disposal.

Must read

Denver drops plans to widen I-25.  Score one victory for freeway fighters in the Mile High City.  According to the Colorado Sun, the Colorado Department of Transportation has dropped plans to widen Interstate 25, the principal north-south route through the city.  Instead, it will focus on improving rail transportation and redeveloping rail yards near the city center.  Colorado has adopted some forward-looking climate policies that require consideration of the effect of new highway capacity on increasing travel and climate pollution.  This is a positive step, but like so many battles, isn’t final:  CDOT could revive this project in some future year.

America’s increasingly deadly streets.  The latest data on traffic deaths show that, far from making progress toward “Vision Zero,” the carnage on US streets and roads continues to increase.  The number of traffic deaths grew 10.5 percent to nearly 43,000, according to data compiled by the National Highway Traffic Safety Administration, a 16-year high.  Fatalities increased in 44 of the 50 states, and the increase was higher for some vulnerable road users, with pedestrian deaths up 13 percent over the past year.

The miracle of “reduced demand.”  Regular City Observatory readers are familiar with the term “induced demand”–the idea that building more roadway capacity induces more automobile travel.  While well documented in the scientific literature, the concept is utterly baffling to billionaire car and tunnel builder Elon Musk, who attempted to refute induced demand last week by expressing his disbelief in its logical corollary:  if we reduce road capacity, that should reduce traffic.  StreetsblogLA has the refreshing news that, in spite of Musk’s skepticism, that is also true.  As freeway removal projects around the world (and even temporary closures) have shown, removing highway capacity has the effect of reducing car travel (and pollution and crashes).  Kea Wilson offers a succinct telling of the three reasons that reduced demand works.  Her article ought to be “Musk-reading” this week.

In the news

StreetsblogLA cited City Observatory’s analysis of “reduced demand” in its article challenging Elon Musk.

The Week Observed, June 10, 2022

What City Observatory did this week

Oregon DOT’s “reign of error”—chronic cost overruns on highway projects.  The Oregon Department of Transportation is moving forward with a multi-billion dollar freeway expansion plan in Portland. That poses a huge risk to state finances because the agency has a demonstrated track record of cost overruns.  We show that virtually all of ODOT’s biggest projects have experienced cost overruns of more than 100 percent.

ODOT is compounding the financial risk by proposing to issue toll-backed bonds to pay for these projects, even though the agency has no current experience in planning, financing or operating tolled facilities.  The agency has also failed to acknowledge its record of huge cost overruns; even a management audit by McKinsey and Company designed to address the problem had a 3x cost overrun of its own, and conspicuously excluded ODOT’s most expensive project (and biggest cost overrun) from its analysis.

Must read

The safest place to live is . . . a big city.  Bloomberg’s Justin Fox tackles head on the popular myth that life in big cities is somehow more dangerous than in suburbs or smaller towns.  Regardless of what you see or hear in the media, big cities are safer than they were one or two decades ago; New York City has a very low murder rate.  And when you include statistics on automobile crash deaths, cities look even safer.  Here’s Fox’s data showing death rates per 100,000 for different places.

The key finding here is that overall death rates from murders and car crashes are no higher in the central counties of large metro areas (the red line) than they are in non-metro, small metro and medium metro areas.  And strikingly, the death rate in New York City (the gray line at the bottom of the chart) is markedly lower than anywhere else, a trend that has persisted even through the Covid pandemic.  The data suggest it’s time to reconsider the common belief that big cities are somehow more dangerous:  they aren’t.

Bringing back single room occupancy buildings.  Through much of our history, boarding houses and rented rooms were a major part of the housing stock, especially for single persons of modest means.  But for decades cities have been restricting, prohibiting or even demolishing single room occupancy (SRO) dwellings, partly out of concern for often unpleasant or unhealthy living conditions, but perhaps just as much as yet another form of exclusionary zoning (eliminating or keeping away poor persons by prohibiting housing they can afford).  Jake Blumgart writes at Governing about Philadelphia’s effort to re-legalize SRO housing in more places throughout the city.  It’s actually a modest proposal, just legalizing SROs in multi-family and commercial zones, not the city’s single-family districts.  But even that has prompted resistance, with councilmembers in some Philadelphia wards seeking to continue to ban SROs in their neighborhoods:

District councilmembers represent specific swathes of the city and tend to be most responsive to neighborhood associations and homeowner groups. In a city like Philadelphia, owning a home and car doesn’t mean you are rich by any means, but you have more than many of your neighbors, you are more likely to vote, and more likely to have your voice heard. That voice is likely to express a desire for other single-family neighbors, not rooming houses. Multiple council sources have told Governing that three district councilmembers planned to introduce amendments to Green’s bill that would carve their neighborhoods out of his legislation.

Decent, if modest, housing is clearly preferable to living on the streets, but as with so much of our housing crisis, our wounds are self-inflicted, and amplified by our catering to NIMBY tendencies.

Stop requiring parking everywhere.  New York Times columnist Farhad Manjoo channels his inner Don Shoup in this column, making the case that parking requirements are inimical to building the kind of affordable, livable and sustainable urban neighborhoods we most desire.  It’s an oft-told story, but Manjoo makes it clear:

. . . by requiring parking spaces at every house, office and shopping mall — while not also requiring new bike lanes or bus routes or train stations near every major development — urban-planning rules give drivers an advantage in cost and convenience over every other way of getting around town.  . . . There are other ways parking wrecks the urban fabric. It creates its own sprawl— the more endless, often empty parking lots between businesses, the less walkable and more car-dependent the city becomes. And requiring parking worsens inequality. Because people whose income is less tend to drive less and use transit more, they’re essentially being forced to pay for infrastructure they don’t need — while wealthier car drivers get a break on the true costs of their car habit.


New Knowledge

The persistence of the racial wealth gap.  Few economic facts are as stark and enduring as the gap in wealth between Black and white US households.  A new study published by the National Bureau of Economic Research provides a detailed historical record of wealth trends that illustrates how wide–and intractable–the wealth gap has become.

The end of slavery, thanks to the Emancipation Proclamation, eliminated the single biggest obstacle to Black wealth, and had immediate and important effects.  Black wealth, which was negligible relative to typical white household wealth, gained substantially in the first few decades after Emancipation:  between 1860 and 1900, the gap between white and Black wealth fell from a factor of 50 to about a factor of ten.  But the pace of those gains then slowed dramatically.  Black wealth has continued to converge on white wealth, but only slowly, and since 1980 there has been little change in the disparity.

The paper’s analysis shows that differential wealth holdings and capital gains play an important role in maintaining the racial wealth gap.  White households, as a group, own more equities and enjoy higher average returns, meaning that the wealth gap closes slowly, if at all.

. . . as the racial wealth gap decreases, convergence slows and differences in returns on wealth and savings begin to matter more for the shape of convergence. Given existing differences in the wealth-accumulating conditions for white and Black individuals, our analysis suggests that full wealth convergence is still an extremely distant or even unattainable scenario. Furthermore, rising asset prices have become an important driver of racial wealth inequality in recent decades. The average white household holds a significant share of their wealth in equity and has therefore benefited from booming stock prices while the average Black household, for whom housing continues to be the most important asset, has been largely left out of these gains.
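The dynamics the authors describe can be illustrated with a stylized simulation (ours, not the paper’s model, with entirely hypothetical parameters):  two groups accumulate wealth by saving the same amount each year, but one starts far ahead and earns a higher average return because of greater equity exposure.

```python
def simulate(w0, r, s, years):
    """Accumulate wealth: each year earn return r on the stock, then add savings s."""
    w = w0
    for _ in range(years):
        w = (1 + r) * w + s
    return w

# Hypothetical parameters: one group starts 10x wealthier and earns a higher
# average return (5% vs. 3%); both save the same 1 unit per year.
gap_40 = simulate(10.0, 0.05, 1.0, 40) / simulate(1.0, 0.03, 1.0, 40)
gap_80 = simulate(10.0, 0.05, 1.0, 80) / simulate(1.0, 0.03, 1.0, 80)
```

In this toy setup the wealth ratio initially narrows from 10x (equal savings dominate while both stocks are small), but over longer horizons the return differential dominates and the gap begins widening again, echoing the paper’s conclusion that full convergence may be unattainable under current conditions.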

Given the overall increase in inequality in US wealth in the past few decades, the authors predict that there is likely to be little improvement in the racial wealth gap.  Absent some significant changes in public policy (regarding either the tax treatment of capital gains or reparations), the Black-white wealth gap is likely to widen in the future.

Ellora Derenoncourt, Chi Hyun Kim, Moritz Kuhn & Moritz Schularick, Wealth of Two Nations: The U.S. Racial Wealth Gap, 1860-2020, NBER Working Paper 30101, June 2022.

The Week Observed, June 17, 2022

What City Observatory did this week

There’s nothing green about free parking, no matter how many solar panels you put on the garage.  The US Department of Energy’s National Renewable Energy Laboratory brags about its sustainable parking garage, festooned with solar panels.  But the garage, designed to hold about 1,800 cars, is essentially fossil fuel infrastructure, a problem exacerbated by the lab’s suburban location and its policy of not charging anyone for parking.  We first wrote about the parking garage in 2017, and now revisit it to see how it has performed.

Unsurprisingly, an even smaller share of the lab’s commuters now travel to the lab by anything other than single-occupancy vehicle than before the garage was built.

Must read

The real villains in gentrification.  There’s a lot of finger-pointing and virtue-signalling in the arguments over gentrification, but as The Atlantic’s Jerusalem Demsas explains, most people are looking in the wrong direction when assigning blame for gentrification (and rising home prices).  The real cause of gentrification is pervasive restrictions on increasing density in higher income neighborhoods, chiefly through restrictive single-family zoning.  These limits effectively displace demand to other places (chiefly lower income neighborhoods), in addition to excluding lower income households from higher resource areas.  And overall, the constriction of supply tends to drive up home prices, which works to the disadvantage of low income households.

Local governments have, in particular, chosen to respect the class interests of wealthy homeowners by giving them the power to reject the construction of new and more affordable types of housing, in effect allowing them to economically segregate their neighborhoods. . . In genuflecting to the class interests of wealthy homeowners, local officials have, then, set the stage for gentrification. Yes, in a narrow sense, gentrification happens when young, college-educated, and predominantly white people move to racially and economically diverse neighborhoods. But notice how insidious this framing is and who it leaves out: the homeowners and city officials who made equitable growth impossible. This framing foments conflict among young newcomers and lower-income communities of color and turns a structural problem into an individual one.

This is a deeply insightful and precise analysis of the root causes of gentrification and the political misdirection of blame away from exclusionary zoning that serves the financial interests of wealthier and whiter homeowners:  It is a must, must read for anyone who cares about cities, housing affordability or equity.

Washington State simply ignores climate issues as it plans more freeways.  The Urbanist’s Ryan Packer takes a close look at Washington State’s climate policies, as implemented by its transportation department.  Governor Jay Inslee portrayed himself as a “green” candidate in the 2020 Democratic presidential primaries, but the policy of his administration makes it clear that the state is turning a blind eye to the role of highway expansions in increasing greenhouse gases.  The state transportation agency, WSDOT, looked at three scenarios for transportation spending, and concluded that whether it spent a lot or a little on more highways had essentially no impact on climate emissions.  Packer writes:

But when it comes to analyzing the environmental impact of these scenarios, things come off the rails. WSDOT told the commission that its initial analysis concluded the three scenarios had essentially the same impact on the state’s greenhouse gas emissions. In other words, according to WSDOT, spending $10 billion on expanded highways (in the “maintain and innovate” scenario) would have exactly the same impact as spending more than double that (in the “maintain and expand” one).

All this flies in the face of the demonstrated science of induced demand (more roadways stimulate more driving and increase emissions), something that WSDOT claims it understands in other technical work.  What this shows is that when it comes to actually spending money, highway departments are still in deep denial about the scientific realities of both climate change and induced travel.

The problem with district representation in city government.  Many cities elect their governing bodies by district, and by rule or tradition, systems of “aldermanic privilege” or “courtesy” mean that nothing happens in a district (development approvals, zone changes and so on) without the explicit approval of the person elected from that area.  It’s a system that inherently tends to favor “NIMBY” politics (Not-in-my-backyard), as a person elected by a district will be more sensitive to local concerns.  And it also creates deal-making opportunities for city councilors.  Trading development approvals for political or financial favors often becomes the norm.  In St. Louis, three Aldermen were indicted by Federal prosecutors for accepting bribes in connection with land sales, tax abatements and rezonings.  According to the St. Louis Post-Dispatch, the indictments have:

. . . shined a light on a practice often referred to as “aldermanic courtesy” and the direct involvement of city alderman in many of the bureaucratic functions of city government.  Federal prosecutors zeroed in on lucrative tax abatements and the sale of property owned by the Land Reutilization Authority, the city’s land bank. But aldermanic influence can also affect which streets get repaired, where speed bumps are installed and whether the planning department even takes up a rezoning request.

This problem is endemic to ward-based election systems:  elected officials have a narrow base of support, and are naturally more interested in serving their districts than the city at large.  This coupled with a tendency toward “log-rolling”—deferring to the local interests of other members, as they defer to your local interests—undermines the political support for consistent, rules-based administration.

New Knowledge

The persistent racial gap in Covid death rates.  A recent New York Times story claimed that Covid-19 death rates, which have previously been higher for non-white, non-Asian populations, have either converged with white death rates, or actually decreased below them.

The trouble is, that isn’t true.  The New York Times statistic failed to adjust for the different age compositions of each racial-ethnic group.  The aggregate death rate of non-white Americans has declined more than that for whites, but this is chiefly an artifact of the much younger average age of this group.  Tyler Black posted age-specific death rates by race and ethnicity on Twitter.  They show a persistent gap in death rates.

Black computes the “odds-ratio”–the likelihood that a non-White/Asian person will die of Covid relative to the likelihood that a White/Asian person will die of Covid.  The horizontal axis shows the odds-ratio for each age group.  The different colored lines correspond to the three years of the pandemic (2020, 2021 and 2022).  Overall, the disparity is lower in 2022 than it was in 2020, but it’s still the case that age-specific death rates are higher for non-White/Asian persons in each age group.
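The odds-ratio calculation itself is straightforward:  convert each group’s death rate into odds, then take the ratio.  The sketch below uses purely hypothetical rates (not Black’s actual figures) to show the arithmetic:

```python
def odds_ratio(p_group, p_reference):
    """Odds of an event in one group relative to the odds in a reference group."""
    return (p_group / (1 - p_group)) / (p_reference / (1 - p_reference))

# Hypothetical age-specific Covid death rates (deaths per person) for one age
# group -- illustrative only, not the actual data behind the chart.
rate_nonwhite = 0.0030   # non-White/Asian persons
rate_white = 0.0015      # White/Asian persons

or_value = odds_ratio(rate_nonwhite, rate_white)
```

Because Covid death rates are small, the odds ratio is nearly identical to the simple ratio of death rates (here, about 2: the first group is roughly twice as likely to die); the two measures diverge only when the event probabilities are large.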

Many factors influence Covid-19 infection rates and mortality; age is clearly a principal risk factor, and failing to control for it in one’s analysis is a serious flaw.  Too often with Covid-19 statistics there’s been an undue eagerness to jump to unwarranted conclusions based on fragmentary or poorly analyzed data (at City Observatory, we spent a considerable amount of time debunking early claims that density was a key risk factor for Covid spread).

In the News

Willamette Week quoted City Observatory Director Joe Cortright on the proposed $5 billion Interstate Bridge freeway widening project:  “The project’s traffic forecasts are inaccurate, the cost estimates are based on decade-old engineering work, and the selected high bridge option is the riskiest, most expensive and least affordable approach to solving this problem.”

The Week Observed, June 24, 2022

What City Observatory did this week

The economics of fruit, time, and place.  It’s berry time in Portland, and that got us thinking about how special local products are in defining quality of life.  Recently, Paul Krugman, fresh off a European vacation, waxed poetic about the fleeting joy of summer fruit, and true to form, made an economic argument about how this illustrates the value of such perishable time-limited experiences. Here in Portland, we’re in the midst of the brief season of Hood strawberries, a fragile, juicy fruit that puts the crunchy industrial strawberry to shame, and which is available only for a few weeks just prior to the Solstice.

You can only enjoy the Hoods for a few weeks, and because they don’t travel, only in a few places.  More broadly, though, as Jane Jacobs pointed out, it’s that kind of distinctive, highly local attribute that places can’t have competed away.  In an era of globalization and technology that makes so much of our lives ubiquitous and indistinguishable from place to place, it’s the little, local, time-limited things that will matter more and more.

Must read

The Gospel of Induced Demand.  We’re frequently at a loss to explain why transportation departments remain in deep denial about the fundamental science behind the idea of induced demand:  Building more roads, and making it easier to drive, especially in dense urban settings, leads people to drive more.  Noting that it’s a “fundamental law” of road congestion has made little progress with engineers.  Perhaps now it’s time to elevate induced demand to a more revered status.

In a new article in Transfers Magazine, Nicholas Klein, Kelcie Ralph, Calvin Thigpen, and Anne Brown do this, in a way, in the “gospel of induced demand.”

We argue that part — though certainly not all — of the problem is that the public largely misunderstands induced demand. Given this misunderstanding, transportation planners, engineers, and other practitioners must become evangelical about induced demand. They should spread the gospel that widening roadways to mitigate congestion is almost certain to fail, and that building new transit to fight congestion will likewise be disappointing. We hope a greater public understanding of induced demand will help reorient transportation investments away from the idea that construction solves congestion.

The science is now quite clear.  The big challenge is to educate the public, in the authors’ words, to “evangelize” people into understanding that the simple-minded notion that we can somehow build our way out of congestion is just wrong.  We heartily agree, but we also know that this teaching will be strongly resisted by those who make a living building roads.

What can we learn from Europe about making roads safer.  The Urban Institute’s Yonah Freemark has an interesting trans-Atlantic perspective on road safety.  While road deaths are spiking in the US, they’ve been going down in most European countries.  Freemark takes a close look at France, which until a little over a decade ago had a higher road death toll (per mile/kilometer traveled) than the US.  That disparity has since reversed, and now France is notably safer.

There’s no single explanation for the difference, but Freemark points to several policies that contribute to safer streets and roads:  tougher speed limits, automated enforcement, and smaller vehicles.  Plus, there’s much more emphasis on pedestrianized areas where people on foot and bike are prioritized over cars.

The Left NIMBY meltdown.  Economist Noah Smith takes a look at the state of politics in the housing debate in his Substack newsletter, in a piece calling out the increasingly strident and decreasingly logical arguments left-wing NIMBYs make against policies to make it easier to build more housing in cities.  It’s getting harder, Smith argues, for Left-NIMBYs (progressives who oppose liberalizing zoning) to make a coherent case.  He argues:

In the past couple of weeks, the Left-NIMBYs had a bit of a meltdown. And it’s illustrative of what a dead-end Left-NIMBYism is, and how more people are moving to the side of the YIMBYs. Which in turn tells us something about the emerging political economy of the United States.

Academic research has discredited oft-repeated, but simply wrong, claims that building more housing somehow leads to higher rents and displacement (the opposite is true).  And strategies to promote “public housing in my backyard (PHIMBY)” have inevitably foundered because these progressives really don’t want any new housing anywhere. Evidence from San Francisco, and other cities, shows that the politics are shifting in favor of more housing.

New Knowledge

The illusory carbon benefits of corn-based ethanol.  One of the most prominent technical fixes for automobile carbon pollution is the notion that bio-fuels will reduce carbon emissions from cars and trucks.  This idea is behind the “renewable fuel standard” adopted by the federal government, requiring refiners to mix ethanol with fossil-derived gasoline and diesel fuel.

In theory, plant-based ethanol doesn’t add net carbon to the atmosphere, because plants fix carbon from the air as they grow, and burning that carbon simply returns the carbon from photosynthesis.  Most ethanol in the US is derived from corn, and farming corn requires extensive fossil fuel inputs (both for fuel for tractors and other agricultural machinery, and for fertilizers).  In addition, farming ties up valuable agricultural land, and the increased demand for corn tends to drive up the prices of corn and related commodities.

A new study from the University of Wisconsin takes a close look at the overall carbon balance of the renewable fuel standard in the US.  It finds that there’s little evidence that substituting ethanol for fossil fuels actually reduces carbon emissions.  The authors’ best estimate is that the net effect of the bio-fuels mandate is to actually increase carbon emissions about 24 percent over what they would be otherwise.

The RFS [Renewable Fuel Standard] increased corn prices by 30% and the prices of other crops by 20%, which, in turn, expanded US corn cultivation by 2.8 Mha (8.7%) and total cropland by 2.1 Mha (2.4%) in the years following policy enactment (2008 to 2016). These changes increased annual nationwide fertilizer use by 3 to 8%, increased water quality degradants by 3 to 5%, and caused enough domestic land use change emissions such that the carbon intensity of corn ethanol produced under the RFS is no less than gasoline and likely at least 24% higher.

The key reason for the poor net carbon performance of the renewable fuel standard is the land use changes triggered by the policy.  Rising prices increase land under cultivation, and the net effect is to increase total carbon emissions.  The authors re-estimate the results of three earlier studies, undertaken by the EPA, the California Air Resources Board, and Argonne National Labs.  They find that once land use changes (LUC) are allowed for, the net effect of the RFS is to increase carbon emissions.

Many states count the substitution of ethanol for gasoline as a reduction in their calculation of emissions reductions for purposes of establishing progress toward greenhouse gas reduction goals.  This research suggests that those savings are largely, if not entirely, illusory.

Tyler J. Lark, Nathan P. Hendricks, Aaron Smith, Nicholas Pates, Seth A. Spawn-Lee, Matthew Bougie, Eric G. Booth, Christopher J. Kucharik, and Holly K. Gibbs, “Environmental outcomes of the US Renewable Fuel Standard,” Proceedings of the National Academy of Sciences, February 14, 2022,
https://doi.org/10.1073/pnas.2101084119

In the News

City Observatory director Joe Cortright is quoted in Willamette Week‘s story, “Critics warn new Interstate 5 Bridge would loom over Vancouver waterfront”

The Week Observed, July 1, 2022

Must read

The most gas guzzling states. The sting of higher gas prices depends on where you live, not so much because of the variation in prices, but because in some states, you just have to drive a lot more.  The website Quotewizard took a look at federal data from the energy and transportation departments, and calculated the average change in fuel consumption per capita since late last year; it also ranked states on fuel consumption per driver and per person.

In this study, Texans use about 60 percent more gasoline per capita (240 gallons) than people who live in Massachusetts (150 gallons). In general, higher gas prices have reduced consumption since last October by between 5 and 10 percent.  In the short run, people have few options for reducing gasoline consumption, but over time, responses tend to be greater.
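The headline comparison is simple arithmetic on the two per-capita figures quoted above; a quick sketch, using only the numbers in this paragraph:

```python
# Per-capita gasoline consumption figures quoted from the Quotewizard summary.
texas_gallons = 240          # annual gallons of gasoline per capita, Texas
massachusetts_gallons = 150  # annual gallons per capita, Massachusetts

# Percentage by which Texas consumption exceeds Massachusetts consumption.
pct_more = (texas_gallons / massachusetts_gallons - 1) * 100
print(f"Texans use about {pct_more:.0f} percent more gasoline per capita")  # 60
```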

Barack Obama:  YIMBY.  In a recent speech to the American Institute of Architects, former President Barack Obama sounds off with strong urbanist and YIMBY arguments, questioning “progressive” housing policies that have hurt affordability and integration.

“Frankly, some very well-intentioned laws and regulations at the local level, often generated from the left and from my own party, sometimes are inhibiting the creation of affordable housing and powering NIMBY attitudes and make it very difficult to integrate communities and allow people to live close to where they work. The most liberal communities in the country aren’t that liberal when it comes to affordable housing”


He invokes the classic Jane Jacobs/Robert Moses dichotomy on urbanism, with a clear nod to Jane Jacobs’s view of the organic nature of urban development.  Obama even decries the ravages of sprawl:

“Sprawl in America is not good for our climate,” he said. “And so we have to think about creating livable density that allows us to take mass transit and take bicycles.”

New Knowledge

Rent control reduces the amount of rental housing. Cities around the world are always trying new variations on rent control, hoping that they can overcome some of the fundamental economic problems with capping rental prices.  In Spain, Catalonia implemented a so-called “second generation” rent control scheme that set reference prices for apartments.

A new study looks at the impact of this rent control on apartment rents and the number of apartments that are rented out.  It has some interesting findings.  The authors report some reductions in rents, but also find that many apartment rental contracts are set for higher amounts than allowed by the rent control law.  It is invariably difficult to enforce rent controls, as landlords are always looking to get more money, and in a tight market, some prospective tenants will be willing to bid more than the regulated price.  (That’s why you frequently find payments of “key money” in rentals of regulated apartments.)

One of the unfortunate assumptions of rent control is that landlords have no choice but to continue to rent their apartments:  That’s not the case.  They can choose to occupy them themselves or sell them to others who will occupy them.  And that’s exactly what this paper finds happened in Catalonia:  The restriction on rental prices led to a reduction in units for rent.  Overall, the authors found that there was a very elastic supply response to rent restrictions:  an estimated elasticity of about four means a one percent reduction in rents led to about a four percent reduction in apartment supply.  The reductions appeared to be in larger units, where the rental regulation constraint was greatest.
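To make the elasticity figure concrete, here is a minimal sketch of the point-elasticity arithmetic; the linear approximation and the function name are our illustration, not the paper’s estimation method:

```python
ELASTICITY = 4.0  # supply elasticity of about four, as estimated in the paper

def supply_response(rent_change_pct: float) -> float:
    """Approximate percent change in rental supply for a given percent
    change in rents, using the point-elasticity relation %dQ = e * %dP."""
    return ELASTICITY * rent_change_pct

# A 1 percent reduction in rents implies roughly a 4 percent reduction
# in the number of apartments offered for rent.
print(supply_response(-1.0))  # -4.0
```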

Catalonia represents one of the interesting experiments in rent control.  Also, as in Germany, this rent control regime has been ruled unconstitutional after a few years of operation, meaning that scholars will now have the opportunity to observe what happens when rent control is removed.  This should provide additional insights into how housing markets work.

Joan Monràs and José García-Montalvo, The effect of second generation rent controls: New evidence from Catalonia, Universitat Pompeu Fabra, Economics Working Paper Series, Working Paper No. 1836, updated version April 2022.

In the News

City Observatory director Joe Cortright is quoted in the Portland Business Journal’s reporting on the proposed Interstate Bridge Replacement project.

The Week Observed, July 15, 2022

What City Observatory did this week

A Bridge too low.  The Oregon DOT is fundamentally misrepresenting the process and legal standards for setting the height of a proposed new multi-billion dollar I-5 bridge across the Columbia River between Portland and Vancouver.  Ignoring the Coast Guard’s determination that a new bridge must provide 178 feet of navigation clearance likely dooms the I-5 Bridge Project to yet another failure.

The Oregon and Washington DOTs have again designed an I-5 bridge that’s too low for navigation.  In their rush to recycle the failed plans for the Columbia River Crossing, the two state transportation departments have failed to address Coast Guard navigation concerns. State DOT PR efforts are misrepresenting the approval process:  The Coast Guard alone decides the allowable height for bridges, and considers only the needs of navigation.

Those who refuse to learn from the past are condemned to repeat its mistakes.  That is certainly the case for Portland’s Metro Council, which voted on July 14 to wave on the Oregon and Washington highway department plans for a multi-billion dollar freeway widening project packaged as a “bridge replacement.”  Former Metro Council President David Bragdon weighs in with a warning for the Council, arguing that they have been systematically lied to by highway advocates, and that they will rue their decision.

Must read

Are state highway forecasters committing scientific fraud?  The statistical foundation of every highway expansion project in the country is a forecast of future traffic.  Planners build models incorporating demographic, economic and spatial data and attempt to predict how roads will perform, and what pollution will result if projects are (or aren’t) built.  Typically, models are built by and for state highway departments, and are a key part of their efforts to justify and sell bigger roadways.  And unsurprisingly, these DOT models routinely produce just the results their sponsors want:  not widening a road will lead to cataclysmic gridlock; widening as planned will lower traffic congestion and speed travel times.  It’s well known that these models have serious structural limitations and flaws–chief among them the inability to model induced demand–and the models themselves are impenetrable black boxes that are almost never available for external scrutiny and verification.  Ben Ross of Maryland alleges that dodgy models and opaque processes have produced scientific fraud in Maryland.  He and colleagues point to inexplicable anomalies in model results, which, as usual, reach just the conclusion the state DOT wants.  They’re asking the Federal Highway Administration to independently verify the modeling.

Dangerous by design, 2022.  Driving may be down, but the number of pedestrians killed on America’s streets and roads is going up. That’s no accident, according to the latest version of Smart Growth America’s report Dangerous by Design.  The nation’s road systems are built in a way that encourages speeding and other risky driving behaviors that put vulnerable road users at greater risk. As roads became less crowded during the pandemic, drivers drove faster and more recklessly, with predictable results.  The report’s data show that crash rates and fatalities are systematically higher in a series of Sunbelt metros with heavily auto-dominated road systems.

Tragically, this problem is getting worse.  Even though SGA’s reports have shown this geographic pattern for years, the traffic fatality rate has continued to increase in every one of the top 20 metros shown on the map above.  Once again, the report signals that traffic safety can’t be achieved by largely symbolic “Vision Zero” pledges or vapid public relations campaigns.  We have to change the way our streets and roads are built and used.  Until then, we can expect more of the same.

New Knowledge

The importance of two-way streets.  One of the key adaptations of the urban road system to automobiles has been the creation of one-way streets. Prior to the automobile, pedestrians, vehicles and horses all mixed and traveled in both directions, and all streets were “two-way.” But to facilitate faster car travel, engineers created one-way streets.  One-way streets may be faster, but they come at a cost to the urban fabric.  For example, the “back sides” of one-way streets (corners facing away from the direction of vehicle traffic) essentially disappear from view and lose value.

A new paper from Geoff Boeing and William Riggs points out another cost of one-way street systems:  longer trips.  By definition, one-way streets require at least some out-of-direction travel between most pairs of destinations.  Boeing and Riggs have applied some big data to this question, and they estimate that in the typical city, a system of one-way streets results in 1.7 percent more vehicle miles of travel due to more circuitous routes.

Applied to San Francisco, their estimates show that converting one-way streets to two-way operation would likely reduce vehicle miles traveled and associated pollution:

If San Francisco converted its one-way streets to two-ways, the new network efficiencies could improve travel distance minimization by 1.7%. This allows us to sketch out a simple back-of-the-envelope ceteris paribus estimate of surplus fuel consumption and greenhouse gas emissions, assuming an average US fuel economy of 10.5 kilometers per liter and 2.4 kilograms of CO2 released per liter of gasoline combusted. All else equal, a citywide two-way conversion policy in San Francisco could reduce annual VKT by 27 million kilometers, fuel consumption by 2.6 million liters, and carbon dioxide emissions by 6,200 metric tonnes (6.2 million kilograms)—just for intra-city trips that begin and end within San Francisco proper.
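The study’s back-of-the-envelope figures can be reproduced in a few lines; all inputs come from the quoted passage, and the small gap from the quoted 6,200-tonne total is rounding:

```python
# Inputs quoted in the passage above.
vkt_saved_km = 27_000_000  # annual vehicle-km avoided by two-way conversion
km_per_liter = 10.5        # assumed average US fuel economy
co2_kg_per_liter = 2.4     # kg of CO2 released per liter of gasoline combusted

fuel_saved_liters = vkt_saved_km / km_per_liter
co2_saved_tonnes = fuel_saved_liters * co2_kg_per_liter / 1000  # kg -> tonnes

print(f"Fuel saved: {fuel_saved_liters / 1e6:.1f} million liters")  # 2.6
print(f"CO2 avoided: {co2_saved_tonnes:,.0f} tonnes")  # 6,171, i.e. about 6,200
```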

There are already many urban design and safety reasons to favor two-way streets in urban settings.  They tend to calm traffic, and create environments more conducive to walking and biking.  The Boeing & Riggs analysis also suggests a noticeable opportunity to shorten trips and reduce pollution.

Boeing, G., & Riggs, W. (2022, April 23). Converting One-Way Streets to Two-Way Streets to Improve Transportation Network Efficiency and Reduce Vehicle Distance Traveled. https://doi.org/10.31235/osf.io/fyhbc

In the News

Bike Portland quoted City Observatory Director Joe Cortright’s testimony to the Portland City Council on the proposed I-5 bridge replacement/freeway widening project.

The Week Observed, July 22, 2022

What City Observatory did this week

Failing to learn from the failure of the Columbia River Crossing.  Last week, Portland’s Metro Council voted 6-1 to wave on the Oregon Department of Transportation’s plan for a multi-billion dollar freeway widening project branded as a bridge replacement.  In doing so, the Council is ignoring the lessons of the project’s decade-old doppelganger, the failed Columbia River Crossing.

In this commentary, former Metro President David Bragdon explains how the Oregon and Washington highway departments systematically lied to and misled regional officials to advance the project, something they’re repeating chapter and verse again to sell the “Interstate Bridge Replacement Project.”  Those who don’t learn from the mistakes of history are condemned to repeat them.

Must read

Killing the Smart City.  Was there any idea more overwrought than the Google-backed Sidewalk Labs plan to turn Quayside—part of Toronto’s waterfront—into a tech-laden “Smart City?”  After years of hype, and millions spent on plans of questionable merit premised on the turgid and nonsensical promise of “building a city from the Internet up,” the project collapsed of its own weight in 2020.  It predictably blamed the pandemic, but that’s not the real story. Writing at the Technology Review, Karrie Jacobs diagnoses our fascination with—and the flaws of—these sweeping top-down visions of how urban areas are transformed.  Sidewalk Labs’ smart city plans rested on the conceit that new and better technology, rather than building a better place for people, was the key to improving cities.

The real problem is that with their emphasis on the optimization of everything, smart cities seem designed to eradicate the very thing that makes cities wonderful. New York and Rome and Cairo (and Toronto) are not great cities because they’re efficient: people are attracted to the messiness, to the compelling and serendipitous interactions within a wildly diverse mix of people living in close proximity. But proponents of the smart city embraced instead the idea of the city as something to be quantified and controlled.

We’ve made this mistake before:  The story of twentieth century urbanism was conforming cities to the latest technology—the private automobile.  That tech-led approach to development has ravaged cities and damaged the planet; Toronto, and the rest of us, were lucky that the Sidewalk Labs effort failed.

External costs.  Climate change is, without doubt, the most global crisis humanity has ever confronted.  Our standard way of thinking about the costs and benefits, which focuses on individuals, neighborhoods, communities, and occasionally nations, misses the fact that the environmental consequences of our decisions are profoundly global, and not local.  Climate change is real to us when we see the effects locally, as when last summer, temperatures in Portland hit 117 degrees.  What that insular perspective misses is that the cumulative effects of climate change are larger and more widespread than any of us individually experience. A new study puts some numbers to this idea.  It estimates that the global cost of US carbon emissions is $1.9 trillion over the past quarter century.

Rising renter incomes and higher rents.  There’s no question that our constrained housing supply (especially in cities) plays a key role in causing housing affordability problems.  But housing prices and rents are driven by both supply and demand.  One of the key elements of demand is household income, and in the past couple of years, rising incomes have helped push up rents.  While housing discourse focuses on low and moderate income renters who are cost-burdened, it’s also the case that incomes have risen substantially for households renting market-rate housing.  And if your income goes up, you are able, and might be willing, to bid more for housing.  A new study from RealPage suggests that’s exactly what is happening:

Between 2016 and 2019, market-rate apartment renter incomes climbed around 4.5% annually up to a high of $66,250 in 2019. Incomes among lease signers then dropped 1.9% in 2020 due to the pandemic before surging up 7.7% in 2021 and another 7.1% so far in 2022 up to a new high of $75,000.

RealPage’s study aligns with previously published research from Harvard’s Joint Center for Housing Studies, which showed the majority of net new renter households over the last decade had incomes above $75,000.

What makes the pandemic-era results all the more remarkable is the sheer volume of renters coming in with higher incomes. Roughly 1 million net new households entered the professionally managed, market-rate apartment sector since the start of 2020 – the biggest wave of demand in RealPage’s three decades of tracking apartments. Those numbers are even more impressive given that they occurred at the same time demand flowed heavily into for-sale homes and single-family rentals, as well.

These data don’t mean that housing affordability isn’t a problem; in fact, just the opposite:  the fact that well-off households have more income, and that more such households are bidding for apartments, means that those with lower incomes face even greater competition (and therefore, higher rents) in the marketplace.  Ultimately, the policy solution has a lot to do with supply, and importantly, more supply at every tier of the marketplace, because if the demand of higher income households isn’t met by new construction, they’re likely to further bid up prices of existing housing.

New Knowledge

Estimating the size of our housing production deficit.  Up for Growth has an ambitious new report attempting to comprehensively measure the nation’s shortage of housing, suggest policy solutions for increasing housing supply, and estimate the economic, social and environmental benefits of doing so.

The report estimates that nationally, we’ve “underproduced” about 3.8 million housing units.  And the report offers a reasonable set of estimates of how big a housing gap there is in each state and large metropolitan area.  That’s a valuable insight, but as the report makes clear, this isn’t just simply about producing more housing units in the aggregate:  we need more housing in the high opportunity locations that would enable people to advance economically and live more sustainably.

The report sketches out a “new foundation” for growth that targets a graduated series of infill development strategies for different communities and neighborhoods.  It recommends that new housing development be focused in neighborhoods with high levels of economic opportunity, that are job-rich and housing poor, and that have adequate infrastructure (indicated in part by high levels of walkability).  The report identifies census tracts that meet these criteria.  It recommends density increases based on the current development pattern of an area:  “missing middle” housing in low density single family areas; low- or mid-rise apartments in somewhat denser areas, and high rises in the densest neighborhoods.

The Housing Underproduction Report also makes key connections to equity, social justice and sustainability and housing supply:  many of our worst social problems (racial wealth disparities and limited intergenerational economic mobility) stem, at least in part, from constraints on housing supply.  Overcoming housing underproduction is a key strategy for rectifying these disparities.

More of the same housing policy drives poor climate outcomes. We intentionally designed A Better Foundation to realize tangible climate benefits while increasing housing availability and affordability. Key to this framework is locating new housing in areas with high concentrations of jobs and community assets, and in walkable neighborhoods with generous pedestrian or transit infrastructure. This method increases land efficiency, lowers vehicle miles traveled, and decreases the social cost of carbon.

And the “better foundation” pattern of economic growth pays economic and fiscal dividends as well.  More intensive infill development reduces the need for expensive infrastructure investment, and lowers housing and transportation costs (more supply can be expected to reduce housing costs; more concentrated development will reduce vehicle miles traveled).

Mike Kingsella and Leah MacArthur (Editors), Housing Underproduction in the US, Up for Growth, July 2022, https://www.upforgrowth.org/sites/default/files/2022-07/UFG_Underproduction_Report_Spreads_1.pdf

The Week Observed, July 29, 2022

What City Observatory did this week

Fix it Last.  The Oregon Department of Transportation claims that it has a “Fix-it first” policy–prioritizing spending funds to preserve existing roads and bridges.  But its actual budget priorities make it clear that it routinely shortchanges maintenance and repair in favor of costly and ineffective road expansion projects.  While ODOT routinely blames the Legislature for this problem, it is actually a combination of ODOT systematically understating the costs of major capital projects, and then administratively reallocating funds that could be used for repairs to fund expanded highways.

ODOT’s own federally required “Transportation Asset Management Plan” shows that the state is spending $320 million less per year than is needed to maintain the state’s roads and bridges in their current state of repair.  And ODOT has cut funding for maintenance in its current budget, while at the same time proposing to spend billions on Portland area freeway widening projects that will plunge the state deeply into debt.  ODOT continues to violate its own stated policies, and then tries to use the maintenance deficit it created (and ignored) to build a case that it be given even more money.

Must read

The Climate bill:  All EVs and no urbanism.  The big news out of Washington this week is that Democrats have reached a compromise with West Virginia Senator Joe Manchin about the outline of a pro-climate spending bill.  It heavily focuses on subsidies to manufacturing electric vehicles.  The Urban Institute’s Yonah Freemark has a quick take on the policy direction, fretting that the draft legislation would do nothing to:

—Encourage mode shift out of cars (it provides $0 for e-bikes & nothing for transit electrification);
—Encourage more environmentally sensitive land uses; and
—Reduce the resource intensity of the transport sector.
. . . one of the primary impacts of this bill is simply reaffirming the automobile dependency of the US, giving people huge incentives to invest in cars—and doing almost nothing for an alternative system.

Wealth determines homeownership in California.  Our conventional way of describing housing affordability is to compare home prices to incomes, and by any measure, housing has become less affordable, especially in California. But Metrosight economist Issi Romem points out that it’s now wealth, not income, that determines home ownership.  California has the lowest homeownership rate of any state, and somewhat paradoxically has a higher fraction of homes that are owned free-and-clear (i.e., with no outstanding mortgage).  That’s because the California housing market (and state law, in the form of Proposition 13) have enshrined accumulated wealth.  Few young adults can afford homes without financial assistance from previous generations (something that skews homeownership to whiter, wealthier, long-time residents).  As Romem explains:

The growth in free-and-clear ownership can be seen across all regions of the state. The trend extends to even the youngest households, with residents 24 and younger. In addition to reflecting the increasing concentration of the nation’s wealthy in California, this growth might be the result of more young adults living with their parents – or more young adults living in homes their parents bought them.

Such inherited or gifted houses lock in wealth for future generations, and in some cases keep a property itself in the family for generations. That’s particularly common in California owing to property tax laws – Prop. 13 and its extensions – and the practice keeps a sizable share of homes off the market interminably.

As Romem notes, neither rising incomes nor reduced down-payment requirements are effective in counteracting this dynamic. In California, supply restrictions, coupled with generous subsidies to long-time owner-occupants, have combined to make housing wealth a route to riches, but only for those with the wealth or family connections.

Oregon largely eliminates parking requirements.  Coming on the heels of statewide legislation legalizing four-plex housing in most of the state’s single family residential areas, Oregon has adopted new rules eliminating or greatly reducing parking requirements in cities.  Sightline’s Michael Andersen describes new rules adopted by the Land Conservation and Development Commission that give cities three options for phasing out or reducing parking requirements.  As Andersen explains:

In some situations—within a half-mile of relatively frequent transit, for homes of 750 square feet or less, and for homes meeting affordability targets—minimum parking mandates will no longer apply for jurisdictions within Oregon’s eight largest metro areas.  This doesn’t prevent parking lots from being built, but it does remove the current prevailing requirements to construct a specific number of stalls: one stall per bedroom, for example, or three per 1,000 square feet of retail space.

As Don Shoup observed years ago in his magisterial book, The High Cost of Free Parking, parking requirements essentially trump zoning for high density housing:  Even when zoning codes technically allow apartments, the requirement to provide “free” off-street parking for each dwelling makes higher density construction uneconomical.  When these units aren’t built, housing supply is constrained, and rents are pushed up higher.  Oregon’s move is an essential part of a comprehensive strategy to improve housing affordability.  Other states should take notice.

New Knowledge

The Covid Pandemic and the Donut Effect. The shift to work from home during the Covid-19 pandemic has triggered some noticeable changes in urban housing and labor markets.  This study charts how increased work from home has affected population, rents and home prices between cities and suburbs in the first year following the pandemic.

The authors find that in the nation’s largest cities, increased work from home has triggered a relative decline in demand for central locations relative to suburban ones.

Rental rates in the central business districts (CBDs) of the largest 12 US metros have fallen almost 20 percentage points relative to the change in the bottom 50% of zip codes by population density. Similarly, home price growth in CBDs have realized losses of around 15 percentage points compared to changes in such low-density zip codes. Migration patterns as measured by the USPS show a similar pattern of reallocation. CBDs of the top 12 US cities have seen net population and business outflows cumulating to about 15% of their pre-pandemic levels. In contrast, the bottom 50% of zip codes by density have gained about 2% of their pre-pandemic stock for population and businesses.

This study provides a useful baseline of the short-term effects of work from home.  The authors point out that their findings don’t necessarily apply to all metros, just the 12 largest.  They also show that the migration effect is largely within metropolitan areas, rather than across them:  to the extent people are moving as a result of increased work from home, it is primarily from the city center to the suburbs of the same metropolitan area.

Nicholas Bloom and Arjun Ramani, The donut effect of Covid-19 on cities, Centre for Economic Performance, Working Paper No. 1793, September 2021, https://cep.lse.ac.uk/pubs/download/dp1793.pdf

In the News

City Observatory director Joe Cortright is quoted in The Oregonian’s coverage of the proposed Interstate Bridge Replacement Project (in reality, a $5 billion, five-mile long freeway widening project).

The Week Observed, November 18, 2022

What City Observatory did this week

The Rose Quarter’s Big U-Turn: Deadman’s Curve?  The redesign of the I-5 Rose Quarter project creates a hazardous new hairpin off-ramp from Interstate 5.  This supposed “safety” project may really be creating a new “Deadman’s Curve” at the Moda Center.  A key part of the project’s re-design is moving an off-ramp about a quarter mile south, but in doing so, the route now requires southbound freeway traffic to make a 180 degree (really 210 degree) turn and travel northbound on local streets before connecting to arterials that lead to other destinations.  In addition to more vehicle travel and circuitous routing, the new hairpin off-ramp creates hazards for those not traveling in cars.

 For example, bike riders on Portland’s North Williams bikeway—one of the city’s busiest—will have to negotiate two back-to-back freeway ramps that carry more than 20,000 cars per day.  The $1.45 billion highway widening is advertised as a “safety” project, but it’s likely to make this part of Northeast Portland more dangerous for many travelers.

ODOT reneges on Rose Quarter cover promises.  In the 1950s, 1960s and 1970s, the Oregon Department of Transportation built three highways through Portland’s predominantly Black Albina neighborhood, destroying hundreds of homes and leading to catastrophic population loss.  Now ODOT is back, promising that a freeway widening project will somehow repair this damage because it will include “capping” three blocks or so over the freeway.  The caps have been advertised as a way to heal the neighborhood, but the project’s recently released Supplemental Environmental Impact Statement shows the agency is reneging on its glib sales pitch.

ODOT trumpeted “highway covers” as a development opportunity, falsely portraying them as being covered in buildings and housing—something the agency has no plans or funds to provide. The covers may be only partially buildable, suitable only for “lightweight” buildings, and face huge constraints.  ODOT will declare the project “complete” as soon as it does some “temporary” landscaping.  The covers will likely be vacant for years, unless somebody—not ODOT—pays to build on them. ODOT isn’t contributing a dime to build housing to replace what it destroyed, and its proposed covers are unlikely to ever become housing because they’re too expensive and unattractive to develop.

In the News

Bike Portland featured City Observatory’s analysis of the I-5 Rose Quarter project in its story on the release of the Oregon Department of Transportation’s Supplemental Environmental Impact Statement.  The public will have until January 4 to provide comments; it appears ODOT has strategically timed the comment period to overlap the holidays in order to squelch public input on the project.


The Week Observed, and our regular features (Must Read and New Knowledge) will return in December.  Happy Thanksgiving to all.


The Week Observed, November 11, 2022

What City Observatory did this week

Risky bridges.  The Oregon and Washington highway departments are blundering ahead with a $5 billion plan to widen I-5 between Portland and Vancouver, and are making many of the same mistakes they made with the failed Columbia River Crossing a decade ago.  A key difference: last time, there was some adult supervision of the DOTs:  A series of independent, outside experts looked at the project’s engineering and financial details, and found serious errors that needed to be fixed.  The project was delayed because the “open-web” design proposed by state DOTs was found to be unbuildable.  Similarly, outside financial consultants found that toll levels would need to be doubled to pay for a portion of the project.  The revived project hasn’t been subjected to any independent, outside review, yet state and federal authorities are being asked to sign a virtual blank check for this project.  You’d insist on getting an independent, professional inspection before buying a used car or an old house:  Oregon and Washington legislators would be well advised to get similar advice before spending billions.

State DOT misrepresents the Coast Guard Bridge Permit Approval Process.  Two Oregon DOT officials testified to the Legislature that their proposed $5 billion I-5 Bridge Replacement was grandfathered under old Coast Guard regulations and didn’t have to comply with the more stringent rules adopted since the Columbia River Crossing debacle.  The Coast Guard has stated that the new project must meet the new rules, which include addressing the project’s navigation clearance prior to the Environmental Impact Statement.

While state highway officials have claimed that the Coast Guard’s June 2022 preliminary determination that the new bridge would have to provide a 178-foot vertical navigation clearance was just an early draft, Coast Guard regulations, which the US DOT has agreed to, make it clear that the determination is meant to exclude non-complying alternatives from inclusion in the EIS, and that the two highway agencies are proceeding “at their own risk” if they submit a design that doesn’t comply.

Must read

Mitigation for freeway widening through cities is lipstick on a pig.  We highlight another tweet-storm from Center for American Progress transportation expert Kevin DeGood.  He looks at how the Texas DOT is proposing to offset the damage its multi-billion dollar I-45 freeway project will do to Houston neighborhoods.  The freeway will demolish 1,000 homes, 300 businesses, two schools, and other facilities.  TxDOT claims it will remedy these harms with some public space (freeway covers, an “event lawn”).  While those are nice amenities, they don’t remedy the damage done to these neighborhoods, DeGood points out:

Neighborhood connectivity and Improving pedestrian and bike facilities is great. It’s also not a meaningful remedy for the harms of the NHHIP (the I-45 highway widening project).

Not only that, but the plan is to have the municipal government rather than the highway department bear the costs of what are advertised as “reparations.”  Again, DeGood:

And how does Central Houston envision paying for these improvements? Tax increment financing (TIF).
Question.
Why should locals shoulder the cost burden to remedy harms created by TxDOT? Shouldn’t TxDOT pay for TxDOT-caused harms?

Simply constructing amenities atop a freeway doesn’t undo the harm done by the wholesale demolition of neighborhoods, or the creation of car-dominated toxic environments around freeways.

The highway building lobby aims to capture the “Reconnecting Communities” program.  As we all know, urban highways have devastated city neighborhoods around the country.  A tiny part of last year’s infrastructure bill dedicated a billion dollars to trying to repair that damage.  It could be a source of funding to remove roadways and restore neighborhoods.  But around the country, state DOTs are scheming to use the money to widen or expand highways.  As Streetsblog explains:

Perversely, some agencies are even attempting to expand highways using the program’s funds — while using nominal equity improvements as political cover to justify the additional lanes.

Streetsblog calls out two projects as particularly egregious.  One is the Oregon Department of Transportation’s Rose Quarter I-5 freeway project, which could as much as double freeway traffic.  Another, in Tulsa, calls for a study that would “widen underpasses” for Interstate 244, the roadway that ultimately destroyed the city’s famous Black Wall Street district.  “Reconnecting Communities” shouldn’t turn out to be a crime victims compensation fund that gives the money to the criminals, but that is exactly what could be happening.

In the News

City Observatory’s analysis of the convoluted and dangerous new off-ramp design for the I-5 Rose Quarter project was featured in Bike Portland.

The Week Observed, February 3, 2023

What City Observatory did this week

Groundhog Day for Climate.  So you think you’re not Bill Murray in the classic “Groundhog Day”?  Oregonians, ask yourselves: are we any closer to seriously addressing the climate crisis than we were a year ago?  Greenhouse gas emissions are still increasing, chiefly because we’re driving more, and our policies are still subsidizing even more driving, including billions of dollars for freeway widening projects.
City of Portland raises big questions about the I-5 Rose Quarter freeway widening project (translated).  Last month was the deadline for comments on the supplemental environmental analysis for the proposed $1.45 billion I-5 Rose Quarter freeway widening project.  At City Observatory, we’ve documented the myriad problems with the project, including increasing pollution, worsening safety, failing to solve congestion, and further undermining neighborhood livability.
The City of Portland’s official comment letter echoes many of these concerns, but in a stilted bureaucratic dialect that may not be intelligible to all readers.  As a public service, City Observatory offers its translation.

Must Read

Why just electrifying cars is an environmental disaster. Curbed’s Alissa Walker points out that a one-for-one replacement of internal combustion vehicles with ever more massive electric vehicles poses serious environmental costs.  Vehicle sizes have been creeping ever upward, now topping two tons, and electric vehicles are heavier still than their fossil-fueled counterparts, due to the prodigious weight of batteries.  Most of the electricity used to power these vehicles is used not to move a few hundred pounds of human occupants and their cargo, but to transport the huge vehicles and their batteries.  Because batteries require substantial natural resources, especially lithium, which has to be mined, larger vehicles have a higher environmental impact.  The transition to EVs could be an opportunity to shift to smaller, more efficient vehicles, including e-bikes.  That would have a far smaller environmental impact, and lead to a much faster and surer reduction in greenhouse gas emissions.

Why are US roads so deadly?  An illuminating chart from the League of American Bicyclists shows how road safety in each of the 50 US states (red) compares to major nations around the globe (blue).

France, Germany, Britain, Japan and Canada all have per capita road death rates lower than any US state.  Only a handful of nations, all in Latin America, have per capita road death rates as high as those of typical US states.  The high-income, developed nations of the world all have vastly lower death rates.  It’s likely we have a lot to learn from the rest of the world about transportation safety.

New Knowledge

Black drivers are stopped more often for speeding and pay higher fines.  A new study looks at the variation in traffic enforcement by race.  It uses a novel source of data that lays bare the racial bias in policing.

The authors use data from the automated tracking of Lyft drivers in Florida, which records location and vehicle travel speeds.  As a result, after controlling for a variety of factors, including vehicle type, time of day, roadway, and the actual speed the vehicle was traveling, the study can tease out differences in the likelihood that a driver will be cited for speeding and, once cited, how high a fine they will pay.

The bottom line of the study, unsurprisingly, is that minority drivers are more likely to be stopped, and once stopped pay higher fines than white drivers.  The data show that minority drivers are about 24 to 33 percent more likely to be cited and pay fines that are about 23 to 34 percent higher than white drivers.

Aside from the headline finding, there are other interesting details.  The Lyft data showed that being cited for speeding actually led to a change in driver behavior.  Drivers were much less likely to exceed the speed limit in the eight weeks after getting a ticket than they were before.  The effect was essentially the same for white and minority drivers.


Pradhi Aggarwal, Alec Brandon, Ariel Goldszmidt, Justin Holz, John A. List, Ian Muir, Gregory Sun, and Thomas Yu, “High-frequency location data shows that race affects the likelihood of being stopped and fined for speeding,” Lyft, December 9, 2022


The Week Observed, January 27, 2023

What City Observatory did this week

Driving stakes, selling bonds, overdosing on debt.  The Oregon Department of Transportation is following a well-trodden path to push the state toward massive highway expansion projects. For example, Oregon DOT has kicked off the half-billion-dollar I-205 project with no permanent funding in place, instead relying on short-term borrowing.  It’s planning the same for the $1.45 billion I-5 Rose Quarter project.
Taking a page from the Robert Moses playbook, they’re planning to drive stakes on three billion-dollar-plus highway expansions, and then issue bonds to pay their costs.  The bonding will obligate the state to pay off these projects ahead of every other transportation priority for the next several decades.  Debt is a powerful drug, and Oregonians should be wary of the financial burden the highway builders are pinning on them.

Must Read

Induced Distance:  Earlier this month, we and others criticized the latest iteration of the congestion cost scare-mongering reports, this one from Inrix. These fictional and inflated congestion cost estimates are used as a rationale for road-widening projects, which are inherently self-defeating, because of induced demand: building more roadway capacity simply induces more driving.  But it’s actually worse than that. Lloyd Alter, writing at Treehugger, adds another dimension to these critiques, pointing out that building more roads also leads to greater distances between destinations.  The more land we use for roads, parking lots, yards, and setbacks, the further away are all of the places we want to travel to.  Alter credits Bart Hawkins Kreps with coining the term “induced distance”:

It’s widely recognized that if you keep adding more traffic lanes, there will be more traffic—that’s ‘induced demand.’ Just as important is that as we clear space for wider roads and more parking, we push destinations farther apart—that’s ‘induced distance.’ As bad as induced distance is for drivers, it’s even worse for the pedestrians who now need to walk much farther to get around in their formerly compact cities.

This is one of those instances in which a picture is worth a thousand words, from Twitter:

That’s true in a physical sense, but to add to it, we also need to acknowledge that car-centric transportation systems tip the economic balance in favor of large scale uses and penalize smaller scale firms (for example, enabling big box retail, and killing off local, mom and pop stores).

How inclusionary housing makes housing less affordable.  San Francisco has some of the most expensive housing in the United States, and in a seeming paradox, some of the nation’s most stringent affordable housing regulations. The city requires developers of new apartment buildings to set aside a portion of new apartments to be rented at rates affordable to low and moderate income households.  But these requirements make much new development uneconomical.

A new study prepared for the city by real estate consultants at Century | Urban shows that for rental housing, the city’s affordability requirements make development uneconomical for 19 of 20 studied pro formas, and the one in twenty that came close to penciling out didn’t comply with the city’s affordability standard.  Housing advocates like to take credit for the relative handful of affordable units that do get built under inclusionary requirements, but what they and others miss is the vast amount of housing that simply doesn’t get built at all because of these requirements, and this constraint on housing supply actually makes affordability worse–for everyone.

There’s no such thing as affordable housing.  Much housing discourse is an argument over whether we should be building affordable housing or market rate housing.  That misses the critical point that affordability isn’t determined by the characteristics of a particular house or apartment, but by the overall balance between supply and demand in a city.  What determines whether a particular home is affordable has to do with how many homes we have.  As Daniel Herriges of Strong Towns explains:

Developers don’t build affordable apartments or unaffordable apartments. They build apartments. Some are, no doubt, nicer than others, but this alone doesn’t make them expensive or inexpensive. That only happens when those apartments are sold or rented. At that point, the price is determined in a transaction that is influenced by market forces, public policy, or both.

And here’s a tangible example.  Two similar Victorian homes (one in Scranton, another in San Francisco) sell for wildly different prices.

The Scranton one is, by any standard, pretty affordable.  But that’s almost entirely because it is in a market where supply has more than kept up with demand.

We’ve made a similar case looking at the variation in prices among the millions of ranch homes built in the US in the 1950s and 1960s.  The reason some are affordable and others are not, as Herriges points out, has little to do with the structure, as built.

In the News

Pennsylvania’s Center Square cited City Observatory’s analysis that subsidies for electric car purchases tend to disproportionately benefit high income households.

Note:  An earlier version of this post mistakenly identified the location of the Scranton home.  Our apologies!


The Week Observed, March 10, 2023

What City Observatory did this week

Why does a $500 million bridge replacement cost $7.5 billion?  For the past several years, the Oregon and Washington highway departments have been pushing for construction of something they call the “Interstate Bridge Replacement” project, which is a warmed-over version of the failed Columbia River Crossing.  The project’s budget has ballooned to as much as $7.5 billion.  New project documents have come to light showing that the “bridge replacement” part of the Interstate Bridge Replacement only costs $500 million. So why is the overall project budget $7.5 billion?

We dig into the limited information provided by the state DOTs.  There are two big answers to the question of why a $500 million bridge ends up costing $7.5 billion.  The shortest and simplest answer is that it is really a massive freeway-widening project, spanning five miles and seven interchanges, not a “bridge replacement.”  A longer (and taller) answer is that the cost of the river crossing is inflated because of the need to raise the roadway to clear a 116-foot navigation channel.  This fixed span requires half-mile long elevated viaducts on both sides of the river, and the need to have interchanges raised high into the air makes the project vastly more complex and expensive.  As then-Congressman Peter DeFazio observed of the project some years ago, “I kept on telling the project to keep the costs down, don’t build a gold-plated project . . . They let the engineers loose, . . . it wasn’t overseen: they said solve all the problems in this twelve-mile corridor and they did it in a big engineering way, and not in an appropriate way.”

Must Read

USDOT and TxDOT “settle” Houston I-45 civil rights complaint.  It’s well established that freeway construction has been an engine of segregation in US cities.  Belatedly, the US Department of Transportation is acknowledging this fact, and has spent the last two years investigating whether the Texas Department of Transportation’s plan to widen I-45 through Houston complies with civil rights laws.  USDOT has entered into an agreement with TxDOT, which includes allocating $30 million to building affordable housing, building trails and greenspaces, monitoring pollution, and (surprise!) holding more meetings.  The housing money and amenities are no doubt welcome, but they hardly undo the damage done by this freeway and its pending expansion.  Houston Public Media quoted Tiffany Valle, an organizer with Stop TxDOT I-45:

“The agreement that TxDOT and the FHWA have voluntarily agreed upon, it’s really just more of the same,” Valle said. “They are continuing to allow the destruction of Black and brown communities, something that has been done for decades.”

Opponents of the project won’t be mollified by these superficial steps.  They will fight on, as they should.

America Walks grades the USDOT’s “Reconnecting Communities” grants (https://americawalks.org/reconnecting-communities-grant-awardees/).  America Walks calls out some of the positive steps, like grants to advance freeway removal efforts in Buffalo, Long Beach and Kalamazoo.  But at the same time, USDOT failed to provide funds to some other efforts that could clearly benefit from these resources, like the Claiborne Avenue Alliance in New Orleans.  And some of the projects simply provide state highway departments with a generous dose of greenwash for clearly egregious highway widening projects, as in Austin.  Too much of the money, in fact, simply went to state highway agencies, which historically caused most of the problems this program aims to solve, and which in many cases are half-hearted or duplicitous in their support for really rethinking the effects of urban highways.  Too seldom has USDOT taken the opportunity to empower local groups to fundamentally challenge the highway orthodoxy. They write:

Only three of the 39 winning planning grant applications have a community-based organization as their lead applicant. If USDOT is serious about developing reconnecting communities solutions tailored to local needs, it should aim to fund more community-based organizations so they can bring information and professional expertise from outside the normal channels of infrastructure development that in the past haven’t served residents.

Pity the poor airports–victims of climate change.  Our friends at the Brookings Institution have a new report detailing the costs airports will have to bear to cope with the effects of climate change.  Many airports are built on or near floodplains, and are vulnerable to severe weather and flooding, which are likely to become more common with climate change.  It’s likely that they’re under-prepared, and more spending is needed to harden airports against these effects.  The Brookings Report makes a good case, but largely ignores one salient detail:  Airports themselves are some of the most carbon-intensive activities in any metropolitan area.

The business model of every airport hinges primarily on sources of revenue tied directly to greenhouse gas emissions: jet aircraft and parking garages.  We ought to have a national program that reflects the costs of climate adaptation back on big greenhouse gas emitters.  Air travel fees should not only pay for hardening airport infrastructure, but also for offsetting the damage that air travel-related emissions do to the rest of the economy.  Maybe if air travel paid these costs, other alternatives, like rail, would look more economically feasible.

New Knowledge

New housing helps drive down local food prices and improve product variety.  Over the past few years there’s been a spate of research confirming what economists have long held: building new housing in urban areas tends to hold down rents.  A new study from Montevideo looks at a related issue: the effect of new housing on the prices of goods in local stores.

It’s long been a trope of the gentrification debate that new development leads to the demise of long-established and perhaps prosaic local stores and their displacement by twee boutiques and overpriced shops.  But by focusing on prices actually charged in local markets, and counting the variety of goods available to local shoppers, this study from Fernando Borraz and his colleagues shows just the opposite: the construction of new housing tends to lower the price and increase the variety of products on offer in local stores.

The Borraz, et al., paper is an extension of a line of research from Jesse Handbury, who observed that residents of larger cities actually face lower prices for a market basket of goods and services, because cities provide a better fit between what consumers want and what’s actually available.  It’s also more proof for Ed Glaeser’s thesis about the “consumer city”: the competitive advantage of cities stems not simply from greater productive efficiency, but from the fact that they offer residents a more diverse (and less expensive) array of goods, services and experiences.

What this paper means in practice is that the new neighbors who live in the apartments that get built in your neighborhood provide the customer base to attract more merchants, and their growing numbers lead stores to compete harder for their trade.

Fernando Borraz, Felipe Carozzi, Nicolás González-Pampillón and Leandro Zipitría, “Local retail prices, product varieties and neighborhood change,” Centre for Economic Performance, Discussion Paper No. 1822, January 2022, ISSN 2042-2695


The Week Observed, March 17, 2023

What City Observatory did this week

Why does a $500 million bridge cost $7.5 billion?  For almost two decades the Oregon and Washington highway departments have been saying they want to replace the I-5 bridges over the Columbia River connecting Portland and Vancouver.  Late last year, they announced that the total cost of the project could run as high as $7.5 billion.  But the project’s detailed comparison of different bridge types shows that the actual structure crossing the river will cost only about $500 million.  Here’s the detail from the project’s “river crossing” report:

The reason the project is so expensive is not the bridge, but the fact that the two agencies plan to rebuild seven interchanges on either side of the river, widen the freeway to a dozen lanes, and build almost a mile of elevated viaducts to connect their high river crossing to the existing highway.  As then-Congressman Peter DeFazio observed a decade ago, the “problem was thrown out to engineers, it wasn’t overseen: they said solve all the problems in this twelve-mile corridor and they did it in a big engineering way, and not in an appropriate way.”  If this project were simply about replacing the actual bridge, rather than widening the freeway and rebuilding interchanges, it would be vastly cheaper.

Civil rights or repeated wrongs:  Houston moves ahead with a $10 billion I-45 Freeway.  We publish as a guest commentary an analysis prepared by the Center for American Progress infrastructure expert Kevin DeGood, who looks at the aftermath of a negotiated settlement between FHWA and TxDOT to a civil rights complaint against the giant freeway widening project.

As DeGood points out, the settlement offers mere procedural window dressing, largely just masking past damage, and setting the stage for another round of neighborhood destruction by freeway construction.

Must Read

More flat-earth thinking from highway engineers.  Who knew that building more roads was the key to reducing greenhouse gas emissions?  In a guest opinion for Greater Greater Washington, Bill Pugh takes a close look at the Washington, DC, region’s bold climate goals, contrasting them with its stated “build more highways” transportation strategy.  Throughout the region, highway agencies routinely repeat the discredited claim that they’ll reduce greenhouse gas emissions by reducing the amount of time cars spend idling in traffic.  As Pugh points out, the thousand or so additional lane-miles of roadway the region’s jurisdictions plan to build will generate as many as 3 to 4 billion additional miles of auto travel every year—more than wiping out any gains from “less idling.”

Scapegoating (the wrong) greedy investors for housing unaffordability.  The indispensable Jerusalem Demsas once again takes out the trash on a pet housing market theory: blaming hedge funds and investment bankers for housing unaffordability.  Demsas digs deep into the statistics that people use to blame big finance for invading housing.  The numbers focus on recent sales, rather than the entire housing stock, and, more importantly, conflate “institutional investors” with all investors, a category that includes a lot of mom-and-pop homeowners who rent out a house or two.  Institutional finance represents only a small share of new purchases, and has even peaked in some markets.  Overall, Demsas concludes that the flow of institutional capital into single-family home purchases is an effect, rather than a cause, of unaffordability:

“Institutional Investors” are not why rents are so high or why homeownership is out of reach for so many. Investors are not driving the unaffordability; they are responding to it. Many different investors are all flocking into the housing market; what is most relevant is the fundamental reason they are all being drawn there. Housing is primarily unaffordable in this country because of persistent undersupply. In fact, institutional investors are entering the single-family-home market precisely because supply constraints have led to skyrocketing prices.

The imbalance of demand and supply is really what’s behind the unaffordability, and scapegoating institutional investment may be politically popular, but it misses the mark.  And, as we’ve pointed out at City Observatory, institutional investors are pikers in the housing market: it is incumbent owners who racked up literally trillions of dollars in capital gains in the last few years due to growing housing unaffordability.

New Knowledge

PM 2.5 pollution across the United States.  There’s a growing awareness that fine particulates (pollution smaller than 2.5 microns) cause significant health effects.  These tiny particles can be inhaled deeply into the respiratory system, and prolonged exposure leads to higher rates of asthma and other respiratory diseases.

A new mapping project published by The Guardian looks at the geographic variation in PM 2.5 (particulate matter smaller than 2.5 microns) across the US.

Maps are available for the entire nation and show the hot spots for the concentration of these dangerous small particles.  Here, for example, is the map of the St. Louis region.

The Guardian has looked at the pattern of hot spots compared to the demographics of local neighborhoods, and–unsurprisingly–finds that there’s a strong correlation between race, ethnicity and exposure to fine particle pollution.  People living in predominantly Black and brown neighborhoods tend to be exposed to much higher levels of PM 2.5.

. . . across the contiguous US, the neighborhoods burdened by the worst pollution are overwhelmingly the same places where Black and Hispanic populations live. Race is more of a predictor of air pollution exposure than income level, researchers have found.

The Guardian’s maps and findings are based on research from scientists at the University of Washington and other institutions that have developed a sophisticated model of the generation and transmission of these particles.  Data are from the years 2011 to 2015.

Erin McCormick and Andrew Witherspoon, “America’s dirty divide: US neighborhoods with more people of color suffer worse air pollution,” The Guardian, March 8, 2023

In the News

Todd Litman of the Victoria Transport Policy Institute cited, at Planetizen, our parable of how Louisville showed the way to solve traffic congestion.

The Week Observed, March 23, 2023

What City Observatory did this week

Oregon’s transportation finance in crisis:  Testimony to the Joint Ways and Means Committee.  On March 16, City Observatory’s Joe Cortright testified to the Oregon Legislature’s budget-writing committee about the financial crisis confronting the state’s transportation agency.  The Oregon Department of Transportation’s traditional sources of revenue are collapsing, and will certainly decline further in coming years.  The agency is failing to maintain existing roads, and has a huge backlog of maintenance, safety, seismic and other needs that continue to grow.  In the face of declining revenues and deferred maintenance, the agency is embarking on an unprecedented spending spree for expensive megaprojects.

ODOT has shown no ability to manage project costs, with every major project incurring massive cost overruns.  The agency is moving to start construction on these projects and commit the state to paying for them without a financial plan in place.  It claims it will use toll revenues to pay for megaprojects, but has no experience collecting or accurately estimating tolls.  It is planning to take on billions of dollars in debt backed by the promise of tolls. It has used short-term borrowing—the government version of a payday loan—to get projects started while avoiding the independent, investment grade analysis that will be required to get long-term financing.  Repaying the debt incurred for these projects will take legal precedence over all other state transportation priorities, leading to further cuts in maintenance and repair, and jeopardizing every other capital construction project in the state.

The Legislature needs to inject some prudence into transportation finance by requiring a “fix it first” policy, telling ODOT to live within its means, right-sizing bloated megaprojects, and securing independent expert financial advice.

Why localism is the cause, not the solution, of our affordability problems.  Bruce Katz, author of “The New Localism” has offered a call for a national commission to make recommendations on how to solve our housing affordability problems.  While it makes sense to have such a national conversation, and Katz puts his finger on many of the key symptoms of the housing crisis, all the evidence points to an excess of localism, in terms of our obsequious deference to “local control” in permitting new housing, as the principal cause of the problem.

Must Read

Climate and the land use imperative:  An interview with Jenny Schuetz.  The latest Intergovernmental Panel on Climate Change (IPCC) report is out, and it continues to be grim.  We’re well on our way to a hotter, more turbulent world.  The Triangle Blog from North Carolina interviewed Brookings scholar Jenny Schuetz about the report’s recommendations and some key insights.  She made a couple of trenchant observations.  She noted that the IPCC report specifically calls out a series of related land use and housing policies as key strategies for reducing greenhouse gas emissions, including more density, greater infill development, more robust transit service and a greater emphasis on building bikeable, walkable neighborhoods.

In addition, Schuetz makes a critical point about how climate change has been framed largely as an individual and moral issue, rather than a collective social issue.  She says:

Americans have been misled into thinking that fighting climate change is an individual choice, about our own personal behavior (guilting people over recycling is a great example). . . . For too long, we’ve bought into the idea that each person individually can choose to live a climate virtuous life, without having policies and market structures in place to guide everyone’s actions. That’s not going to get the job done, and it’s a distraction from the collective action we desperately need.

Environmental Groups finally start talking about reducing VMT.  Writing at Slate, David Zipper notes an encouraging trend among environmental advocates.  For a long time, most environmental organizations have chiefly endorsed the “technical fix” notion of simply electrifying cars as a way to reduce transportation greenhouse gas emissions.  The limits of that strategy are becoming more and more apparent:  heavy electric vehicles still require massive amounts of energy to move, and the continued (and perhaps expanded) dominance of auto-dependent development patterns would lead to increased driving and crash-related injuries and deaths, while doing nothing to address fine particle emissions from brakes and tires.  For years, supporting vehicle electrification was the politically low-hanging fruit, as it didn’t challenge auto-dependent orthodoxy, and created powerful allies among automakers and their workers.  But with substantial incentives for EV purchases baked into the Inflation Reduction Act, environmental groups are now looking harder at land use and transportation alternatives.  Zipper notes that both the Union of Concerned Scientists and the Sierra Club have made moves in this direction.  He writes:

 . . . it’s encouraging that those who have waged often-lonely fights for transit, biking, and walking are poised to gain powerful new allies. “We legitimately need to do mode shift to achieve our climate goals,” said [Sierra Club’s  Katherine] Garcia. ”That’s the message we are sharing regarding the Sierra Club’s national priorities.” You weren’t likely to hear something like that from her organization’s leaders twenty years ago.

For too long, environmental groups have been the dogs that barely bark, if at all, during debates about the calamitous damage inflicted by American autocentricism. For the planet’s sake, let’s hope everyone hears them now.

One rallying point:  a $1,500 rebate for e-bike purchases, which passed the House, but was left out of the final Inflation Reduction Act, seems like a way to fight climate change, assist households of modest means, and reduce VMT.  It’s a first step in the right direction.

New Knowledge

The economic benefits of freeway removal.  Around the nation, community and neighborhood groups are looking for opportunities to remove urban freeways as a way of revitalizing damaged cities.  Their work has gotten impetus from the $1 billion allocated to the Reconnecting Communities program included in the bipartisan infrastructure law.  A new study from Duluth summarizes the economic case for freeway removals.

Duluth is studying converting its existing waterfront I-35 freeway to restore better access to the waterfront.  A study from the University of Minnesota, Duluth, looks at what the city can learn from freeway removals in other similar cities.  They present a series of case studies of mid-sized US cities that have removed or substantially reduced their urban freeway footprint in the past decade or more.  Getting rid of freeways and urban traffic has led to new investment in the nearby area, as cataloged here.

In addition, removing freeways creates construction jobs just as building them did.  In the case of the proposed Duluth project, the authors estimate that the conversion project will create 450 construction jobs and nearly $30 million in labor income.

Monica Haynes, John Bennett, Gina Chiodi Grensing, Erin Hopkins, Kenny Nadeau, and D’Lanie Perry, Economic Effects of the Potential I-35 Conversion in Downtown Duluth (University of Minnesota Duluth, 2023)

 

The Week Observed, March 31, 2023

What City Observatory did this week

What are they hiding?  Oregon and Washington are being asked to spend $7.5 billion on a giant bridge:  Why won’t anyone show pictures of what it would look like?  The Oregon and Washington highway departments are using an old Robert Moses trick to make their oversized bridge appear smaller than it really is. The bridge will blot out much of the reviving waterfront and downtown in Vancouver, and put Hayden Island in the shadow of a half-mile long viaduct.

The IBR has distributed misleading and inaccurate images of the proposed bridge, attempting to make it look smaller. The agency is spending $1.5 million to create a “digital twin” computer model of the IBR, but is keeping it secret to avoid public scrutiny of its design. Computer visualizations, complete with human-scale animations, are cheap and common for construction projects, such as Vancouver’s proposed waterfront public market–but ODOT and WSDOT have steadfastly refused to provide such visualizations for the IBR.

The Color of Money:  How ODOT is using flexible federal funds to bail out the state highway fund, and permanently cut funding for transportation alternatives.  In Oregon, there are two colors of transportation money:  state highway funds that, thanks to an arcane (and misinterpreted) constitutional provision, supposedly can only be used to build and repair roads, and federal funds, which are highly flexible and can be used for transit, biking and walking.  ODOT has hit on a gimmick to divert a billion dollars of those flexible funds to pay off a liability of the State Highway Fund (bringing roads into compliance with the Americans with Disabilities Act).

The ramps and sidewalks needed to comply with ADA can and should be paid for with state highway funds, but ODOT will instead use flexible federal dollars, and will borrow $600 million and commit future federal funds (plus hundreds of millions in interest payments) to paying back the debt, permanently precluding future decisions to use those funds for transportation alternatives.

Must Read

Writing at Vox, Harvard’s David Zipper contemplates what it will take to avoid a transit doom loop.  The big decline in transit ridership during the pandemic, the contraction of the office market in the face of continuing work-at-home trends, and the likely disappearance of pandemic-related bailout funds seem to be posing an existential challenge for transit agencies.  Zipper acknowledges the challenge and offers some suggestions.  First, he suggests, this may not be the time to be thinking about eliminating fares—while this might have some apparent equity benefits, the costs, in terms of reduced service, would cause far greater harms, especially now.  We probably should be spending more on transit, but “free” transit that doesn’t serve people or destinations, or doesn’t come often enough, isn’t equitable.  Instead, Zipper argues, transit agencies should be focusing on maintaining, and where possible improving, service, especially frequency.

Service improvements like these are indispensable, but some of the other priorities transit agencies are currently balancing are not. For instance, with ridership still depressed, now seems like a good time to deprioritize expensive capital projects like vehicle purchases and rail expansions, and reallocate the money toward maintenance that makes service more reliable and frequent.

The planet isn’t on track to meet its climate goals–and growing transportation emissions are the major culprit.  The International Energy Agency (IEA) tracks carbon emissions by major sector of the global economy.  In the wake of the Covid-19 pandemic, transportation greenhouse gas emissions have rebounded sharply.

The IEA’s summary makes it clear that transport is actually growing again, and that we have a long way to go to get on the path to net zero emissions.  They write:

In 2021 global CO2 emissions from the transport sector rebounded, growing by 8% to nearly 7.7 Gt CO2, up from 7.1 Gt CO2 in 2020, as pandemic restrictions were lifted and passenger and goods movements began to pick up following their unprecedented decline in 2020. Even with anticipated growth in transport demand, following the Net Zero Scenario requires transport sector emissions to fall by about 20% to less than 6 Gt by 2030.

While the IEA calls for vehicle electrification to put a dent in carbon emissions, its report also highlights the importance of urbanism in reducing car travel, and calls for more transit-oriented development.
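As a quick sanity check (ours, not the IEA’s), the arithmetic behind the quoted figures is easy to verify:

```python
# Verify the IEA transport-emissions figures quoted above (illustrative check only).

emissions_2020 = 7.1  # Gt CO2, global transport sector, 2020
emissions_2021 = 7.7  # Gt CO2, global transport sector, 2021
target_2030 = 6.0     # Gt CO2, Net Zero Scenario ceiling for 2030

# Year-over-year rebound: (7.7 - 7.1) / 7.1
rebound = (emissions_2021 - emissions_2020) / emissions_2020

# Reduction needed from the 2021 level to get under the 2030 ceiling
required_cut = (emissions_2021 - target_2030) / emissions_2021

print(f"2020-21 rebound: {rebound:.1%}")          # close to the 8% the IEA reports
print(f"Cut needed by 2030: {required_cut:.1%}")  # consistent with "about 20%"
```

The rebound works out to about 8.5 percent, and getting below 6 Gt from the 2021 level requires a cut of roughly 22 percent, consistent with the report’s rounded figures.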

New Knowledge

Parking:  Lots.  Between a tenth and a third of every major downtown area in the United States is given over to car storage.  A new study published by the Parking Reform Network has detailed maps showing the amount and location of parking in the cores of the nation’s largest cities.  The maps are the brainchild of Chris Carpenito, who harnessed data from Open Street Maps to identify the land area given over to surface parking in the 50 largest cities.

Here’s a typical map showing downtown Portland, Oregon.  Red areas are dedicated to parking.  About 11 percent of Portland’s central city is given over to parking.

The study’s findings are summarized in a “Parking Score,” which effectively grades on a curve:  larger, more populous cities are expected to be denser and have less land dedicated to parking, while smaller cities tend to have more.  Here are the downtowns of the largest US cities with the lowest parking scores (meaning the least land area given over to surface parking).

For full details, see the maps of the 50 downtown areas, and the city parking score rankings at the Parking Reform Network.  The project’s methodology is described here.  This is extremely useful information that provides a new perspective on the way parking affects the urban landscape.

In the News

Clark County Today republished our article, “What are they hiding?” about the Oregon and Washington highway departments’ efforts to conceal the height and bulk of the freeway widening project they’re planning between Portland and Vancouver.

The Week Observed, April 7, 2023

What City Observatory did this week

IBR’s plan to sabotage the “moveable span” alternative.  The proposed $7.5 billion Portland area freeway widening project is supposedly looking at a moveable span option to avoid illegally impeding water navigation.  But state DOT officials are planning to sabotage the analysis of a moveable span option as part of the Interstate Bridge Project. The Coast Guard has said a replacement for the existing I-5 bridges would need a 178 foot navigation clearance.  The highway departments want a fixed span with just 116 feet of clearance. The Oregon and Washington DOTs say they are going to study a “moveable span” as a “design option” but are plainly aiming to produce a costly design that simply grafts a lift-span onto their current bridge design.

A moveable span would enable a lower crossing, eliminate the need for lengthy viaducts, and reduce construction costs—but ODOT is refusing to design an option that takes advantage of these features. And the DOTs have completely ignored an immersed tube tunnel option, implying that the Coast Guard directed them to study the moveable span (which it didn’t). IBR staff have signaled they have no intention of seriously considering the moveable span, and are engaged in malicious compliance.

Wile E. Coyote crashes to earth: Inclusionary Zoning in Portland.  Portland’s inclusionary zoning requirement is a slow-motion train-wreck; apartment permits are down by sixty percent in the City of Portland, while apartment permitting has more than doubled in the rest of the region.

Inclusionary zoning in Portland has exhibited a Wile E. Coyote pattern:  apartment starts stayed high initially, until a backlog of grandfathered units got built.  Since then Portland apartment permits have plummeted.

Must Read

Traffic studies are junk science. There’s a whole web of pseudo-science underpinning the professions of transportation and land use planning in the United States.  Donald Shoup has famously debunked the statistical fraud in studies that purport to estimate parking requirements.  A new study summarized by Streetsblog makes a powerful case that a parallel requirement–transportation impact studies–is equally flawed.

It’s common in many development or permitting processes to require a traffic impact study–a statistical estimate of how many more trips (almost invariably automobile trips) will be “caused” by building a new shopping center or apartment complex or housing subdivision.  Cities use these impact studies to require developers to offset the impacts attributed to their development:  for example, by widening roads or paying for new traffic signals or other transportation improvements.  There’s a problem and a paradox.

The problem is that the statistical work used to estimate traffic generation is based on observations of environments that are not necessarily good predictors of every other development.  The amount of trip generation from a new greenfield mall in a Florida or Arizona suburb may be a poor guide to the added traffic associated with infill development in a dense urban setting.

The paradox is that the so-called remedy–expanding traffic capacity–creates a kind of perverse, self-fulfilling prophecy.  Cities end up taxing development to subsidize more infrastructure that encourages more travel.  As study co-author Kenneth Stahl says:

“They’re being used to require traffic mitigations that only induce more driving. … You’re getting terrible policy outcomes, and they’re based on analyses that aren’t reliable at all.”

Finally, it’s worth noting that traffic impact studies seldom, if ever, consider the counterfactual:  what would happen to travel, particularly in a regional context, if more development isn’t added in a particular location. Particularly in urban settings, adding more apartments or more shopping opportunities might tend to put more people closer to common destinations, shortening auto trips, facilitating more biking and walking, and actually leading to reduced driving.

Housing’s “Missing Bottom.”  Over the past few years, the “missing middle” has been a popular slogan in housing debates, pointing to the paucity of smaller multi-family buildings that used to be common in America’s residential neighborhoods until the widespread adoption of exclusionary single family zoning.  Writing at the blog “Building the Skyline,” Jason Barr makes the case that we need to be thinking about the “missing bottom”–in this case the bottom along a different dimension, housing price and age.

Barr points out that the way we get most affordable housing in the US is that people move into housing that has depreciated as it has aged.  Careful economic studies have traced out the steady downward progression of the housing stock, which is generally most expensive when it is new (and most occupied by higher-income households), and which over time declines in price and is occupied by successively less well-to-do households.

Research and history suggest that we need to think about housing and cities not as static entities but as dynamic systems. Newly constructed middle-income housing not only benefits those in the middle class but is also the primary means by which low-income housing gets produced.

Because the way we get housing at the “bottom” of the market depends on this process of downward filtering, when we don’t build enough new housing at the top, the old housing doesn’t filter downward in price.  Barr presents a nice summary of some of the recent research on the subject (showing, for example, how, as higher-income people move into new housing, a chain of moves is triggered that creates vacancies for low-income households).  He also has some key policy recommendations, centered on increasing housing supply.

If we want to walk more, we need to do things differently.  Walking and urbanism advocate Jeff Speck has a seemingly innocuous-sounding column at Next City, observing the annual “Walk to Work” day.  Speck argues that rather than this kind of occasional, once-a-year effort, we need to think about how we change things so that walking is common the other 364 days of the year. He observes:

More than three quarters of us get to work by car. Most of us do this not because we want to, but because we have no good alternative. Seventy-five years of sprawl, highway building, and transit disinvestment have created a national landscape that makes car ownership an obligation for almost everyone. The automobile is no longer an instrument of freedom, but rather a bulky, expensive, and dangerous prosthetic device, a prerequisite to viable citizenship.

Speck has a succinct agenda:  Build housing near transit, reform parking and fight highways.  If there were more places to live near frequent transit, transit would work better, and fewer people would need to drive.  We have too much parking, thanks to a hidden and byzantine system of regulations and subsidies that generates more driving and car dependence.  And our ever expanding highway system simply generates more travel and more sprawl.  Walking is great, but once a year performative measures just underscore how much we need serious change.

 

The Week Observed, April 14, 2023

What City Observatory did this week

The case against the Interstate Bridge Project.  We offer 16 reasons why Oregon and Washington lawmakers should question the current plans for the proposed $7.5 billion I-5 freeway expansion project between Portland and Vancouver.  Here’s reason #10 (but click through to read all 16!)

10.  IBR traffic projections have been proven dramatically wrong:  They grossly over-estimate future traffic levels on the existing bridge, which is capacity constrained.  The CRC FEIS predicted I-5 traffic growth of 1.3 percent per year; actual growth was 0.3 percent per year through 2019.  The independent Investment Grade Analysis in 2013 showed that the forecasts overstated future I-5 traffic levels by about 80,000 vehicles per day, leading to the design of a grossly over-sized project.
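Small differences in assumed annual growth rates compound into enormous forecasting errors over a typical planning horizon.  A quick back-of-the-envelope sketch (the base traffic figure here is purely illustrative, not the actual I-5 count):

```python
# Illustration (ours, not the FEIS's): how a 1.3%/year growth assumption
# diverges from a 0.3%/year observed trend over a 20-year planning horizon.

def compound(base_traffic: float, annual_rate: float, years: int) -> float:
    """Project daily traffic forward at a constant annual growth rate."""
    return base_traffic * (1 + annual_rate) ** years

base = 130_000  # hypothetical daily vehicle count for illustration

forecast = compound(base, 0.013, 20)   # the FEIS growth assumption
trend    = compound(base, 0.003, 20)   # the observed growth trend

print(f"Forecast after 20 years: {forecast:,.0f} vehicles/day")
print(f"Trend-based after 20 years: {trend:,.0f} vehicles/day")
print(f"Gap: {forecast - trend:,.0f} vehicles/day")
```

At 1.3 percent a year, traffic grows about 29 percent over two decades; at 0.3 percent, only about 6 percent, which is how a modest-sounding rate assumption produces a forecast tens of thousands of vehicles per day too high.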

 

Put a bird on it:  Greenwashing highway expansions.  State highway agencies have their own carefully crafted logo for a program called “Planning and Environmental Linkages” which promises more environmentally friendly highway projects.  But when you look closely at what PEL actually entails, it’s clearly all performative process, with no substantive requirements.

As long as a highway agency intones gravely about the environment, it can do pretty much whatever it wants.  Case in point:  a recent “Planning and Environmental Linkage” report for a major highway in Georgia contains none of the following words:  “climate,” “pollution,” “greenhouse gas,” or “emissions.”  What you will find in the report is a prominent logo, with nary a car or truck to be seen.  Put a bird on it!

Must Read

Just another deadly stroad: How Complete Streets got co-opted.  The indispensable Chuck Marohn is in fine form with a new essay on how highway engineers and road departments have coopted, and largely eviscerated, the “Complete Streets” movement.  The concept behind complete streets–that we design our streets to provide safe and equal access to all people, regardless of the mode by which they travel–is sound.  In practice, however, roads that explicitly prioritize faster car movement and endanger cyclists and pedestrians can qualify as “award-winning” complete streets, thanks to the watered-down, mindless checklist approach that’s been adopted.

Marohn illustrates this with the example of Ager Road outside Washington DC.  It has literally won awards from professional road-building and engineering groups.  But the “improved” road is replete with dangerous design features like slip lanes and too-narrow bike lanes that endanger non-auto users. It’s painfully difficult to tell apart the “before” and “after” pictures of the project, because both are so auto-focused.  And the traffic data show that, post “Complete Streets” treatment, nearly 70 percent of drivers still exceed the posted speed limit.  Unsurprisingly, this “complete street” claimed yet another life.

The problem is that the advocates of complete streets essentially sold their souls to get their branded idea adopted.  Highway agencies and engineers were more than happy to adopt the mantle, and hand out congratulatory awards, while gutting the underlying concept, and using it to pretend that they were actually doing something to improve the safety and fairness of the transportation system.  Marohn writes:

Yet, a top-down strategy meant working within entrenched systems. It meant finding common cause with the very people who most fervently resisted their ideals. There was certainly more funding with this approach, along with greater access to power, but the ultimate cost of that success was having the core ideals of Complete Streets cast aside and tokenized.  We need you to wake up, Complete Streets advocates, and recognize that your work is being widely used for evil ends.

Highway engineers have debased and perverted many potentially meaningful terms like “multi-modal” and “pedestrian infrastructure.”  Co-opting complete streets is safety-washing, just as highway departments have engaged in woke-washing with phony equity claims, and green-washing with performative and meaningless “planning and environmental linkages.”

Freeways without futures.  The latest annual installment of the Congress for the New Urbanism report, Freeways without Futures, is now available.  The recent passage of the $1 billion Reconnecting Communities program as part of national infrastructure legislation has given added impetus to community efforts to repair the damage done by urban freeways through a combination of capping, removal, and boulevard conversions.  Here’s a list of this year’s candidates, and we hope this helps hasten their demise, and rebirth as contributors to the urban fabric, instead of destroyers of it.

  • Interstate 787, Albany, New York
  • Interstate 35, Austin, Texas
  • US 40 Expressway, Baltimore, Maryland
  • Interstate 794, Milwaukee, WI
  • State Highway 55/Olson Memorial Highway, Minneapolis, Minnesota
  • Interstate 94, Minneapolis-Saint Paul, Minnesota
  • Interstate 980, Oakland, California
  • State Route 99, Seattle, Washington
  • Interstate 244, Tulsa, Oklahoma
  • US Route 422, Youngstown, Ohio

New Knowledge

Traffic congestion reduces vehicle miles traveled.  We tend to view traffic congestion as an unmitigated bad.  But in the presence of traffic congestion, we change our behavior.  A new study looking closely at travel survey data from the Seattle area shows that people who live in more congested parts of a metro area tend to own fewer vehicles and drive fewer miles.

The study looks at how a broad range of factors influences travel behavior, including commonly understood variables like density, design, diversity and destinations (some of the so-called “D” factors).  Its core finding is that after controlling for all of these other factors, people who live in more congested areas–as measured by a kind of travel time index computed from Google Maps travel time estimates–tend to drive less than their counterparts in less congested locations.  Specifically, each unit increase in the delay score reduces the number of household trips by 16 percent and total vehicle miles of travel by about 12 percent.

The authors conclude:

. . . travel delay, a measure of congestion, is associated with fewer household vehicles, fewer vehicle trips, and lower VMT. These are beneficial to environment and energy conservation. Congestion might also be seen as an adversary of density because congestion is more likely, though not always, to exist in areas with high density, a key paradox at the center of the land use policy debate. However, congestion can coexist with density as demonstrated by Mondschein and Taylor (2017). While “delay” is treated as a built environment variable in this study, it is a measure of the “mobile” feature of the built environment. This measure differs from other land use density and diversity variables that quantify the “fixed,” long-term features of the built environment. The findings about the effects of travel delay reinforce the notion that travel delay is an important constraint of travel behavior and should be considered along with other “D” factors

In a sense, this is the flip side of our understanding of induced travel.  We know that when we make travel more convenient and faster, vehicle miles traveled increase.  It shouldn’t be a surprise that when an area is more congested, people respond by driving less. On the surface, that might seem to imply that this lessened driving is entirely a sacrifice.  Not necessarily.  Congested places also tend to have more population density and a richer and more varied set of destinations and services.  So even if you drive less, it doesn’t mean that you’re giving up anything other than the amount of driving you do.  And, just as with induced travel, the implication is that measures that reduce traffic congestion are likely to increase the number of trips people take, and the distance they drive.

This is a particularly important finding for transportation planning. Most existing traffic models assume that traffic congestion influences mode choice or route choice, but has essentially no effect on the number of trips or the number of miles a household drives.  That’s why state DOT travel forecasts routinely predict absurd and impossible levels of traffic growth and congestion.  But as congestion increases, people adapt their behavior and travel less, taking fewer and shorter trips–something the traditional four-step model fails to forecast.
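The reported effect sizes can be sketched as a simple adjustment.  This sketch assumes (our assumption, not necessarily the paper’s model specification) that the per-unit percentage reductions compound multiplicatively across units of the delay score:

```python
# Sketch of the reported effect sizes from Sardari, Li & Pouladi (2023).
# Assumption (ours): per-unit percentage effects compound multiplicatively.

TRIP_EFFECT = 0.16  # reported reduction in household trips per unit of delay score
VMT_EFFECT = 0.12   # reported reduction in household VMT per unit of delay score

def adjusted_travel(base_trips: float, base_vmt: float, delay_units: float):
    """Apply the reported per-unit reductions for a given delay-score difference."""
    trips = base_trips * (1 - TRIP_EFFECT) ** delay_units
    vmt = base_vmt * (1 - VMT_EFFECT) ** delay_units
    return trips, vmt

# Hypothetical household making 10 trips/day and driving 40 miles/day,
# in an area whose delay score is 2 units higher than a comparison area:
trips, vmt = adjusted_travel(10, 40, 2)
print(f"Trips: {trips:.1f}/day, VMT: {vmt:.1f} miles/day")
```

Under these assumptions, a two-unit difference in congestion cuts trips by roughly 29 percent and VMT by roughly 23 percent, which is exactly the kind of behavioral response the conventional four-step model ignores.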

Reza Sardari, Jianling Li and Raha Pouladi, “Delay: The Next “D” Factor in Travel Behavior?” Journal of Planning Education and Research, 2023. https://doi.org/10.1177/0739456X231154001

Hat tip to Michael Brennis at SSTI for flagging this article:  Read his take here.

In the News

Clark County Today republished our commentary, “The Case Against the Interstate Bridge Replacement.”

The Week Observed, June 30, 2023

What City Observatory did this week

Scratch one flat top!  That was the famous cry of US Naval aviators when, early in World War II, they chalked up their first victory, sinking the Japanese aircraft carrier Shoho.  Portland’s freeway fighters, who’ve been battling for years against the multi-billion dollar expansion plans of the Oregon Department of Transportation, marked their first big victory this week, as ODOT cancelled plans for the $450 million I-205 freeway widening south of Portland.  ODOT staff finally conceded what freeway opponents have been saying for years:  the agency simply can’t afford the project, and besides, it won’t reduce congestion.

The I-205 Phase 2 project is dead—although in bureaucratic parlance it’s merely “indefinitely postponed.”  Also in serious jeopardy is the I-5 Rose Quarter Freeway widening, a project whose cost has ballooned to $1.9 billion and for which Oregon DOT has no clearly identified source of funding.  Portland’s freeway fight will continue.

Must Read

Americans Understand Induced Travel, even if highway departments don’t.  Upton Sinclair’s  axiom “It is difficult to get a man to understand something, when his salary depends on his not understanding it” holds with a vengeance for highway engineers and boosters. But most Americans, who aren’t in the thrall of the highway industrial complex, now recognize that round after expensive round of highway construction has done nothing to reduce congestion.  Transportation for America’s national survey asked a representative sample of Americans a battery of questions related to traffic, transportation, sustainability and public policy.  We want to highlight one key finding:  A clear majority of Americans now recognize that widening highways is a futile and costly approach to transportation.

About two-thirds of respondents agreed strongly or somewhat that highway expansion is ineffective in reducing traffic congestion.  More than three-fifths of respondents agreed that highway expansion is a waste of taxpayer money.  Our friends at Seattle’s Urbanist have a terrific write-up of the report as well.

Rule of the Road:  Priority for Pedestrians.  Paris Mayor Anne Hidalgo has made prodigious strides in re-shaping transport in Paris. As she says:

Faced with the climate emergency, we are adapting Paris and radically changing the way we get around. Parisians have understood this well! In record time, they massively appropriated the cycle paths. The results are there: in 10 years, car traffic has decreased by 40%, and pollution by 45%. Paris now has more than 1,120 km of cycle paths, compared to 200 km in 2001.

That’s led to some complaints that the city has become more challenging for those on foot.  Hidalgo is unequivocal:  Priorité absolue pour les piétons:  Absolute priority for pedestrians.  The city has announced new measures to create pedestrian zones around schools, to better enforce traffic laws protecting pedestrians, and to eliminate cycling on sidewalks.

Making cities work well requires a radical re-thinking of transportation priorities.  Bold cities like Paris are showing that positive change is possible.

New Knowledge

More evidence for induced travel.  Add another study to the growing pile of literature documenting the phenomenon of induced travel—widening roads, ostensibly to reduce congestion, simply encourages more people to travel on the expanded roadway.  This latest evidence comes from England, where motorway authorities widened a stretch of the key M1 motorway from three lanes in each direction to four by allowing vehicles to drive on the road’s paved shoulder.

A benefit-cost analysis prepared for the project predicted that it would produce significant benefits to business and commuter travel by speeding traffic and reducing travel times.  But instead, as always happens, the expanded roadway simply attracted more trips.  As the study concludes:

. . . the economic benefits to longer distance business users anticipated from road widening are negated by the use made of the increased capacity by local road users for short trips of less economic value.

The study’s author also concludes that the induced travel effect is accelerated by the availability of real time traffic information and “sat-nav”—satellite navigation—systems in cars.  The availability of additional space and (temporarily) faster routes, prompts these systems to route more vehicles onto the expanded roadway.

The study has several important implications.  First, the travel time “benefits” counted in benefit-cost studies used to justify widening projects are illusory; accurate benefit-cost analyses would show expanded roadways are a value-destroying proposition.  Second, you can’t alleviate congestion by providing more roadway capacity:  More roads simply prompt more driving, meaning more pollution and no reduction in congestion.

David Metz, “Digital navigation negates the economic benefits of road widening: The case of the M1 motorway,” Transportation Research Part A: Policy and Practice, Volume 174, 2023, 103749.
https://doi.org/10.1016/j.tra.2023.103749
https://ucl.scienceopen.com/hosted-document?doi=10.14324/111.444/ucloe.000034

(hat tip to @DavidZipper).

In the news

City Observatory Director Joe Cortright was nominated as one of Planetizen’s “most influential urbanists” for 2023.  You can see the entire list of nominees and register your votes here.

 

The Week Observed, September 15, 2023

What City Observatory did this week

This is what victory looks like.  Freeway fighting is hard, drawn-out work.  State DOTs and their allies have vast funding for public relations campaigns to sell giant projects; citizen activists work from a shoestring budget, and have to attend interminable meetings that are invariably organized by project proponents.  In general, freeway fighters lose every battle—except the last one.  When bad freeway projects die, it is with a whimper, rather than a bang.  Oregon DOT’s half billion dollar I-205 freeway widening project—a nationally recognized boondoggle—died in a brief bureaucratic footnote.

What this really signals is that Oregon DOT bent to the reality that it didn’t have the money to pay for this project, and had almost no likelihood of convincing anyone else to pay for it.  This same inescapable logic has already likely doomed the I-5 Rose Quarter project as well.

Must Read

Janette Sadik-Khan and Aaron Gordon on congestion pricing.  It looks like congestion pricing is really, truly going to happen in New York City early next year.  While everyone agrees that this is a step in the right direction, two influential voices are raising concerns that the city isn’t doing enough to make sure this is a success and capitalize on the best opportunity in generations to reshape urban transportation.

Janette Sadik-Khan writes that the city isn’t thinking hard enough about how to creatively use the road space freed up by congestion pricing. Pricing will reduce traffic by an estimated 20 percent, but what will fill that vacuum?   Sadik-Khan argues:

You don’t want to launch a program that’s all stick and no carrot. If congestion pricing does in fact remove 20 percent of the traffic and you just wind up with underused car lanes, it’s an invitation for motorists to fill them up again. It will harm the program’s credibility irreparably.

Other places, like London, greatly increased bus service as they implemented pricing, both to accommodate more travelers and to show tangible benefits from pricing.

Aaron Gordon focuses on the fact that all of the net proceeds of congestion pricing will be plowed into transit capital improvements, with nothing dedicated to actually expanding transit service.

The money collected from congestion pricing tolls cannot, legally speaking, go to anything other than paying for those big, expensive capital projects. In other words, it can’t go towards paying for more frequent service on existing subway, bus, or rail lines.

Urban Density is a good thing.  Urbanists should run towards, rather than running from density.  Denser urban areas produce a range of environmental, social and economic benefits that can’t be realized in any other ways.

Writing at Vancouver’s Georgia Straight, Helen Lui points out:

Density brings public services, transit, parks, and amenities closer together. When we can walk our children to school or cycle to the nearby park, grocer, or restaurant, we reduce carbon pollutants, save money otherwise spent on cars, and get some exercise, too.

Too often, we end up apologizing for more density, rather than pointing out its important benefits.

Less parking = more affordability.  Many studies have conclusively shown that parking requirements drive up the cost of housing:  they make it more expensive to build new homes, limit the amount of housing that can be built in specific locations, and result in a smaller supply of housing, driving up overall rent levels.

As we would expect, reducing parking requirements lowers development costs and can be expected to reduce rents.  Writing at the Minnesota Reformer, Zak Yudhishthu charts the decline in parking built in new apartment projects after the relaxation of city parking requirements in Minneapolis.

The typical number of parking spaces per housing unit fell from about 1 prior to the reform to less than 0.75 spaces per unit today.

New Knowledge

Rents and Incomes.  This is actually a debunking of some new non-knowledge, specifically a social media meme that purports to show that rent increases have dramatically outstripped income growth.  It’s a compelling visual, but it’s not simply wrong; it’s an obvious and intentional lie.

The meme in question compares inflation-adjusted incomes to unadjusted rents.  The video purports to show that rents (unadjusted, in red) have risen, while incomes (adjusted for inflation) have declined.

It’s simply bogus.  We turn the microphone over to Economist Noah Smith.

This video is, of course, complete disinformation. The income line is adjusted for inflation, and the rent line isn’t. That makes it look like rent went up a lot more than income, where in reality they went up by a similar amount.

The reality is that rents and incomes track one another pretty closely over time.  Here are Smith’s data, via FRED, the Federal Reserve Bank of St. Louis repository of economic data.
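The apples-to-apples comparison Smith describes is a one-line fix: deflate both series by the same price index before comparing them. A minimal sketch, using made-up illustrative index numbers (not Smith’s actual FRED series):

```python
# Illustrative index numbers, base year = 100 -- NOT actual FRED data.
nominal_income = {2000: 100, 2010: 130, 2020: 170}
nominal_rent   = {2000: 100, 2010: 128, 2020: 168}
cpi            = {2000: 100, 2010: 125, 2020: 160}

def deflate(series, price_index):
    """Convert a nominal index series to real terms using a price index."""
    return {yr: 100 * v / price_index[yr] for yr, v in series.items()}

real_income = deflate(nominal_income, cpi)
real_rent = deflate(nominal_rent, cpi)

for yr in sorted(cpi):
    print(yr, round(real_income[yr], 1), round(real_rent[yr], 1))

# Comparing real_income against nominal_rent (the meme's mistake) makes rents
# appear to far outpace incomes; deflating both shows they track closely.
```

Plotting `real_income` against `nominal_rent` reproduces the meme’s misleading divergence; plotting both real series reproduces Smith’s corrected picture.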

The misleading “rent v. income” chart is just one of too many examples of how graphics can be used to lie in a seemingly precise and convincing way.  Helpfully, Noah Smith provides some very detailed advice on how to be a critical consumer of such charts, and avoid being sucked in by visual deceptions.   Caveat emptor!

Noah Smith, “How not to be fooled by viral charts.” (Substack).  September 13, 2023.  https://www.noahpinion.blog/p/how-not-to-be-fooled-by-viral-charts

 

The Week Observed, October 6, 2023

What City Observatory did this week

What if we regulated new car ownership the same way we do new housing?  Getting a building permit for a new house is difficult, expensive, and in some places, simply impossible.  In contrast, everywhere in the US, you can get a vehicle registration automatically, just for paying a prescribed fee.  We make it really hard to build more homes, but very easy to buy more cars.  Little wonder we have a housing shortage and massive unaffordability, and perceive chronic traffic congestion and parking shortages.

But it doesn’t have to be that way:  A recent story about Singapore caught our eye: In Singapore, you can’t even buy a car without a government issued “certificate”—and the number of certificates is fixed city wide. The government auctions a fixed number of certificates each year, and the price has risen to more than $100,000. This means that a Toyota Camry, which costs about $30,000 in the US, would cost a Singapore buyer about six times as much (including all taxes and fees).  Our housing and transportation problems are ones largely of our own making, reflecting the usually unexamined choices we’ve made about how we tax and regulate both.
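The rough arithmetic behind that “six times” figure is easy to check. A back-of-the-envelope sketch, assuming (for illustration only) that other Singapore taxes and fees add about $50,000 on top of the certificate; the certificate and US prices are the story’s approximate figures:

```python
# Approximate figures from the story; "other_taxes_and_fees" is an assumed
# illustration, not an official Singapore number.
us_price = 30_000               # approximate US price of a Toyota Camry
certificate = 100_000           # Certificate of Entitlement auction price
other_taxes_and_fees = 50_000   # assumed import duties and registration fees

singapore_total = us_price + certificate + other_taxes_and_fees
multiple = singapore_total / us_price
print(f"A Singapore buyer pays roughly {multiple:.0f}x the US price")
```

The certificate alone more than triples the car’s cost before any duties are counted, which is the policy’s whole point: the price of scarce road space is charged up front.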

Must Read

CalTrans whistleblower reveals highway widening bias.  On the surface, California has bold policies that emphasize reducing vehicle miles traveled.  But underneath the hood, the state’s DOT, CalTrans, is still a highway building and highway widening agency.  Jeanie Ward-Waller, the agency’s deputy director for planning, has been effectively fired, in retaliation, she argues, for calling out two deceptive and likely illegal steps the agency took to advance freeway widening projects that violate state and federal policies.  Ward-Waller charges that CalTrans took money for “pavement restoration” and used it to pay for widening a stretch of Interstate 80, and also broke the project up into a series of smaller projects to avoid a close environmental review.  These are tactics that other states, including Oregon, have used as well.  This is a particularly important case because it calls out the profound disparity between headline policies (we care about climate, and we’re going to reduce VMT) and actual bureaucratic performance (out of the limelight, we’ll divert funds, avoid environmental review, and keep widening highways the way we always have).  Kudos to Ward-Waller for having the courage, at great personal expense, to call out this outrage.

California is spending precious little of its transportation budget on projects that reduce climate pollution.  Hot on the heels of this whistleblower charge is a new report from the Natural Resources Defense Council taking a close look at how California spends its transportation dollars.  Again, despite high-minded policy declarations about climate priorities, NRDC finds that less than 20 percent of state transportation spending goes to climate-friendly investments.

Our analysis finds that California state agencies have only allocated 18.6% of available funds towards projects and programs that are helping curb Californians’ reliance on private automobiles by investing in projects and programs like bike lanes, sidewalks, electric buses, mass transit, regional rail systems and affordable housing.

The remaining 81.4% is allocated respectively towards maintaining (71.7%) and expanding (9.7%) the current system of roads and highways that contribute not only to climate pollution, but also unhealthy air, urban sprawl and endemic traffic fatalities.  NRDC’s number one recommendation reads like the Hippocratic Oath:  First, do no harm:

  1. Discontinue funding for projects that increase highway capacity and thus increase VMT and greenhouse gas emissions. Projects should be rescoped to achieve project goals in ways that align with the State’s climate goals.

New Knowledge

Single young adults fuel population growth in DC.  The DC Policy Center has an interesting new report charting key population changes in Washington.  In contrast to the simple (and largely wrong) “doom loop” narratives about cities, this study shows that dense, amenity-rich urban locations are still a powerful attraction for well-educated young adults.

The top-line census numbers for the District of Columbia appear to support the urban-doom loop scenario:  Between 2019 and 2022, DC’s population declined by 3,500 persons.  But while overall population totals declined slightly, there was an increase in the number of households,

. . . the number of households increased by 9.6 percent (27,994). Despite the uptick in households, during this period, the District has lost population: the city’s estimated population for 2021 was 668,791, a 5 percent decline from 2019.

The growth in the number of households was fueled by individuals living alone.  These single-person households tended to be younger, and more likely to be renters, than the overall population. As the report points out, while the number of multi-person households in the district was flat to slightly declining, there was a significant increase in the number of single-person households.  The most notable change was among nonfamily renters living alone; their number increased from about 75,000 in 2015 to more than 100,000 households in 2021.

What the report shows is the very nuanced picture of population change and housing markets in the District of Columbia.  Dense, urban living is obviously very attractive to many well-educated, single young adults.  This demographic can use their buying power to bid for apartments (indicated by the 25,000 household increase in renters living alone).  In effect, DC residents are consuming more housing, per person, than they did just five years ago.  This likely reflects a combination of rising incomes (among these residents), a growing demand for residential space (post-pandemic), and the continuing attraction of urban living.
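The “more housing per person” point can be checked against the report’s own figures. A sketch that back-calculates base-year totals from the quoted percentage changes (so the results are approximations, not the report’s exact counts):

```python
# Back-calculate base-year totals from the figures quoted above.
hh_growth = 27_994            # household increase (quoted, a 9.6% rise)
hh_2019 = hh_growth / 0.096   # implied 2019 household count
hh_2021 = hh_2019 + hh_growth

pop_2021 = 668_791            # estimated 2021 population (quoted)
pop_2019 = pop_2021 / 0.95    # 2021 was a 5 percent decline from 2019

# Average household size falls, i.e. more housing consumed per person.
print(f"persons per household, 2019: {pop_2019 / hh_2019:.2f}")
print(f"persons per household, 2021: {pop_2021 / hh_2021:.2f}")
```

Average household size falling from roughly 2.4 to roughly 2.1 in two years is exactly the pattern described: households growing faster than population.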

The growing demand for urban living among younger, single person households also signals the challenge to housing markets.  The number of households is increasing faster than the population.  As a result, a city has to grow its housing stock just to make sure that its population doesn’t decline.  As we noted a few weeks ago, careful studies of New York City show that the city lost tens of thousands of housing units due to the consolidation of smaller apartments into larger ones.  In the face of growing demand for urban living, cities have to move even faster to expand housing if they want to avoid population loss, displacement and declining affordability.

Bailey McConnel and Yesim Sayin, D.C.’s household growth is predominantly driven by singles aged 25 to 34, DC Policy Center, October 3, 2023

The Week Observed, October 20, 2023

Must Read

Portland:  Four Floors and Corner Stores–Upzoning for urban development and housing affordability.  A coalition of community, environmental and social justice groups is advocating for a YIMBY strategy for more housing in Portland’s close-in Eastside neighborhoods.  Like many US cities, Portland faces tight housing markets and affordability challenges.  This strategy aims at improving affordability by expanding supply, and doing so in the close-in urban neighborhoods on Portland’s Eastside, where housing demand is strong.


The coalition, led by Portland Neighbors Welcome, makes an explicit city-forward call to action.

Our vision is simple: it should be legal for any residential lot from roughly 12th to 60th, Fremont to Powell, to contribute to a thriving, mixed-income, mixed-use fabric of urban neighborhoods by allowing street-scale apartment buildings.

This heart of Portland, our Inner Eastside, can become a more equitable version of the Northwest Alphabet District: a dynamic, walkable, mixed-income neighborhood with a mix of mid-sized apartment buildings, single-family homes, and every type in between, well-served by transit, and with commercial centers, corner stores, and shared neighborhood spaces.

The strategy draws an analogy to Northwest Portland’s dense and popular “Alphabet” district, a traditional neighborhood on the city’s westside that has a robust mix of apartments, single family homes, shops and businesses.  As we’ve pointed out at City Observatory, for most of the past century, city zoning codes have made this kind of neighborhood illegal to build.  The Portland Neighbors Welcome proposal would make it possible to build more of the kinds of walkable, bikeable, transit-served places that people want to live in.  It’s a smart way to tackle climate change, equity, and affordability.

More housing, lower rents.  There’s powerful evidence in the current housing market that increased local housing supply results in lower rates of rental inflation.  Carl Whitaker, research director and economist for RealPage, has tabulated the change in apartment rents and apartment completions for sub-metropolitan markets across the United States.  Over the past year, those areas with the fastest local growth in housing supply have seen actual rent declines.  Conversely, places with little or no growth in apartment completions have seen, on average, the highest rent increases.

Submarkets (parts of metropolitan areas) with increases in inventory growth (number of rental apartments) of 10 percent or more have seen, on average, about 3 percent year-over-year declines in rents.  Meanwhile, in submarkets where inventory growth is below 2.5 percent, rents have risen by about 1 percent.  As Whitaker notes, the same pattern holds across metropolitan areas, as well as sub-markets:

There’s a remarkably clear relationship between supply levels and rent growth. In the 10 large metro areas with the deepest rent cuts, the average rate of apartment supply expansion was nearly double the U.S. average. In the markets with largest rent increases, on the other hand, the average rate of supply expansion came in below the national average.

The outlook for the coming year is favorable:  a surge of new apartments is in the construction pipeline, which is likely to moderate rent growth.  Notably, the relative glut of new completions follows on the rent increases experienced in 2021, and illustrates the challenges of “temporal mismatch” in housing markets.  New housing, especially apartments, tends to get built in spurts, but tends to lag changes in demand and rents.  Hence we observe rental inflation (when a surge in demand hits a relatively fixed stock of housing), and then declines in rental inflation (or actual rent declines) as new units are delivered a year or more later.  So, for the next year or so, rents may moderate as new apartments are completed.  But after that, the future trajectory of rent increases will hinge on further increases in housing supply.

A small—but likely Pyrrhic—victory for freeway fighters in Austin.  The Texas Department of Transportation (TXDOT) is looking to foist its multi-billion dollar I-35 widening project on Austin, doubling down on a freeway that has divided the community and promoted automobile dependence for decades.  By a 7-3 vote, the Austin City Council voted to ask TXDOT to delay funding the project.  Despite the city’s concerns, it seems likely that TXDOT will move ahead with the project regardless.  Even advocates of the resolution conceded that freeway groundbreaking, scheduled for 2024, is “inevitable.”

City Councilors seem resigned to the project moving forward, and are somewhat forlornly hoping to “mitigate” its negative effects.  To add insult to injury, a touted highway “cap” will happen only if the City of Austin comes up with half a billion dollars or more—TXDOT won’t pay for it.  Cities that let their state highway departments make huge and irreversible infrastructure decisions will get more sprawl, pollution and auto-dependence.   They’ll find it hard to maintain or cultivate the kind of vital urban spaces they desire.

The Week Observed, October 13, 2023

What City Observatory did this week

Britain’s Caste system of transportation.  In a cynical ploy to revive the Conservative Party’s flagging electoral hopes, Prime Minister Rishi Sunak has engaged in some blatant pro-motorist posturing.

Saying that drivers “feel under attack” Sunak has declared that Britain is “a nation of drivers.”  Sunak’s claim highlights what we’ve long argued at City Observatory:  that in a nation of drivers, those who walk, cycle and take transit are effectively members of a lower caste, not full citizens.  As egregious as Sunak’s claims about “a nation of drivers” are, they may serve the useful purpose of laying bare the hitherto unspoken discrimination embedded in current transportation policies.

Must Read

The Empire Strikes Back:  How state highway departments are co-opting and perverting the “Reconnecting Communities” program.  Much has been made of the $1 billion included in the Bipartisan Infrastructure Law to mitigate the damage done to urban neighborhoods by freeway construction.  The $1 billion was effectively a pittance, especially compared with the continued generous funding for even more destructive highway projects, but even that small amount of funding is being twisted by state highway departments to create projects that double-down on harmful roadways rather than actually restoring damaged neighborhoods.

Writing at Streetsblog, Adam Paul Susaneck, author of Segregation by Design—which publishes compelling  illustrations of how freeway construction decimated cities across the country—points out that “freeway covers” are routinely used as cover—for widening highways.

While putting a highway underground can seem an intuitive solution to reconnect the communities above, because these same mid-century standards are still embedded in nationwide transportation policy, the reality is that such projects often become, at best, a form of “greenwashing” and, at worst, Trojan horses for further highway expansion.

In Portland, Brooklyn, Austin, and St. Paul, highway departments use covers as a mask for highway widening. Expanding highway capacity, even when it’s partially concealed by a few blocks of freeway cover, fails to fix the devastatingly negative effects of highways on the urban landscape:  It’s the flow of cars and pollution that devastates communities, not the insufficiently aesthetic appearance of uncovered freeways.

In Michigan, that state’s department of transportation is looking to turn a little-used one-mile stretch of I-375 into a boulevard, but in the process is creating a roadway that will be even more hostile to pedestrians than the depressed freeway it replaces, and just as divisive for the community.  The boulevard will be six to nine lanes wide, with a median barrier—included because pedestrians won’t be allowed to cross both directions during one signal phase, and to protect traffic flow and shorten travel times.   As local resident and architecture professor Bryan Boyer notes,

“Instead of reconnecting communities, it is going to make a bigger gulf between downtown and the eastside. Instead of being safer for people who are walking or biking, particularly in the east-west direction, it’s going to be more dangerous.”

Anika Goss of Detroit Future City argues:

“It would be disingenuous for us to build a freeway over the previous freeway and call it reparative because we put up a historical marker.”

MDOT officials respond to these criticisms with the frequently used dodge that the project is “only 30 percent designed”—which seems to imply that there’s plenty of opportunity to change things.  But the reality is that the agency is locked in on a six- to nine-lane roadway, and the agency is prioritizing traffic flow over all other considerations.

Speaking out on behalf of a whistleblower.  As we reported last week, the California Department of Transportation has demoted Deputy Director Jeanie Ward-Waller after she filed a whistleblower complaint against the agency for violating its own rules and state law to fund highway widening projects. For too long, state highway departments have proffered greenwashed policies, while cynically pursuing projects that simply lead to more traffic and pollution.  This case is an opportunity to call out their duplicity.  America Walks has started a nationwide petition drive to challenge these egregious actions and support a moratorium on highway expansions in California.

A coalition of allies in California is now calling for Governor Newsom to investigate this malpractice and to pause highway expansion projects throughout the state until the investigation is completed.  The coalition has asked for national support in convincing Governor Newsom this is the right thing to do.

You can sign on at this link.

 

The Week Observed, March 15, 2024

What City Observatory did this week

Abandoning road pricing monkey-wrenches state transportation, traffic reduction and climate plans.  This week, Oregon Governor Tina Kotek terminated Oregon’s Regional Mobility Pricing Program, which would have imposed per mile fees on major Portland-area freeways.  The plan, approved by the legislature seven years ago, has been developed at a snail’s pace by the Oregon Department of Transportation, but as implementation neared (possibly in 2026) many politicians got cold feet.  Kotek’s order resolves a short-term political problem, but creates all kinds of headaches for transportation in Oregon.

State and regional transportation plans have counted on road pricing to raise funds, manage (and reduce) traffic congestion, and reduce vehicle miles traveled in order to achieve state greenhouse gas reduction goals.  Terminating pricing just made each of those problems worse:  ODOT’s badly managed budget—already reeling from massive cost-overruns on highway widening projects—is now even worse off.  Plans that called for reducing congestion with time-of-day pricing are also out the window—and congestion will grow worse.  Finally, both ODOT and Metro (Portland’s regional planning agency) made road pricing the foundation of their claims that they’d reduce per capita driving by more than a third, in order to achieve adopted state greenhouse gas reductions.  These financial, transportation, and climate plans are in a shambles now as a result.

Must Read

No accountability for transportation spending.  Politicians like to make sweeping claims about “investing in infrastructure” but as we all know, the devil is in the details:  who gets how much money for what projects.  In theory, the IIJA, the Infrastructure Investment and Jobs Act, aka the Bipartisan Infrastructure Law, was supposed to help, among other things, redress past injustices and forestall further climate change.  But it turns out that most of the money has been aspirated into the carburetor of the same old highway-dominated spending system.  Transportation for America, the national advocacy group, has invested considerable resources in just trying to track down how much money went where, and for what.  They’ve even enlisted artificial intelligence to study the question.  As Corrigan Salerno reports, the results are far from transparent.  It’s simply impossible to tell where the money went.  While Transportation for America was able to track some federal funds, it wasn’t able to follow all the money passed through to state transportation agencies:

. . . state-funded projects are not tracked in any central location. There are billions of dollars that this analysis cannot account for, programmed away in over 50 different formats within their State Transportation Improvement Programs (STIPs) that spell out state spending plans across the country.

Without details on how funds were spent, it’s virtually impossible to hold anyone accountable for results.

And even when projects ostensibly serve these climate, safety and social justice purposes, they may just be the same old highway widening projects dressed up with woke-washing and token multi-modal features, like the $450 million in “reconnecting communities” funds awarded to the Oregon Department of Transportation for a project that doubles the width of the I-5 freeway that decimated Portland’s historically African-American neighborhood.

Partisanship and Covid, revisited.  Brad Delong points us to the New York Times analysis of Covid death rates by county.  The pattern is striking.  While Covid-19 deaths were initially more prevalent in “blue” counties, the pattern has shifted dramatically over time, with deaths being persistently higher in “red” counties.  This shift was especially strong after the availability of the Covid vaccines.

As DeLong argues, the data suggest that having neighbors who vote for Donald Trump is dangerous to your health.  DeLong calculates:

“in a county where the margin for Trump vs. Biden rises by an extra 1%-point…and your chances of dying from the Covid Plague go up by 0.0075%-point: That means (with an election turnout of half the population) that you only need an additional 67 excess Trump voters in a county to kill one additional person from the Covid Plague.”
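DeLong’s back-of-the-envelope figure checks out. Assuming turnout is half the county population (as he states), a 1-percentage-point rise in the Trump margin corresponds to excess Trump voters equal to 1 percent of voters; dividing by the associated rise in Covid deaths gives roughly 67 voters per additional death:

```python
# Reproduce the arithmetic in DeLong's quote. The county size is arbitrary;
# the ratio is scale-free.
county_pop = 100_000
turnout = 0.5 * county_pop            # stated assumption: half the population votes

margin_shift = 0.01                   # +1 percentage point Trump margin
excess_trump_voters = margin_shift * turnout   # 1% of voters

death_rate_shift = 0.000075           # +0.0075 percentage points of population
extra_deaths = death_rate_shift * county_pop

voters_per_death = excess_trump_voters / extra_deaths
print(round(voters_per_death))        # → 67
```

The 67 follows directly from the two quoted rates; nothing else in the calculation is free to vary.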

 

In the News

Oregon Public Broadcasting included Joe Cortright’s analysis of the demise of road pricing in its story “A plan to toll Portland highways is dead.”

 

 

The Week Observed, March 8, 2024

What City Observatory did this week

A yawning chasm in neighborhood distress among metro areas. Almost every metropolitan area has some neighborhoods that face serious economic distress, but the patterns of distress vary widely across the nation. We use data from the Economic Innovation Group’s latest distressed communities index to identify clusters of contiguous zip codes that are classified as distressed. We find that there is a more than order-of-magnitude difference in the degree of concentrated distress among metro areas. The hardest hit metros have 20 percent or more of their population living in contiguous clusters of distressed zip codes. But nearly as many metros have almost no concentrated distress, with less than 2 percent of their population living in contiguous distressed zip codes. These data are a stark reminder that neighborhood-level distress is a big problem in some metropolitan areas, but a much, much smaller problem in others. There’s powerful evidence that large and persistent concentrations of poverty have multi-generational effects. But in many metropolitan areas, economic distress is much more isolated, and is more about human capital than about geography.

Must Read

Housing markets explained using Legos.  Governor Kotek patiently explains housing markets in terms anyone can understand. Housing economics often seem like a complex and daunting subject, but Oregon Governor Tina Kotek has a new video that succinctly explains housing affordability, the causes of displacement and how to solve both.

 

Using Legos, Kotek demonstrates that when more people are seeking housing than there are available homes, prices and rents get bid up, and those with the least resources are displaced. The solution is to add more homes, and we can do so through a variety of means: single family residences, duplexes, townhomes and apartments.  For the record, City Observatory has some serious reservations about one part of the Governor’s plan, expanding urban growth boundaries around the state:  the lack of housing at the urban periphery is not Oregon’s housing supply problem, and more sprawl poses a wide range of affordability, environmental and fiscal problems. That caveat notwithstanding, we applaud the Governor’s clear exposition of the importance of expanding supply, which can be done, most importantly, by allowing more housing on land already designated and serviced for urban densities.

Hurray for HLA.  This past week, voters in LA overwhelmingly approved Proposition HLA, a measure requiring the city to implement plans to add bike and pedestrian improvements to city streets as they are rebuilt. Like many cities, LA has ambitious plans to add bike lanes and sidewalks, but these tend to get a very low priority (compared to maintaining and fixing stroads). Voters approved the measure notwithstanding a troubling campaign by local firefighters unions, which claimed the measure would increase response times.

Voters didn’t buy these scare tactics, and voted almost two-to-one in favor of Measure HLA. It’s a reminder that parochial concerns of a few (usually parking spaces, or slower speeds) are much less important than addressing the profound safety deficiencies of our existing roads.

As active transportation advocates know, firefighters often claim that traffic calming measures like diverters, narrower lanes, traffic circles and bike lanes will somehow impede their over-sized vehicles. Not only is there no evidence that this is the case, but all these improvements directly reduce the number and severity of road injuries—and in virtually every community injuries and deaths in car crashes vastly outnumber the human cost of fires. Firefighters ought to know this—they respond to car crashes far more often than to building fires.

Your small apartment is really as big as your city. There have been social media stories rediscovering that Tokyo has many small apartments—as small as 300 square feet. To some, such small apartments are emblematic of a terrifying prospect of cramped living. But as Noah Smith—who lived in such a Tokyo apartment for a year—explains, the size of the apartment is really secondary to the urban amenities that surround it. If you’re young, single, and not wealthy, being able to afford a place of your own that puts you in the middle of shops, restaurants, parks, jobs, and people who share your interests makes 300 square feet an abundance of space. You spend most of your time elsewhere.

As Smith notes there’s a contrast between where we spend the bulk of our time in the US compared to Japan:

America is a place where you live INSIDE—inside your house, your car, your office, or maybe a mall.
Japan is a place where you live OUTSIDE, in public spaces that are made to provide you with both adventure and comfort.

And no one is “forced” to live in a small apartment: these micro-apartments provide choices to those who wouldn’t otherwise have them. For too long, housing policy in the US has decided that smaller and shared living arrangements, like single room occupancy buildings, were “substandard” and made them illegal—without providing affordable alternatives. A small apartment one can afford is preferable to living on the street, or being forced to live far from the opportunities, amenities and people who live in your city.

Ikea spokes-shark Blahaj explains the merits of tiny homes.

New Knowledge

City Density Gradients

One of the key measures of urbanism is density. As economist Ed Glaeser says, cities are the absence of space between people. Jonathan Nolan has created a great new tool for visualizing the differences in density across cities and metropolitan areas world-wide. The tool features a density gradient—a line representing the average density of each city based on the radial distance from the center of the region. As a rule, densities tend to be highest in the urban center, and then taper off the further one moves away from the center.

CityDensity.Com allows you to observe the density gradient for most of the large cities around the world, and compare different metropolitan areas. Its charts are instructive in a couple of ways. First, they illustrate the absolute differences in density across metro areas. US metro areas are uniformly less dense than their counterparts in most of the rest of the world. Second, it’s useful to compare the slope of the density gradient across cities. Some places are highly peaked, with high levels of density in the center; while others tend to be much more uniform and “flatter.”

City Density lets you look at overall population density, population-weighted density (the density experienced by the average person), and density measures that exclude bodies of water.
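The difference between overall density and population-weighted density can be sketched in a few lines of code. This is a minimal illustration with invented figures for a stylized three-ring city, not CityDensity.com’s actual methodology: overall density divides total population by total land area, while population-weighted density averages each ring’s density weighted by how many people live there—so it reflects the density the typical resident actually experiences.

```python
# Sketch: overall vs. population-weighted density for a stylized city.
# Each ring is (population, land_area_km2); all figures are invented.
rings = [
    (500_000, 50),    # dense core: 10,000 people/km2
    (400_000, 200),   # inner suburbs: 2,000 people/km2
    (300_000, 600),   # outer suburbs: 500 people/km2
]

def overall_density(rings):
    """Total population divided by total land area."""
    pop = sum(p for p, _ in rings)
    area = sum(a for _, a in rings)
    return pop / area

def weighted_density(rings):
    """Density experienced by the average resident: each ring's
    density, weighted by the share of people living in that ring."""
    pop = sum(p for p, _ in rings)
    return sum(p * (p / a) for p, a in rings) / pop

print(round(overall_density(rings)))   # region-wide average, people/km2
print(round(weighted_density(rings)))  # higher: most residents live in the denser rings
```

Because most residents of this stylized city live in the dense core, the weighted measure comes out several times higher than the simple region-wide average—which is why a sprawling metro and a compact one can have similar overall densities but feel completely different on the ground.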

The charts show just how different some of the world’s great cities are from typical American cities. There are more people, and in particular, many more people close to the center. Here’s a chart showing Paris, New York and Houston.  In their centers, New York and Paris have high levels of density; Houston, like many US metros, is basically flat, with central density levels basically no higher than at the periphery.

The City-Density charts also serve as an informative counterpart to one of urbanist social media’s favorite tropes: superimposing the rail network from some livable European city with a similarly sized but car-dependent American counterpart. Why can’t we, with about the same population, support a similar rail network, these comparisons ask? The density charts provide a compelling answer: cities with great rail systems also have high levels of density; car dependent sprawl is difficult or impossible to serve economically.

Rail Networks:  Madrid and Atlanta

The implication of the map posted to Reddit (below) is that Madrid and Atlanta have roughly similar populations, but that Atlanta has a tragically sparse rail network.  Why couldn’t it be more like Madrid, which is covered in a dense mesh of rails?

Population Density

Comparing the population densities of the two regions provides a good explanation:  Madrid has high levels of density in the center, not unlike New York and Paris, while Atlanta is flat, and less dense than Madrid even 20 miles (30 kilometers) from its center.

Population Density:  Madrid and Atlanta

To be sure, there’s a chicken-and-egg problem here: Density makes a strong rail network viable, and a robust rail network makes density more attractive. Too many US metro areas are stuck in a place where it’s difficult to build rail networks and denser housing.

 

The Week Observed, March 1, 2024

What City Observatory Did this Week

Is it time to address the problem of “Missing Massive” housing?  This past week marked the latest convening of YIMBYTown, this year, held in Austin, Texas.  One of the perennial topics was state strategies to promote “missing middle” housing—as evidenced by multiple initiatives to allow duplexes, triplexes and four-plexes in what have been exclusively single family zones.  There’s real merit to missing middle reform, but the housing supply and affordability problem likely requires a much larger scale solution.  Alex Armlovich coined the term “Missing Massive” to highlight the need to build taller and denser in those transit-served, amenity-rich, opportunity-proximate places where demand supports more housing.  As we’ve pointed out at City Observatory, by definition, taller, higher density buildings require that less land be redeveloped to produce any given number of new housing units.

“Missing middle” is a good place to start, and is a great way to add some density, and improve the range of housing choices in many neighborhoods.  But if we’re really going to tackle the housing affordability problem at scale, it will likely require some discontinuous leaps forward in key locations.  Housing policy should fill the voids for both missing middle and missing massive housing.

Must Read

“Safe systems” is being sabotaged.  This is a must, must read.  Kea Wilson, writing at Streetsblog, steps back from day-to-day reporting to offer a powerful perspective on ways in which the “safe systems” approach to road safety is being diluted and distorted by the dominant paradigm of car-centric planning.  In principle, safe systems is a well-ordered set of priorities for changing how we approach the construction and management of roads.  In practice, most state transportation departments and their local partners mouth the words, but haven’t changed their policies or investments.

And, as Wilson points out, the rhetoric of safe streets is being appropriated and bastardized.  Just as the term “multi-modal” is glibly used by highway departments to greenwash a multi-billion dollar highway widening project that includes a token pedestrian path, the term “shared responsibility” is used to evade responsibility.

The trouble with “shared responsibility,” of course, is that it says nothing about our respective ability to save lives — or how that “responsibility” should be shared relative to our respective power and influence.

Highway agencies still overwhelmingly treat safety as an “education” or “awareness” challenge, to be treated with feeble and often insulting, victim-blaming advertising and slogans.  And meanwhile, the real resources flow to making cars move faster, not making roads safer.  As Wilson argues:

It’s great — if painfully overdue — that America has finally realized that traffic violence is a systemic problem with systemic solutions. If we can’t advance the conversation beyond that astonishingly low bar, though, the Safe Systems approach will become just another way for powerful people to keep passing the buck to the “safe road users” piece of the puzzle — and the powerless will keep unsuccessfully shouldering the weight of a “responsibility” that they never had any hope of carrying in the first place.

If you want to move the safety conversation forward, please read Kea Wilson’s excellent essay.

How the infrastructure bill is aggravating climate change.  The Infrastructure Investment and Jobs Act (aka the Bipartisan Infrastructure Law) represented a major federal commitment to boost the nation’s capital spending.  The law talked about opportunities to use these funds to help deal with pressing problems like climate change, but left actual decisions about where to spend the money largely up to the states.  A new analysis from Transportation for America shows that a huge share of these funds have been plowed into subsidizing highway expansion projects.

Transportation for America reports:

Instead of using the historic funding levels to give people alternatives to congestion, pollution, and car dependency, our analysis finds that states have designated over $33 billion in federal dollars (over 25 percent of analyzed funds) toward projects that expand road capacity, doubling down on a strategy that has failed time and time again.

This problem is compounded by a relatively slow roll out of spending for transit, compared to a very aggressive rate of spending on road projects:  about 64 percent of highway formula funds have already been allocated by states, while only 20 percent of transit funds have gone out.  Vesting discretion on how to spend infrastructure funds with agencies that are either in denial about climate change, or unwilling to even measure and report their carbon footprint, has produced predictable results:  more subsidies to driving that will only lead to more traffic, more congestion, and more greenhouse gas emissions.

State land use reform as a critical climate strategy.  Climate change strategies lean heavily on the “technical fix”—the idea that new technologies will eliminate the need for any of us to live our lives any differently.  A new report from the Rocky Mountain Institute argues that changing state land use policies can contribute both to achieving greenhouse gas reductions, and addressing fundamental issues of housing affordability and equity.

RMI analysis shows enacting state-level land use reform to encourage compact development can reduce annual US pollution by 70 million tons of carbon dioxide equivalent in 2033. This projection, based on 2023 data, underscores the potential for significant impact within a decade. It would deliver more climate impact than half the country adopting California’s ambitious commitment to 100% zero-emission passenger vehicle sales by 2035.

The key policies that would support compact development hinge on more housing in places where it is easier to live without a car.    RMI calls for  ending exclusionary zoning; deregulating and pricing parking; eliminating minimum lot sizes, unit sizes, and setback requirements; legalizing accessory dwelling units (ADUs); and building permitting reform.  Housing in dense neighborhoods addresses the affordability problem in two ways:  it tends to drive down rents, and also lowers the amount residents have to spend on cars and gasoline to meet basic transport needs.  As RMI says, reforming land use to encourage more compact development is a win-win for affordability and climate.

Anti-trust law and local food access.  The Federal Trade Commission has come out in opposition to the proposed merger of two of the nation’s largest grocers, Kroger and Albertsons.  As NPR reports, the FTC is concerned the merger would be bad for consumers and workers, increasing the market power of the combined company to set prices and limit wage growth.  But there’s another wrinkle as well:  the two chains have overlapping stores—some across the street from one another—in many metro areas.  The merged company plans to achieve “efficiencies” by consolidating stores in these markets, which means less choice and less accessibility to consumers.

Kroger (blue) & Albertson (yellow) markets (Business Insider)

In all, Kroger and Albertsons have said they will sell off 650 stores to other chains, but it’s far from clear how viable those spun-off stores will be in the long term.  After an earlier merger with Safeway, Albertsons sold off a raft of stores to competitors; many of those stores proved economically unviable, and Albertsons ended up buying many of them back.  Anything that reduces the total number of grocery stores, and lessens competition in this sector, is likely to hurt consumers, who will have to travel further, and have fewer choices.

 

 

The Week Observed, February 16, 2024

Must Read

The freeway cap mirage.  Don’t like freeways?  Let’s just cover up the problem.  It’s increasingly popular to try to repair the damage done to urban neighborhoods by “capping” freeways:  building a cover so that the road is less visible.  While that’s widely seen as an improvement, some are pushing back that it’s really a band-aid on a gaping wound that doesn’t actually solve the problem.  Writing at Bloomberg Cities, Benjamin Schneider describes growing opposition to a proposed freeway cover over the Kensington Expressway in Buffalo.  Local activists are suing to stop the project, which papers over the problem, but doesn’t solve it:

But instead of embracing the plan to heal the neighborhood, Ladiana and her husband Terrence Robinson sued the New York State Department of Transportation in an effort to stop it. Transforming the open freeway trench adjacent to their home into green space won’t solve their neighborhood’s problems, the pair say. In fact, it could just make some of them worse.

“They’re not following all of the climate change laws,” says Ladiana. “They’re doing nothing to reduce air pollution. What they’re doing is just pushing all of the pollution out of each side.”

The project has won a $55 million grant from the Biden Administration’s Reconnecting Communities program, but that covers only a tiny fraction of the project’s billion dollar cost.  There are real questions as to whether spending that vast sum on covering less than a mile of freeway is a good investment.

Federal government to stop funding big highway projects . . . in Canada. Federal Environment Minister Stephen Guilbeault announced that the federal government will stop funding large new road projects.  According to the CBC:

Guilbeault said Monday the federal government will be there to support provinces paying for maintenance but Ottawa has decided that existing road infrastructure “is perfectly adequate to respond to the needs we have. There will be no more envelopes from the federal government to enlarge the road network,” Guilbeault said, according to quotes published in the Montreal Gazette. “We can very well achieve our goals of economic, social and human development without more enlargement of the road network.”

The CBC reported that the announcement was greeted with some pushback from pro-highway groups, prompting the Minister to clarify that the ban doesn’t apply to all road projects, just large ones.  Canada’s shift follows a similar declaration last year by the government of Wales.

Surging apartment completions are driving down rents.  We’re getting an object lesson in supply, demand and prices in the housing markets of many metropolitan areas: many new apartments are being completed, and the surge in supply is driving down rents.  While this is a good thing for renters, the picture looks different to landlords, and the business press characterizes the decline in rents as “an onslaught of distress.”  Business Insider reports:

Pricing power across apartment markets in Texas has slipped just as thousands of new units are coming online, sparking concerns that conditions are ripe for an onslaught of distress.

“Pricing power” in this case means the ability of landlords to raise rents.  A study from real estate firm Avison Young looks at 25 large markets across the US and finds that rent growth is weakest in those markets where supply growth has been the strongest.  In Austin, where the number of apartments under construction is highest, rents are down the most.

The strong correlation between higher construction and lower rents is an encouraging sign that we can do a lot to improve housing affordability by building more housing.

In the News

A climate challenge to Portland’s Regional Transportation Plan:  The Portland Oregonian highlighted a legal challenge by No More Freeways against the regional transportation plan adopted by Portland’s Metro regional government. No More Freeways argues that the plan, which allocates billions of dollars for freeway expansion projects, violates state rules requiring Metro to plan for a 35 percent reduction in per capita driving over the next two decades.

Bike Portland quotes City Observatory’s analysis of the flaws in the arcane highway cost allocation formula.  As the state has cut back on maintenance and spent more on highway widening, it’s created an apparently illegal imbalance in the costs allocated to cars and trucks.

 

The Week Observed, February 9, 2024

What City Observatory did this week

Three big flaws in ODOT’s Highway Cost Allocation Study.  Some of the most important policy decisions are buried deep in seemingly technocratic documents.  Case-in-point:  Oregon’s Highway Cost Allocation Study.  The state’s truckers are using the latest report to claim that they’re being overcharged, but the real story is very different.  We’ve unearthed three big flaws in the report:

  • The imbalance between cars and trucks seems to stem largely from the Oregon Department of Transportation’s decision to slash maintenance and preservation, and spend more widening highways. ODOT could largely fix this “imbalance” by spending more fixing roads, and less on widening them.
  • ODOT has illegally included federal funds in its cost allocation study; the state’s law and constitution apply only to state funds.
  • ODOT has gone out of its way to scapegoat bike and pedestrian projects, which are mostly paid for with federal funds—that aren’t even properly included in the allocation study.

In addition, the highway cost allocation study leaves out huge social, environmental and fiscal costs that cars and trucks impose on society and on the state. In Oregon, like other states, cars and trucks are increasingly getting subsidized and not paying their own way, something you’ll never find out reading this report.

Must Read

California’s Freeway and Climate Collision:  The LA Times has a terrific article summarizing a spate of freeway and climate news from the Golden State.  The California Transportation Commission (CTC) just voted to approve I-15 Express Lanes that would add capacity to that freeway, mostly to handle truck traffic.  The approval vote reversed a short-handed tie vote in December that temporarily put the project on hold.

The CTC voted to restrict even its commission members to just two minutes of comments, apparently in an effort to block Commissioner Joe Lyou from presenting detailed information showing the project violates state and federal environmental laws.  Lyou’s slides show that state transportation officials argued both that the project would produce no additional truck traffic (exempting it from an air quality review) and that the project would generate 2 million more truck trips annually (to justify economic development funding).  The CTC approved the project in spite of the clear contradiction.  It’s symptomatic of a bigger problem in California—and elsewhere—state policies profess a commitment to reducing pollution and greenhouse gases, but state officials routinely vote to widen highways.  And there’s a coda to the I-15 debate: California’s Assembly Speaker replaced Lyou on the CTC with a new member, a former car dealer.

America Walks spearheaded a national sign-on letter calling for an end to highway expansion projects.  Nearly 200 organizations (including City Observatory) have signed on to a new policy letter calling on the US Department of Transportation to stop funding highway expansion projects. The letter makes the case for a new direction:

We call on our leaders in government to adopt a moratorium on expanding highways and a pause on existing projects until climate, equity, and maintenance goals are met. The highway system we have built in our country is unsustainable, both financially and environmentally, and disproportionately harms low-income and Black and brown communities. We need to remedy these problems with a responsible approach to transportation that centers on community.

The policy stresses four-part priorities to guide transportation investments going forward:  fix it first (prioritizing maintenance over expansion), safety over speed, make transit work and re-connect communities (by removing freeways or replacing them with boulevards).

US DOT rejects grant request to rebuild the Brooklyn Queens Expressway.  A hopeful sign this week from the US Transportation Department:  it rejected a request for $1 billion in federal funds to help rebuild the Brooklyn Queens Expressway, a classic Robert Moses project that blights Brooklyn, cutting it off from its waterfront.

In the News

Strong Towns tells a national audience about Portland’s freeway fight over the proposed $1.9 billion I-5 Rose Quarter project, a mile and a half long freeway widening being packaged as a community revitalization effort.

Bike Portland has an in-depth interview with City Observatory’s Joe Cortright exploring the problems with the arcane cost allocation study the state uses to judge the fairness of highway taxes.

The Week Observed, February 2, 2024

Must Read

How Caltrans cheated on its environmental reporting.  Some months back, former Deputy Director of Caltrans, Jeanie Ward-Waller, blew the whistle on the agency’s effort to evade environmental laws and illegally use maintenance funds to widen I-80 between Sacramento and Davis.  Now the Natural Resources Defense Council has laid out a strong case that the agency’s environmental review is plagued with errors and misrepresentations.

The Caltrans Environmental Impact Report for the I-80 project claims, implausibly, that the widening will result in less travel than leaving the freeway at its current 6-lane width.  As NRDC points out, that’s because the Caltrans model assumes that housing, jobs, and travel patterns will be essentially the same in the “Build” and “No-Build” scenarios.  The NRDC analysis shows that Caltrans assumed a “build” level of traffic between Sacramento and Davis, regardless of whether the freeway was widened or not—and its traffic models dutifully illustrated roads clogged to capacity, and predicted commuters would take circuitous detours (thus increasing vehicle miles traveled and pollution).

Essentially, Caltrans argued there’d be “induced” demand whether the freeway was built or not.  But that’s the opposite of what we know about induced travel:  the additional trips (and more sprawling development) occur if the freeway is built, but not if it isn’t.  As Carter Rubin of NRDC writes,

Caltrans did not fully disclose and adequately analyze the Project’s impacts. Because the DEIR relied on flawed modeling, Caltrans arrives at erroneous conclusions about traffic impacts, greenhouse gas emissions, air quality, and energy impacts. Next, when it comes to estimating how much additional driving the project will cause, Caltrans still underestimates the impact. The DEIR also fails to adequately measure the additional car-centric sprawl growth that the project will cause.

Exaggerating the volume of traffic in the “No-Build” scenario is a favored DOT tactic: it makes the traffic problem look worse, and falsely creates the impression that traffic and pollution will be less if you build a project. As a growing body of scientific evidence shows, that’s exactly backwards:  more highway capacity generates more travel, more congestion and more pollution.

The “clean cars” shortfall. Across the world, climate strategies are pinning their hopes on a “technical fix” for automobile greenhouse gas emissions. Legislating cleaner cars and cleaner fuels is supposed to reduce greenhouse gases from driving.  A new report from the European Union’s Court of Auditors, reported by the Brussels Times, shows that the real world reductions in carbon emissions from “cleaner” cars are far less than claimed based on lab results.  The report concludes that EU CO2 emissions reduction targets for new passenger cars are unlikely to be achieved because the CO2 emissions measured in manufacturers’ laboratory tests do not reflect reality. Real-world emissions from conventional cars have not fallen.

Perhaps the most alarming finding is the dirtier performance of hybrid vehicles.  In theory, plug-in hybrids ought to rely mostly on clean(er) charging, but in practice they get much more energy from the internal combustion portion of the hybrid-powertrain.  The report concludes:

The combustion engine in hybrid cars is used more frequently than expected, in particular for company-owned plug-in hybrids, the auditors explained. . . . “We don’t consider hybrid cars to be low-emission cars and the designation should end from 2025,” the audit team said. “In reality, they don’t pollute less than combustion engine cars and we foresee a phase-out of hybrid cars.”

Several states are counting heavily on hybrid cars to reduce emissions.  This report suggests hybrids may do much less than hoped for in reducing greenhouse gas emissions.

New Knowledge

The high cost of crashes.  Road crashes cost Americans more than $340 billion in 2019, according to a new report from the National Highway Traffic Safety Administration.  That works out to more than $1,000 per person per year, and represents about one and a half percent of the nation’s gross domestic product.

The costs get spread to society in many ways—through insurance premiums, taxes, and the costs borne directly by injured people. Traffic crashes cost taxpayers $30 billion in 2019, roughly 9% of all motor vehicle crash costs. This is the equivalent of $230 in added taxes per household in the United States.
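A quick back-of-envelope check shows how the per-person and per-household figures follow from the report’s totals. The population, household, and GDP denominators below are approximate 2019 values we’ve supplied for illustration, not figures from the NHTSA report:

```python
# Back-of-envelope check of NHTSA's per-person and per-household crash cost figures.
total_cost = 340e9       # NHTSA estimate of total 2019 crash costs (dollars)
taxpayer_cost = 30e9     # portion borne by taxpayers
population = 328e6       # approximate 2019 US population (assumption)
households = 129e6       # approximate 2019 US households (assumption)
gdp = 21.4e12            # approximate 2019 US GDP (assumption)

per_person = total_cost / population            # a bit over $1,000 per person
per_household_tax = taxpayer_cost / households  # roughly $230 per household
share_of_gdp = total_cost / gdp                 # about 1.6 percent of GDP

print(f"${per_person:,.0f} per person")
print(f"${per_household_tax:,.0f} in taxes per household")
print(f"{share_of_gdp:.1%} of GDP")
```

The arithmetic lines up with the report’s headline claims: more than $1,000 per person, about $230 in taxes per household, and roughly one and a half percent of GDP.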

The report notes that there were more than 14 million motor vehicle crashes in 2019, which injured more than 4.5 million people, and killed 36,500.

A key risk factor is the amount of driving Americans do each year.  The more we drive, the more crashes, injuries and deaths.  As NHTSA notes,

. . . although the 2019 fatality rate per VMT is unchanged from 2010 the incidence of fatalities and injuries has increased, reflecting added driving exposure . .

Crashes are particularly devastating for pedestrians.  Pedestrians are far more likely to be seriously injured or killed in traffic crashes than automobile occupants.  Overall, pedestrians are about seven times more likely to be killed if involved in a crash.

Reducing driving and reducing car dependence can lower the economic, social and human costs of crashes.

Blincoe, L., Miller, T., Wang, J.-S., Swedler, D., Coughlin, T., Lawrence, B., Guo, F., Klauer, S., & Dingus, T. (2023, February). The economic and societal impact of motor vehicle crashes, 2019 (Revised) (Report No. DOT HS 813 403). National Highway Traffic Safety Administration.
https://www.nhtsa.gov/press-releases/traffic-crashes-cost-america-billions-2019?s=09

 

 

The Week Observed, January 5, 2024

What City Observatory did this week

A $9 billion Interstate Bridge Replacement Project?  Just 13 months after raising the price of the Interstate Bridge Replacement (IBR) project by more than 50 percent, the Oregon and Washington DOTs say it will cost even more.  We estimate project costs are likely to increase 20 percent or more, which would drive the price tag to as much as $9 billion, almost double the 2020 estimate.

While the DOTs blame “inflation,” their own estimates show construction cost disinflation, with expected increases of no more than 3.5 percent per year for the rest of the decade. The likely increase in costs will more than wipe out the $600 million in federal funds awarded to the project in December.  The cost of the IBR is increasing faster than the DOTs can find money to pay for it.

The “Cost Estimate Validation Process” (CEVP) that state DOTs implied would remedy future cost increases utterly failed. The use of lowballed construction cost estimates to sell highway megaprojects is part of a consistent pattern of “strategic misrepresentation.”  It’s the old bait-and-switch:  get the customer to commit to buying something with a falsely low price, and then raise the price later, when it’s too late to do anything about it.

ODOT has a consistent track record of lowballing pre-construction cost estimates, and recording huge cost overruns, with the average price of a major project doubling between pre-construction estimates and final costs.

Must Read

Housing is up, rents are down. Jay Parsons of Realpage reported  that 2023 was a banner year for apartment completions.  In 2023, the US completed nearly 440,000 new apartments, the highest number in about 35 years.

The surge in apartment construction has been a key factor in bringing down rental price inflation–something that will increasingly show up in the national inflation numbers in the coming year.  But real estate markets are local, and the big improvements in supply are producing the greatest rent relief in those markets that are building the most.  As Parsons writes,

. . . there remains a clear link between supply and rent change by market. Rents fell in 2023 across 40% of U.S. metro areas – and nearly all of those saw significant new supply entering the market. By comparison, nearly one-third of U.S. metro areas produced rent growth of 3% or more in 2023, and nearly all of them had little supply to work through.

The good news is that the surge in apartment completions is likely to continue through 2024, with more than 650,000 new apartments expected to be finished.  There’s clear evidence that expanding supply helps make housing more affordable.

Upzoning increases supply and lowers rents.  The principal policy tool that local governments wield to regulate housing supply—zoning—turns out to be a critical way to improve housing affordability.  Todd Litman summarizes the growing global literature showing how upzoning leads to more housing construction, and in turn, improves housing affordability.  Writing at Planetizen, Litman surveys a series of careful academic studies documenting the positive effects of upzoning.  One of the most striking examples comes from Auckland, New Zealand, which saw big increases in housing following its upzoning in 2016.

Whether the present surge in apartment construction continues, and whether it reaches the markets that need additional supply the most, will hinge in important ways, on whether local governments upzone for more supply.

Some state DOTs are climate deniers. More than 20 states are suing the US Department of Transportation over regulations that would require state Departments of Transportation—which receive billions of dollars of federal funds—to measure and report the greenhouse gas emissions associated with their transportation systems.  Streetsblog reports that this dirty dozen and a half don’t want to have to even discuss the fact that the systems that they run are now the single largest source of greenhouse gas emissions in the US.  As Transportation for America’s Beth Osborne says, the state DOTs are either dishonest or climate deniers (or both):

“They’re saying they don’t have the capacity to measure greenhouse gas emissions, and in the same [document], they’re saying they do; I’d like to know which it is,” Osborne said. “Do they have the information, and they could share it, but they just don’t want to? Or are they unable to determine the carbon impacts of their investments? … Because if they can’t, maybe we shouldn’t entrust them with this money in the first place — and give the money to the cities, which could.”

This lawsuit is just part of a pattern and practice in the transportation engineering industry to deny climate change:  As we’ve pointed out, they’re content to build us a highway to hell.

In the News

Streetsblog also pointed its readers to our critique of the flawed USDOT claim that the latest National Household Travel Survey shows a decline in travel.

Writing at Planetizen, Todd Litman cited our debunking of USDOT’s infographic purporting to show a decline in trip making, based on a flawed comparison of two very different household travel surveys.

 

The Week Observed, December 22, 2023

What City Observatory did this week

Bad data.  What appears, at first glance, to be a big decline in trip-making is really an object lesson in failing to read the footnotes.  Every five years or so, the US Department of Transportation produces the National Household Travel Survey (NHTS), which provides essential information about American travel patterns.  The latest data from the 2022 survey was just released, and the USDOT trumpeted its publication with a clever, but fundamentally flawed infographic.

US DOT claims that we went from making almost three and a half trips per person per day in 2017 to barely two trips per day in 2022. The problem is that the apparent decline in trip-making almost certainly has a lot more to do with a major change in the methodology of the NHTS since 2017.  The survey used to provide respondents with a “trip diary” to contemporaneously record daily travel.  The survey has since shifted to the web, and relies on after-the-fact recollections, which previous survey work has shown substantially under-report total trip making.  Unfortunately, that critical detail doesn’t appear in the infographic, which gives the inaccurate impression that one can directly compare the 2017 and 2022 data (You can’t).  The pandemic and work-from-home have undoubtedly triggered major changes in travel behavior.  It’s too bad the NHTS can’t tell us much about the size or permanence of these changes.

Must Read

Full disclosure:  Highway departments need to report their carbon footprint.  A key provision of the Inflation Reduction Act, which shovels hundreds of billions of dollars to states for infrastructure, is direction to try to reduce greenhouse gas emissions.  The US Department of Transportation has a new rule requiring state transportation departments to measure their greenhouse gases, and set goals for reducing emissions.  The goals are aspirational and toothless, but that hasn’t stopped a bunch of states from actively opposing the rule.

But as Kevin DeGood, of the Center for American Progress writes, you can’t solve a problem you don’t acknowledge:

The premise of the rule is simple: it’s nearly impossible to reduce something you don’t measure. Moreover, by measuring the greenhouse gas emissions from vehicles traveling on the National Highway System, states will be able to assess how different transportation projects and policies would affect climate emissions. Reporting this data to US DOT will allow for some much-needed public accountability.

What this amounts to, in reality, is simply climate change denial.  Highway agencies don’t want to admit that cars and driving are responsible for climate change, even though transportation is now the largest source of greenhouse gas emissions.

Where are the 15 minute neighborhoods?  Notwithstanding the hysterical pushback from the far right, most people instinctively understand the convenience and economy of living in a place where you can take care of most of your daily needs without being required to drive a car.  Fifteen-minute living is a planning goal for many, but it is actually already a feature of some of the most desirable neighborhoods in any city.  Geographer Nat Henry has a detailed analysis of Seattle neighborhoods, showing which ones offer 15 minute living.  Significantly, Henry’s work estimates the fraction of households in each neighborhood that can walk to common destinations within 15 minutes.  It’s all or nearly all residents in a handful of dense, central neighborhoods, and none or vanishingly few in more outlying places.

Henry’s analysis is both transparent and flexible:  It spells out the criteria used to determine what destinations are within walking distance, and allows the end user to select for other important destinations (like light rail stations and grocery stores, restaurants and parks).  Henry’s interactive application generates maps (like the one shown above) and can answer basic questions, like, how many residents live within 15 minutes of a supermarket. This is the kind of tool that can demystify the concept of 15 minute living–and hopefully, defuse the silly and extreme opposition.
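For readers curious about the mechanics, here is a minimal sketch of the kind of computation such a tool performs.  The coordinates, destinations, and walking speed below are hypothetical assumptions for illustration; a real analysis like Henry’s would use street-network distances, not straight lines.

```python
# Hypothetical sketch: what share of households is within a 15-minute walk
# of a destination?  Coordinates and the walking speed are invented for
# illustration; real tools use street-network routing, not straight lines.
from math import hypot

WALK_SPEED_M_PER_MIN = 80              # ~4.8 km/h, a common planning assumption
CUTOFF_M = 15 * WALK_SPEED_M_PER_MIN   # 1,200 meters reachable in 15 minutes

households = [(0, 0), (500, 200), (4000, 4000), (900, 100)]  # x, y in meters
supermarkets = [(300, 100), (2500, 2500)]

def within_15_min(home, destinations):
    """True if any destination is within straight-line walking range of home."""
    return any(hypot(home[0] - d[0], home[1] - d[1]) <= CUTOFF_M
               for d in destinations)

share = sum(within_15_min(h, supermarkets) for h in households) / len(households)
print(f"{share:.0%} of households within a 15-minute walk of a supermarket")
```

The same loop, run over every household and every destination category, yields exactly the kind of neighborhood-by-neighborhood access shares Henry’s interactive maps display.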

Sahm Rules.  At City Observatory, we mostly focus on urban economics.  But urban economies swim either with or against the tides of the larger macro-economy.  The rate of inflation, the unemployment rate and interest rates all play out directly in urban labor markets and housing markets.  As we emerged from the Covid pandemic, there were serious (but temporary) supply disruptions (coupled with opportunism) that triggered a spike in inflation.  Many economists, apparently drawing on the experience of the late 1970s, argued that a recession was either a necessary or desirable way to break an emerging inflationary spiral.  A handful of economists, including Claudia Sahm, took a contrary view, arguing that the surge in inflation was short-lived, not deep seated.

The experts who say “we need a recession” or big payroll numbers might be a “nightmare” are those who warned that the Rescue Plan would spark high inflation. Inflation did pick up a lot. They were right about the direction. But, while the Rescue Plan played some role, it is not the reason for the inflation. Covid and the war in Ukraine caused massive, ongoing disruptions once you accept that it’s time to back away from the Phillips Curve.

The record of the past year–with a steady decline in inflation to the 2 percent target set by the Fed, and continued job growth and record low unemployment–shows Sahm was right.  Engineering a “soft-landing” and avoiding a recession has been a blessing in 2023, and may set the stage for further improvement in the coming year.  If you want macroeconomic analysis that is incisive, clear, and for this year quite prescient, you should read Claudia Sahm’s work.

Happy Holidays—See you next year!

The Week Observed will return in January 2024.

The Week Observed, December 15, 2023

What City Observatory did this week

Exaggerated Benefits, Omitted Costs: The Interstate Bridge Boondoggle.  A $7.5 billion highway boondoggle doesn’t meet the basic test of cost-effectiveness.  The Interstate Bridge Project is a value-destroying proposition:  it costs more to build than it provides in economic benefits.

Federal law requires that highway projects be demonstrated to be “cost-effective” in order to qualify for funding.  The US Department of Transportation requires applicants to submit a “benefit-cost” analysis that shows that the economic benefits of a project exceed its costs. We take a close, critical look at the benefit-cost analysis prepared for the proposed Interstate Bridge Replacement (IBR) project between Portland and Vancouver.

City Observatory’s analysis of the Interstate Bridge Replacement Benefit-Cost Analysis (IBR BCA) shows that it is riddled with errors and unsubstantiated claims and systematically overstates potential benefits and understates actual costs.

  • It dramatically understates the actual cost of the project, both by mis-stating initial capital costs, and by omitting operation and maintenance and periodic capital costs and toll charges.
  • The construction period is under-estimated, which likely understates capital costs and overstates benefits.
  • The study also omits the toll charges paid by road users from its definition of project costs, in clear violation of federal benefit-cost guidelines.
  • The IBR BCA study also dramatically inflates estimated benefits.
  • It uses an incorrect occupancy estimate to inflate the number of travelers benefiting from the project.
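To see why these particular errors matter, here is a hypothetical sketch of how a benefit-cost ratio responds to them.  All dollar figures are invented for illustration; they are not the actual IBR BCA numbers.

```python
# Hypothetical illustration of how understated costs and inflated benefits
# can flip a benefit-cost ratio (BCR).  All figures are invented examples,
# not the actual IBR BCA numbers.

def bcr(benefits, capital_cost, om_cost=0.0, toll_cost=0.0):
    """Benefit-cost ratio: total benefits over all project costs."""
    return benefits / (capital_cost + om_cost + toll_cost)

# As reported: understated capital cost, omitted O&M and toll charges,
# benefits inflated (e.g., by an incorrect occupancy assumption).
reported = bcr(benefits=6.0, capital_cost=5.0)   # $billions, hypothetical

# Corrected: full capital cost, plus O&M and toll charges counted as costs,
# with benefits deflated to a plausible occupancy assumption.
corrected = bcr(benefits=4.5, capital_cost=6.5, om_cost=1.0, toll_cost=1.5)

print(f"reported BCR:  {reported:.2f}")   # above 1.0: looks "cost-effective"
print(f"corrected BCR: {corrected:.2f}")  # below 1.0: value-destroying
```

The point of the sketch is that a project only clears the federal cost-effectiveness bar when its BCR exceeds 1.0, so each omitted cost or inflated benefit pushes the ratio in the same favorable direction.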

Federal funding for the IBR project can’t go forward unless the benefit-cost analysis shows the project is cost-effective.  This report, which is replete with errors, omissions and highly questionable assumptions was prepared, not by some independent expert, but by WSP, a private consultant that holds more than $70 million in contracts to work on the IBR project—contracts that would very much be in jeopardy if the numbers in the benefit-cost study didn’t come out right.  But neither WSP nor the two state highway departments revealed this conflict of interest.

Must Read

Why is it so hard to get a red-light camera?  Angeleno (and sometime City Observatory contributor) Miriam Pinski describes her experience trying to get a red light camera for an intersection in her neighborhood—one that has already cost the lives of some of her neighbors.  Red light cameras have been shown to significantly improve safety by discouraging red-light running, but the city council has made it all but impossible to get one on her street.  Some object to red light cameras as a revenue raising scam, but cameras could easily be tuned to focus entirely on safety. As Pinski relates:

Many of the problems with automated enforcement cameras, though, are eminently solvable. Los Angeles could make the first violation free and then implement a graduated system for fines, so the more red lights you run, the more you pay. After all, the certainty of getting caught matters more than the severity of the punishment.

The year of the E-Bike:  David Zipper argues that the most important transportation breakthrough of the year was the growing adoption of electric bicycles.  Writing at Fast Company, Zipper contrasts the widespread and successful growth of electric bike ownership and access with the continuing troubles plaguing the much over-hyped self-driving car.  The differences in the roll-out of the two technologies couldn’t be more striking:  self-driving is the province of a handful of very large firms, spending billions, which have produced relatively few vehicles, and which seem to pose huge safety risks to road users.  Meanwhile, e-bikes are a modest and simple upgrade to a robust technology, and are cheap, pervasive and provided by hundreds of competing firms with their own innovative approaches.  A recent high profile crash of a self-driving Cruise vehicle in San Francisco, and the recall of 2 million Teslas, is a stark reminder of the risks of autonomous vehicles; a theme punctuated by Netflix, which depicted dozens of Teslas running amok, nearly killing Julia Roberts:

Netflix: “Leave the world behind”

As Zipper notes:

E-bikes raise no such existential concerns. On the contrary, all signs indicate that a city full of e-bikes would be safer, healthier, cleaner, and less congested than one dominated by cars—no matter how they are driven. And e-bikes really are car replacers: The addition of a battery can enable even mobility-constrained cyclists to conquer hills, haul packages, or beat the heat. Better yet, families can save tens of thousands of dollars by using an e-bike in lieu of a second or third car. And lest we forget: E-bikes are fun.

In the News

Streetsblog featured a story on our critique of the benefit-cost analysis prepared to justify the $7.5 billion Interstate Bridge Replacement project.

The Week Observed, December 8, 2023

What City Observatory did this week

Tolling I-5 will produce massive traffic diversion.  The proposed I-5 Interstate Bridge Replacement (IBR) Project will be paid for in part by $2.80 to $4.30 tolls charged to travelers.  These tolls will cause tens of thousands of vehicles per day to stop crossing the I-5 bridge; most traffic will divert to the parallel I-205 bridge, producing gridlock, according to IBR consultant reports and Metro travel demand modeling.

OregonDOT and Washington State DOT officials have offered vague and largely meaningless claims about potential diversion from tolling the I-5 bridge, and have failed to disclose the actual analyses of this subject done by their consultants.

City Observatory obtained—via public records requests—toll revenue estimates prepared by IBR contractor Stantec, and travel demand modeling prepared by Metro for the IBR project.  These studies show that tolling I-5 will dramatically reduce I-5 traffic, with most vehicles diverting to I-205.

  • Tolling I-5 will cause traffic levels on I-5 (currently about 140,000 vehicles per day) to fall by almost half, and will permanently depress I-5 traffic.
  • Tolling I-5 will cause more than 30,000 vehicles to divert to the parallel Interstate 205 bridge, likely producing gridlock.
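A quick back-of-the-envelope check shows how these round numbers fit together.  The split below between diverted and suppressed trips is our illustrative arithmetic, not a figure from the Stantec or Metro studies.

```python
# Back-of-the-envelope check on the toll diversion figures cited above,
# using the report's round numbers.  The diverted/suppressed split is
# illustrative arithmetic, not a figure from the consultant studies.

i5_before = 140_000            # vehicles per day on I-5, pre-tolling
i5_after = i5_before * 0.55    # "fall by almost half": roughly 55% remains
lost = i5_before - i5_after    # trips no longer crossing on I-5

diverted_to_i205 = 30_000                         # per the cited modeling
suppressed_or_rerouted = lost - diverted_to_i205  # trips forgone or shifted

print(f"I-5 trips lost:    {lost:,.0f}/day")
print(f"diverted to I-205: {diverted_to_i205:,.0f}/day")
print(f"suppressed/other:  {suppressed_or_rerouted:,.0f}/day")
```

Even on this rough accounting, I-205 absorbs roughly half of the lost I-5 trips, which is why the modeling predicts gridlock on the parallel bridge.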

The new toll revenue projections echo exactly the findings of studies for the earlier carbon copy of this same project (then called the Columbia River Crossing) as well as the experience of tolled bridges and highways elsewhere in the country.

Highway agency claims that investment grade forecasts are unlikely “worst case scenarios” are untrue:  Traffic levels routinely fall below levels predicted in investment grade forecasts, as happened with the Tacoma Narrows Bridge, and many other similar projects.

Down is not up:  The truth about traffic, congestion and trucking. A central message of the highway building sales pitch is that traffic is ever-growing and ever-worsening, and that we have no choice but to throw more money at expanded capacity. The Oregon Department of Transportation (ODOT) claims that traffic is ever-rising, congestion is ever-worsening, and we’re always moving more and more trucks. The reality, as revealed by ODOT’s own statistics, is very different:

Post-pandemic, traffic levels are lower than before, time lost to traffic congestion is down almost 40 percent, and fewer trucks are on Oregon’s roads. This lower level of demand means we don’t need to squander billions on added capacity, as ODOT is proposing.  Instead, measures to reduce or manage demand, like congestion pricing, could give us much faster travel times, at far lower cost. Claims that the increase in traffic is inexorable, and that congestion will only get worse, are just a marketing gimmick to sell capacity expansion.  It speaks volumes about state highway agencies that they’ll willingly distort what their own actual data show about declining traffic, lessened congestion, and fewer trucks on the road, to make these spurious claims.

Must Read

Like “End of History” proclamations, declarations of the “End of Freeways” are premature and misleading.  Joe Linton, writing at LA Streetsblog, dismantles some phony greenwashing from highway builders in Southern California who’ve billed the opening of a $2.1 billion freeway expansion project as the very last freeway they’ll ever build.  The CalTrans talking point, offered up stenographically by the Los Angeles Times, is that the completion of the 405 freeway in Orange County marks the end of an era.

As Linton points out, that’s not true:  The region’s highway agencies are planning literally billions in more spending on dozens of freeway projects.  Not only that, but the phony claim that “it’s the end of an era” is exactly the same story CalTrans got placed in a New York Times story—thirty years ago.

But the freeway era didn’t come anywhere near ending in the 1990s. Caltrans continued to spend copious state and federal funding to expand highways. Countywide sales taxes in Los Angeles and Orange counties funneled even more money into the well-oiled machine that continued to tear down homes to add more and more lanes. Since the 1990s, Caltrans leaders give lip service to their agency turning over a new leaf, but the department’s massive car capacity projects show these pledges to be false.

To be sure, they’ll keep building (and widening) freeways, they just won’t call them that.  It will be “auxiliary lanes” and “repaving.”  CalTrans just fired its deputy director for calling out the illegal and sham use of “repaving” as cover for widening Interstate 80 between Sacramento and Davis.  These are people who will nod and murmur that we can’t build our way out of congestion–and then spend billions more widening roads.  It’s hypocritical greenwash.

Accessibility:  Our daily bread.  Cities solve for transportation not just with buses, bikes and cars, but with proximity.  If daily destinations are close at hand, it takes less transport expense and infrastructure just to live. In economic terms, transportation is a “transaction cost”–the more and further we have to travel, the more time and money we have to spend to do the basics.  What good cities do is to put critical destinations close to where we live.  Dueling maps appeared on social media this past week that illustrate this pattern.  In Paris, Jonathan Berk (@berkie1) shows that 94 percent of Parisians live within 5 minutes of a boulangerie.  Not to be outdone, @datavizero offers up a map showing that 95 percent of residents of Mexico City are within 5 minutes of a taqueria.

 

 

The enormous savings from convenience, proximity and choice that are present in great cities are often entirely overlooked in our economic reckoning.  Next time someone talks about transportation, please think about how long it takes a typical resident to get a baguette or a taco.

The Week Observed, December 1, 2023

What City Observatory did this week

Secret plans show ODOT is planning a 10-lane freeway in the Rose Quarter.  City Observatory has obtained previously un-released plans showing that the $1.9 billion I-5 Rose Quarter project is being built with a 160-foot wide roadway, enough to accommodate ten through traffic lanes, contradicting the Oregon Department of Transportation’s claim that the project merely adds “one auxiliary lane” in each direction.  Each of the two 81-foot wide bays underneath Broadway and Weidler Streets is enough to accommodate five lanes of traffic, with over-sized shoulders.

The project is even wider as it passes under NE Hancock Street, where it would balloon to 250 feet wide.  In reality, ODOT is proposing to double and even triple the existing 82-foot wide roadway.  The excessively wide roadway, which has to be lowered in order to provide adequate clearance for massive cross-beams, is the real reason the project is so expensive.  Far from reconnecting the community, ODOT’s plans vastly increase the width of the freeway scar through this historically Black neighborhood, and generate even more car traffic, making the area more dangerous and more polluted, and actually impairing redevelopment. In addition, the project’s Environmental Assessment fails to analyze the effects of this larger roadway, likely violating the National Environmental Policy Act.

Must Read

We’re all YIMBYs now.  There’s widespread and bipartisan support for addressing America’s housing shortage by making it easier to build more apartments.  A new poll from the Pew Charitable Trusts shows consistent and large majorities favor a battery of YIMBY (Yes in my back yard) policies, ranging from upzoning around transit stops and stations to adding accessory dwelling units and allowing apartments near offices, stores and restaurants.

Strikingly, the support crosses many of the nation’s demographic and ideological divides.  As the Pew authors note:

Support for most of the housing policies transcended the usual fault lines of political party, region, race, income, and gender. The eight most popular proposals received clear majority support from Republicans, Democrats, and independents. In addition, 9 of the 10 tested measures received majority support from both renters and homeowners.

The Pew polling should provide additional impetus to the growing legislative efforts to make it easier to build more housing.

Seattle votes against taxing new homes to subsidize driving.  One of the most insidious and invisible contributors to the housing shortage in US cities is the proliferation of “impact fees” imposed on new housing.  Politically, it’s popular to pretend to foist the cost of infrastructure, especially transportation, onto “developers” by imposing hefty fees on new housing, especially apartments.  But basic economics demonstrates that new fees don’t come out of developer profits; instead they simply drive up the cost of new housing, and also—and this is critical—reduce how much new housing gets built, which ultimately makes all housing more expensive.  The Seattle City Council considered, and narrowly rejected, a proposed transportation impact fee, persuaded by a study showing it would worsen the city’s housing problem by reducing construction 15 to 17 percent.

The defeat comes at a crucial time for housing affordability in the city.  The amount of new development has fallen off sharply, with applications for new apartment permits falling about 75 percent since 2020.  Burdening new housing construction with the costs of transportation, plus fees to subsidize affordable housing, discourages new investment; the effects may not be immediately apparent, but given the long lags in the construction cycle, it may be too late to correct these bad policies before the damage is done.

$7.5 billion Interstate Bridge Project named nation’s most expensive highway boondoggle.  The misleadingly named Interstate Bridge Replacement project, really a 12-lane-wide, five-mile-long freeway expansion between Portland and Vancouver, took top honors in the recently released Highway Boondoggles report prepared by the US Public Interest Research Group.  Streetsblog featured detailed coverage of the project this past week.

The IBR project has been a consultant grift of major proportions.  Public records obtained by City Observatory show that Oregon and Washington have already committed $192 million to consultants, including more than $20 million to public relations and communications consultants.  That comes on top of $200 million spent a decade ago on the same project, then called the Columbia River Crossing.

 

 

The Week Observed, November 17, 2023

What City Observatory did this week

5 million miles wide of the mark.  Portland’s regional government, Metro, has proposed a regional transportation plan (RTP) that purports to achieve state and regional policies to reduce greenhouse gas emissions.  But there’s a 5 million mile problem:  The climate analysis of the Metro RTP assumes that the region will hold driving at about its current level through 2045—about 20 million miles per day.  But the region’s transportation modeling and performance measures, which drive project selection and budget decisions, have a very different future in mind, planning for a 20 percent increase in driving to 25 million miles per day.  The plan can’t reconcile the 5 million mile a day discrepancy.

Metro’s Regional Transportation Plan (RTP) claims it will meet state and regional climate objectives by slashing vehicle travel more than 30 percent per person between now and 2045.

Meanwhile, its transportation plan actually calls for a decrease in average travel of less than 1 percent per person.  Because population is expected to increase, so too will driving.

Rather than reducing driving, and associated greenhouse gas emissions, Metro’s RTP calls for accommodating more than 5 million additional miles of driving a day—a 20 percent increase from current levels.

The RTP climate strategy asserts the Portland area will drive 20 million miles a day and meet our greenhouse gas reduction goals.  But Metro’s transportation modeling shows the RTP is planning for a system that will lead to 25 million miles per day of driving.

This disconnect between Metro’s climate modeling and the modeling it’s using to size the transportation system and make investments violates state climate rules.
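The arithmetic of the gap can be sketched in a few lines.  The population growth factor below is a hypothetical stand-in chosen so the plan’s round numbers reproduce; it is an assumption for illustration, not a Metro forecast.

```python
# Why a tiny per-person reduction still means far more total driving,
# using round numbers from the text.  The population growth factor is a
# hypothetical stand-in, not a figure from Metro's forecasts.

current_total = 20_000_000    # miles/day region-wide today (and the climate
                              # strategy's assumed 2045 level)
per_person_change = -0.01     # the RTP's actual plan: under 1% per-person decline
population_growth = 0.26      # assumed growth consistent with ~25M miles/day

planned_total = current_total * (1 + per_person_change) * (1 + population_growth)
gap = planned_total - current_total

print(f"planned driving: {planned_total/1e6:.1f} million miles/day")
print(f"gap vs. climate assumption: {gap/1e6:.1f} million miles/day")
```

In other words, a sub-1-percent per-person cut, multiplied across a growing population, lands near 25 million miles a day, roughly 5 million miles above what the climate strategy assumes.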

Must Read

The devil is in the details for congestion pricing.  It looks increasingly likely that New York will actually implement congestion pricing next year.  Still, there are many details to be worked out, and as Nicole Gelinas points out, the devil is very much in these policy details.  Still to be decided is how to deal with a raft of requests for exemptions from pricing.  According to the MTA, there have been requests for 122 categories of exemptions.  As Gelinas points out, the easiest way to deal with this is to grant no exemptions, especially for privately owned vehicles.

The most straightforward way to keep the base price low is for the MTA to approve zero exemptions for anyone driving private automobiles, which constitute 35 percent of core Manhattan traffic.

Gelinas argues that buses ought to be exempt (after all, they are getting cars off the city streets and providing a low cost alternative to car travel).  The problem with exemptions is likely to be a persistent and gnarly one:  The de facto exemption that police and many public employees have wrung from parking regulations doesn’t spawn a lot of hope in this regard.

Cycling in Paris is up 50 percent in the past year.  There’s a biking revolution happening in the City of Light.  Mayor Anne Hidalgo has led an aggressive campaign to add bike lanes to the city’s streets, and Parisians are responding in dramatic fashion.  It’s now the case that on many major arterials, the volume of traffic carried in bike lanes exceeds that carried in lanes open to car traffic.  The following chart shows the change in bike traffic in Paris over the past 12 months (October 2022 to October 2023).  There are notable increases, especially on weekdays and during peak hours:  clearly cycling is carrying a significant and growing part of the urban transport load.

 

Call it a virtuous example of induced demand:  if we build capacity for bike travel, people will bike. The fact that cycling continues to grow at this prodigious rate is a signal that a large scale commitment to redesigning urban transportation systems can quickly produce big changes in travel behavior.

The Rhode Island Attorney General challenges highway spending on climate grounds.  Like many state highway departments, the Rhode Island DOT sees the latest round of federal largesse in the Bipartisan Infrastructure Law as a way to further pad its budget for widening highways.  But wider roads simply trigger more car traffic, increased pollution and recurring congestion. Rhode Island Attorney General Peter Neronha has formally challenged the state DOT’s plan to spend funds earmarked for fighting climate change on widening freeways. Like several states, Rhode Island has a legal commitment to reduce greenhouse gases, and federal law requires agencies like RIDOT to show how they’ll make such reductions, but according to the Attorney General, the RIDOT plan is wildly inadequate:

. . . a majority of the funding is earmarked for congestion management projects which, in RIDOT’s own words, “will not substantially ‘move the needle’ when it comes to carbon reduction.” That is, the Carbon Reduction Strategy identifies its carbon-reducing expenditures as efforts mainly to make it easier for cars to travel—not the transformative changes necessary to remake a carbon-heavy sector to achieve Rhode Island’s reduction mandates. With just six years left before the first interim Act on Climate mandate, the State cannot afford to keep up the status quo and fail to identify needed actions to address the largest sector of emissions.

Would that other state attorneys general took up this cause.

Freeway fighter conclave gets national attention.  Last month, Cincinnati hosted the first national convening of grass-roots freeway fighters from around the nation.  Megan Kimble has a profile of the conference and some of its attendees at Bloomberg. Kimble featured a number of local activists from around the country, including Portland’s Adah Crandall, leader of a group called “Youth vs. ODOT.”

. . . there was a strong sense of solidarity among the attendees, who shared both their frustrations and hopes during a two-minute rant session at the end of the first day. Several younger organizers, such as Crandall, spoke about the intense urgency they feel to act in the face of the climate crisis. “Let’s cause some chaos and make state DOTs really afraid,” Crandall said.

New Knowledge

It’s official:  taller, blunter vehicles are more deadly.  For a long time, traffic safety advocates have pointed to the increasingly aggressive profile of the nation’s pick-up trucks.  Grills and vehicle front-ends are taller and more massive, and as a result more likely to deliver a deadly blow to the head, neck and torso of people walking or biking, and to push them under a vehicle’s wheels, rather than striking someone’s legs and pushing them over the hood.

A new study from the Insurance Institute for Highway Safety (IIHS) provides a grim confirmation of these fears.  The IIHS study finds that towering newer pickup trucks and SUVs are 45 percent more likely to cause death or serious injury than shorter passenger vehicles.  Here is the study’s key finding:

In general, vehicles taller than 35 inches were more dangerous to pedestrians than the shorter ones, mainly because they tended to cause more severe head injuries. Among vehicles taller than 35 inches, those with vertical front ends were more dangerous than those with sloped front ends. Torso and hip injuries from these vehicles were more frequent and severe.

There’s little question that the growth of monster trucks is correlated with the increase in pedestrian fatalities over the past decade or so.  Hopefully the IIHS study will provide additional impetus for the National Highway Traffic Safety Administration (NHTSA), which regulates vehicle safety, to acknowledge this crisis and take action.  It can start, of course, by expanding vehicle safety ratings to include the safety of non-occupants; but that’s just a start.  It’s long past time to regulate these deadly vehicles to protect people outside the monster truck.

Hu, Wen; Monfort, Samuel S.; and Cicchino, Jessica B., “The association between passenger-vehicle front-end profiles and pedestrian injury severity in motor vehicle crashes,” Insurance Institute for Highway Safety, November 2023, https://www.iihs.org/topics/bibliography/ref/2294

 

 

The Week Observed, November 10, 2023

What City Observatory did this week

Snow-Job:  Oregon Department of Transportation (ODOT) threatens to slash snow-plowing and other safety maintenance unless it is given more money, while spending billions on a handful of Portland area freeway widening projects.  ODOT claims it’s too broke to plow state roads this winter, with the not-at-all-subtle message that people need to give ODOT more money.  The agency’s PR has generated headlines like:

“ODOT says highways ‘may not be safe’ this winter due to budget cut”

Scary stuff, to be sure.

But ODOT’s budget claims are a Big Lie:  ODOT’s revenues are up, not down.  And a handful of costly billion-dollar-a-mile Portland highway projects are the real hole in the agency’s budget.  ODOT’s real budget problem—the one it never mentions in press releases—is $10 billion worth of highway widening mega-projects.  Three Portland area expansions—the $7.5 billion I-5 bridge replacement, the $1.9 billion Rose Quarter widening and the $600 million I-205 Abernethy Bridge—are the real source of budget stress. Not only is ODOT not cutting spending for these projects, which include hundreds of millions in spending on consultants; each of these projects has experienced cost overruns of 50 percent or more in the past two years.

 ODOT’s threat to slash spending on basic safety measures—like plowing snow and regularly repainting fog-lines on rural roads—is a cynical bureaucratic ploy to get more money at a time when deaths on Oregon roads are up 71 percent in the past decade.

Must Read

Portland’s Interstate Bridge Replacement project is the nation’s most expensive highway boondoggle.  The US Public Interest Research Group has released its latest annual report cataloging the nation’s highway boondoggles.  It highlights the Interstate Bridge project as one of the worst:

Interstate Bridge Replacement, Oregon and Washington: Cost: $5 billion to $7.5 billion. Under the pretext of a simple bridge replacement, an expensive and oversized highway expansion threatens to worsen congestion in Portland and nearby Vancouver, Wash.

The report shows how excessive spending on highway expansions has not only failed to reduce traffic congestion and travel times, but has actually made our transportation and environmental problems worse.  The report concludes:

Highway expansion harms our health and the environment, doesn’t solve congestion, and creates a lasting financial burden for the public. Expanding a highway sets off a chain reaction of societal decisions that ultimately leads to the highway becoming congested again – often in only a short time. Since 1980, the U.S. has added well over 870,000 lane-miles of highway – paving more than 1,648 square miles, an area larger than the state of Rhode Island – and yet, prior to the COVID-19 pandemic, congestion on America’s roads was worse than it was in the early 1980s.

Jeanie Ward-Waller, former Deputy Director of CalTrans, is interviewed by KPBS "Freeway Exit" host Andrew Bowen.  Ward-Waller was fired from her job for calling out violations of environmental and spending laws in freeway widening projects, and has filed a whistleblower complaint against CalTrans.  The interview details what's going on inside the agency, and the entrenched obstacles and obstinate engineering culture that dominate an organization that still views its mission as building more and bigger roads, forever.  As Ward-Waller says, CalTrans was created to build roads, and is driven by "a mindless impulse to add more freeway lanes."

Ward-Waller describes how she crashed into a "green ceiling" at the agency, and was dismissed for asking hard questions about why agency officials weren't following their own adopted policies and state laws that aim to reduce driving as an essential part of the state's climate strategy.  In addition to Ward-Waller's detailed insider account, you'll also appreciate the insightful questions and comments from her interlocutor, KPBS reporter Andrew Bowen, whose multi-episode Freeway Exit series is a "must listen" for anyone interested in transportation policy.

New Knowledge

Narrower lanes produce safer streets.  A very detailed new study looks at the effect of lane widths on crash rates on city streets.  For decades, highway departments have assumed that wider lanes are somehow safer, largely because they give drivers a better field of vision and more room for error (before they hit someone or something outside their lane).  The trouble, of course, is that these features of wide lanes prompt people to drive faster, which increases both the probability and the severity of crashes.  This study demonstrates, conclusively, that narrower lanes are better for safety: they prompt drivers to be more careful, speed less, and take fewer risks.

The study looked at crash rates on more than 1,100 street segments in seven cities.  Importantly, it controlled for a number of other contextual variables that are likely to influence crash rates.  One of the most important takeaways is that narrow lanes—nine-foot lanes—reduce driver speeds and the incidence of crashes on roads in the 30–35 mile per hour category.

. . . street sections with 10-foot, 11-foot, and 12-foot lanes have significantly higher numbers of non-intersection crashes than their counterparts with 9-foot lanes in the speed class of 30–35 mph. In other words, in the speed class of 30–35 mph, wider lanes not only are not safer, but exhibit significantly higher numbers of crashes than 9-foot lanes, after controlling for geometric and cross-sectional street design characteristics of street sections.

A portion of the study consisted of a survey of highway engineers to better understand their actual practice in setting lane widths.  The survey found that standard practice is weighted toward wider lanes, typically as much as 12 feet, and that consideration of narrower lanes requires a separate "design exception."  This hurdle means few states actually implement narrower lanes, even if their policies endorse the ideas of "complete streets" and "context sensitive design," which would encourage departures from a rigid engineering standard.  The report concludes:

Although in theory there has been a significant departure from conventional lane width design standards to promote flexibility in highway design, in practice we are far from implementation of the context-sensitive design solutions by most state DOTs. The design exception for lane width reduction projects seems to be a rare event in most state DOTs that participated in our survey.

This study is important scientific evidence for community activists looking for immediate steps they can take to reduce traffic crashes, deaths and injuries.  Narrowing the lanes on city streets helps slow traffic, induces greater driver caution, and measurably improves safety.  Outdated claims that best engineering practice requires wider roadways need to give way to this new knowledge.

Shima Hamidi and Reid Ewing, A National Investigation of the Impacts of Lane Width on Traffic Safety: Narrowing Travel Lanes as an Opportunity to Promote Biking and Pedestrian Facilities Within the Existing Roadway Infrastructure, November 2023, Johns Hopkins University.

In the News

Willamette Week quoted City Observatory director Joe Cortright in its story examining the slow and troubled roll-out of a voter-approved pre-K education program in Portland.


The Week Observed, November 3, 2023

What City Observatory did this week

Killer off-ramps.  The Oregon Department of Transportation's $1.9 billion I-5 Rose Quarter widening has been repeatedly (and falsely) portrayed as a "safety" project, but the latest re-design of the project may make it even more dangerous than it is today.  An earlier "Hybrid 3" re-design added one dangerous hairpin off-ramp from I-5 into the Rose Quarter neighborhood.  A new re-design, labeled "the anchor," actually doubles the number of hairpin off-ramps, adding a flyover exit that crosses back over the I-5 freeway while reversing direction from south to north.
The design seems inherently dangerous.  The I-5 main stem has a 70 mile per hour design speed, and emerges from a long covered tunnel section immediately before reaching the two hairpin off-ramps.  The danger isn't just conjectural: ODOT's own analysis of "Hybrid 3" said that trucks couldn't stay in their lane on the hairpin exit, and that crashes would rise as a result.  Just last month, a similar hairpin ramp in downtown Portland claimed the life of pedestrian Brandon Coleman, who was killed in a hit-and-run.

Must Read

Jeanie Ward-Waller speaks out.  Earlier this month, Jeanie Ward-Waller was effectively fired from her job at CalTrans for pointing out that highway widening projects were violating environmental laws and illegally using funds dedicated to road maintenance.

Ward-Waller gives her firsthand account of how calling out illegal efforts to circumvent the state's adopted climate laws led to her demotion:

My concerns centered on a large freeway project described to the public as “pavement rehabilitation,” which is repaving. But I believe the project is in fact, an illegal widening of a 10-mile freeway section of the Yolo causeway between Davis and Sacramento on Interstate 80. After scrutinizing project documents, I realized that Caltrans officials were widening the freeway, using state funds that cannot be used to add lanes. By calling it a “pavement rehab project,” Caltrans avoided public disclosure of the project’s environmental impacts.

As Ward-Waller says, "the rot runs deep" in transportation agencies like CalTrans.

TxDOT thumbs its nose at environmental laws.  Kevin DeGood of the Center for American Progress lays bare the environmental lies embedded in the Texas Department of Transportation's analysis of the proposed $4.5 billion widening of I-35 through Austin.

TxDOT makes the farcical claim that the project won't have any environmental impact, largely because the agency's traffic modeling predicts the same number of people will drive the same number of miles whether or not the freeway is built.  TxDOT can get away with this falsehood because the US Department of Transportation has delegated its legal authority for assessing NEPA compliance to the state.  DeGood points out the badly flawed analysis of driving trends and greenhouse gases:

According to TxDOT, the total vehicle miles of travel (VMT) on I-35 in central Austin will rise by 45 percent between 2019 and 2050 as the population grows. But this is where things get weird: TxDOT claims that total VMT and vehicle emissions would be essentially the same under both the build and no-build scenarios — meaning the state is projecting 45 percent growth in driving by 2050 regardless of whether I-35 is expanded or not. Moreover, the agency estimates that greenhouse gas emissions from driving on I-35 will only be seven percent higher under the build scenario. This stretches the bounds of credulity, to say the least.

TxDOT's second tactic for evading environmental laws is to break up a 31-mile widening of I-35 into three separate projects, which it claims are unrelated.  Taken together, they're a permanent commitment to a more car-dependent, more polluting development pattern for decades to come.  Pretending that a massive new roadway won't engender more driving and more greenhouse gases flies in the face of the well-known science of induced travel, and makes a mockery of the National Environmental Policy Act.

More driving is undoing San Diego's climate progress.  Like many cities, San Diego loves to tout its commitment to climate goals and celebrate anecdotal successes:  Clean electricity! New vehicle charging stations!  But buried deep in the published, but unpublicized, statistical appendix of the city's latest climate report is the inconvenient truth that greenhouse gases from transportation in the city are increasing because people are driving more. As local public radio station KPBS reports:

The climate report found that emissions from the generation of electricity fell by an impressive 27% from 2020 through 2021. This was attributed to an increase in renewable energy purchases by both SDG&E and San Diego Community Power, a government-run nonprofit that began purchasing energy on behalf of homes and businesses in 2021. However, the emissions from on-road transportation in 2021 were 13% higher than the emissions in 2020.

And this trend is even being underwritten at public expense by millions of dollars spent on highway widenings, like a proposed $40 million widening of Grantville Road, which might save commuters 10 to 20 seconds (until the induced travel effect kicks in).

Real estate commission price-fixing violates anti-trust laws.  In a potentially landmark case out of Missouri, a federal jury has agreed that a widely used system of real estate commissions violates anti-trust laws.  The jury found that plaintiffs (home sellers) were due damages of $1.8 billion—which could be tripled under the Sherman Anti-Trust Act.  The Wall Street Journal called it a "big legal defeat for realtors," and summarized the case as follows:

The plaintiffs provided compelling evidence that overall commissions have stayed at roughly 5% or 6% for decades, split evenly between the buyer and seller brokers. This is about two to three times as high as in other wealthy countries where such self-serving industry arrangements don’t exist. The inflated commissions are baked into home prices.

The verdict may force a reform of rules on real estate commissions nationwide.  The case is likely to be appealed, but if the decision holds, it could be a major shift in the way homes are bought and sold.

The Week Observed, October 27, 2023

What City Observatory did this week

More climate fraud in Portland Metro's proposed regional transportation plan.  We branded Metro's proposed Regional Transportation Plan (RTP) a climate fraud because it falsely claimed the region was reducing greenhouse gases, and falsely claimed its transportation investments were on track to meet adopted state climate goals.
Metro's staff has responded to these critiques, but proposes only to fix these mistakes at some vague future time and, more importantly, makes absolutely no substantive policy or investment changes to the RTP.
In essence, the staff response puts the lie to the claim that climate/GHG reductions are the “controlling measure” in RTP system planning.  Whether Metro is on track to achieve its committed GHG reductions or not has no bearing on any of the substantive policy and spending decisions in the RTP.
Moreover, this is a straightforward violation of the policies enacted in Metro's 2014 Climate Smart Strategy (and reiterated in the 2018 RTP and the current RTP draft) to continuously monitor progress in GHG reduction and undertake additional measures if the region were not making adequate progress.

Must Read

Garrison Keillor on cities and civility.  Writing at Substack, Garrison Keillor reflects on how walking and taking transit in cities bring us closer together.  In his essay "Standing at 86th, Waiting for a Train," Keillor observes:

The city attracts aspiring artists, writers, actors, musicians, who are prepared to live in poverty, wait on table if necessary, while scrambling for a break. The quickest way around town is the subway, where unemployed actors, highly paid CEOs, cleaning ladies, digital geniuses, and ordinary working stiffs merge in a river of humanity. There is no Business Class on the A train. In the subway stations, you will find refugees from South America selling cups of fruit, as well as panhandlers, and outright crazy people, all on foot, and to a refugee from the Midwest and suburban freeway culture, this is at first disconcerting and then inspiring. The civility that prevails gives you faith in your fellow man. The politeness shown to a parent of a little kid in a stroller, or an old man using a cane, or an autistic person, is just as third-grade teachers have taught their pupils for generations. People who shove are spoken to, or at least glared at. Passengers stand packed in a rush-hour train doing their best to respect each other’s personal space.

CalTrans:  Freeways for the People!  In the wake of whistleblower revelations that the California highway department is flouting environmental laws, illegally using maintenance funds to widen roads, and generally subverting the state's climate plans, the California Transportation Commission pushed back.  The commission's chair, from famously car-dependent Fresno, tearfully claimed that wider roads were "for the people."  All this flies in the face of scientific evidence that widening roads does nothing to reduce congestion and travel times, increases driving and pollution, and leads to more sprawl and car dependence.

Like other state DOTs, CalTrans parades its bike lanes and pedestrian facilities, but spends orders of magnitude more on widening roadways.  CalTrans director xxxx tweeted about the agency's bike projects, but the reality is better captured by its giant freeway widenings.

Renderings of giant freeway projects invariably show that they will be little used.  If that were really the case, there would be no reason to spend billions making them so large.


New Knowledge

More cars = fewer and more distant grocery stores.

A new study looks at how changing patterns of grocery store location and residential development are affecting travel patterns.  The study is a detailed, multi-decade examination of two towns in Norway.  The authors plot the locations of grocery stores and residents in 1980 and 2019, and compute the walking distance to the nearest store.  Here's a map of Lørenskog, showing 1980 stores.  Over time, there are fewer grocery stores and they are, on average, farther from homes.  The number of grocery stores declined from 23 in 1980 to 16 in 2019, with stores closing in central locations (shaded circles) and new stores opening in somewhat more peripheral locations.  That's a decline of about one-third, even as population increased by almost 50 percent.

The net effect of store closures and new store openings was to increase the average distance to the nearest grocery store.  In 1980, 55 percent of residents lived within 500 meters of the nearest grocery; by 2019 only about 35 percent were that close.

Over four decades, the number of grocery stores declined, population decentralized, and the average household ended up farther from the nearest store.  These trends have important implications for transportation policy.  With common destinations like stores farther away, fewer people will walk, and more people will choose to use (and own) cars in order to perform basic daily tasks.

. . . this change in distances represents a kind of lock-in effect: Increased distances to essential locations make a modal shift from car to more sustainable transport modes difficult to achieve. The changes in accessibility illustrated in this paper are likely parts of structural conditions of locations at neighbourhood-, city- and county scale that are so decisive for travel mode choice that the contrary effects of policies and interventions trying to reduce car driving as currently applied by the government, such as increased toll roads for private cars and providing new bicycle lanes, are overruled.

In addition—though not noted in the study—growing levels of car ownership change the competitive economic landscape of grocery retailing, making it harder for small neighborhood stores to compete against larger shops with better selection, and often lower prices.

Rokseth, L.S., Heinen, E., Hauglin, E.A. et al. Reducing private car demand, fact or fiction? A study mapping changes in accessibility to grocery stores in Norway. Eur. Transp. Res. Rev. 13, 39 (2021). https://doi.org/10.1186/s12544-021-00500-7

The Week Observed, September 8, 2023

What City Observatory did this week

What apartment consolidation in New York tells us about housing markets and gentrification.  A new study shows that over the past several decades, New York City lost more than 100,000 homes due to the consolidation of smaller, more affordable apartments into larger, more luxurious homes.

When rich people can’t buy new luxury housing, they buy up, and combine small apartments to create larger homes.


The elusive New York apartment (Flickr: Sharona Gott)

If you're worried about gentrification and displacement, this is a vastly larger problem than new construction—which has been repeatedly shown to lower rents and create more housing opportunities for lower income households.

Must Read

Why Portland's downtown is doing much better than you've been told.  Local and national press have flogged downtown Portland's supposedly tepid recovery from the Covid pandemic based on statistics generated by the University of Toronto.  That study compares pre- and post-pandemic cell phone data for selected neighborhoods in major US cities, and rates Portland second to last in the percentage of activity recovered since 2020.  But economist Mary King, writing at Portland's Street Roots, digs into that data and exposes some serious flaws.

The University of Toronto data look at only a single (and unrepresentative) part of downtown Portland (with only about 1,000 residents), and count only changes in unique visitors (effectively discounting daily, repeat travelers).  And the study uses widely varying geographies for different cities; San Diego's counts reflect both its airport and famous zoo, and naturally show a dramatic rebound in unique visitor counts.  As King reports:

Counting all visits and using a much bigger definition of downtown, the Portland Metro Chamber  reports Portland had nearly two-thirds as many visits in June 2023 as in June 2019. In stark contrast, the Toronto study asserts that there were just 37% as many visitors to “downtown Portland” from March through May 2023 as in 2019.

In short, the University of Toronto data don’t offer an apples-to-apples comparison of cities; more robust data suggests that Portland’s experience is similar to other US downtowns.

Fantasy models:  Their costs and consequences.  David Levinson, who blogs at The Transportist, is one of the world's leading transportation scholars.  He's got a must-read essay that debunks the pseudo-science behind traffic modeling and, importantly, lays out its consequences for how we build our communities and how we live.  These seemingly trivial technocratic details hamstring our ability to think and plan differently.  The four-step models used by most transportation planners underpin a "predict and provide" approach to transport investment that fuels more car dependency.  As Levinson says:

Overly optimistic models can mislead decision-makers into supporting the wrong transport projects, projects that are not viable, cost-effective, or beneficial to the public. This can result in the approval of projects that might never reach completion or fail to deliver the promised benefits, wasting time, money, and political capital.

Unrealistic models can also contribute to environmentally harmful decision-making. Underestimating the potential impacts of a project on air quality, greenhouse gas emissions, or natural habitats leads to projects that compromise environmental sustainability and public health.

While the problem is bad models, the solution is not simply to try to make the models somewhat better:  their limitations are inherent.  Levinson argues we need a smaller-scale, adaptive, and incremental approach to transportation.

Costlier car insurance.  The Washington Post looks at the big jumps in car insurance rates around the country. Higher insurance costs are both a symptom and a cause of inflation:  cars are more expensive to replace and repair, and climate change has increased losses (especially in states with climate-driven fire, flood and storm damage).  These costs drive up insurance rates, and car insurance is one component of the consumer price index:

Premiums have kept climbing even as other types of inflation have cooled. According to the Bureau of Labor Statistics, car insurance for U.S. drivers in July was 16 percent more expensive than in July 2022, and 70 percent more expensive than in 2013.

While there’s considerable variation across states, most states have seen substantial increases in the past year.


Revived park reconnects Memphis to the Mississippi.  Over Labor Day weekend, Memphis opened its revitalized Tom Lee Park.

Reviewing the opening for Fast Company, Nate Berg writes:

In Memphis, Tennessee, a remarkable new public park has just opened. Filling 30 acres along the edge of the Mississippi River with active, social, ecological, and architectural spaces, it could reframe the city’s fading connection to the riverfront. It could also set a new standard for what waterfront parks can do.

Congratulations to City Observatory friend Carol Coletta and Memphis Riverparks for this impressive accomplishment.

New Knowledge

Rent and inflation.  ApartmentList.com has some of the most careful, thorough and transparent data and analyses of rental market trends in the U.S.  Their latest work sheds light on the connections between overall inflation trends in the US, and ongoing changes in the apartment market.

In 2021 and 2022, a surge in rents as the economy recovered from the pandemic was a key contributor to the rapid run-up in overall consumer prices.  Rents, logically, are a key consumer expense, and higher rents mean higher inflation.  But there are key differences between current market data (as estimated by ApartmentList.com and others) and the rental price inflation that the Bureau of Labor Statistics uses to compute the official consumer price index.  In effect, changes in market rents feed into the CPI, but with a lag.  As a result, looking at ApartmentList.com data gives you a good idea of how the official inflation rate is likely to be affected by changing rents in the months ahead.

That’s why you should pay particular attention to this chart, which shows the ApartmentList.com rental inflation estimate, the BLS rent inflation estimate, and the overall CPI.

As Rob Warnock of ApartmentList.com writes:

The Apartment List National Rent Index has proven to be a strong leading indicator of the CPI housing and rent components, as it captures price changes in new leases, which eventually trickle down into price changes across all leases (what the CPI measures). Because of these methodological differences, when our index peaked with record-setting rent growth in 2021 (+17.8 percent), the rent component of CPI was still just starting to heat up, even as overall inflation had already become a key economic issue for policymakers and everyday Americans. Now in 2023, our index shows that the rental market has been cooling rapidly for a year, but the CPI housing component has just recently hit its peak. Despite the CPI's measure of housing inflation remaining elevated, topline inflation has already meaningfully cooled. As the CPI housing component now gradually begins to reflect the cooldown that we've long been reporting, it will help to further curb topline inflation in the months ahead.

The ApartmentList analysis shows that rents are coming down in most markets around the US, with year-over-year declines in 72 of the 100 largest markets.  This shift is underpinned by fundamentals, including increasing completions of new apartments in most markets, and is evidenced by a vacancy rate that is now back at pre-pandemic levels.  All this bodes well for improvements in overall housing affordability in the next year, and will also show up in reduced rates of headline inflation.

Rob Warnock, Apartment List National Rent Report, September 2023. ApartmentList.com


The Week Observed, September 1, 2023

What City Observatory did this week

Rose Quarter:  Death throes of a bungled boondoggle.  For years, we've been following the Oregon Department of Transportation's tortured plans to widen a 1.5-mile stretch of I-5 near downtown Portland.  The past few months show this project is in serious trouble.  Here's a summary of our reporting on key issues.

  • Costs have quadrupled to $1.9 billion, from a 2017 estimate of $450 million
  • ODOT has diverted the funding earmarked for Rose Quarter to other freeway expansion projects
  • The Rose Quarter is so expensive because it's too damn wide:  ODOT has designed a road to accommodate a 10-lane freeway, while falsely claiming it's only six lanes wide
  • ODOT sold the freeway widening as restorative justice for the Albina community devastated by multiple ODOT highway projects, but then diverted funds to a suburban freeway

The truth is that the Rose Quarter is a bungled boondoggle.  Its cost has grown to more than a billion dollars a mile.  That's money ODOT simply doesn't have, and money it would be crazy to squander on this project even if it did.  Rather than concede reality, the agency will play "extend and pretend" for several more years, and waste tens of millions more on consultants.

Must Read

Science has a well-known YIMBY bias.  A growing body of scientific research confirms a fundamental fact about housing markets:  increased supply is associated with lower rents.  Writing at Forbes, Adam Millsap has a succinct summary of four recent papers published in Cityscape that examine the evidence.  The research ratifies the YIMBY policy agenda:

These studies show that zoning changes and other land-use reforms can increase the supply of housing, help control prices, and boost local tax bases. Several states passed housing reforms in 2023, including Washington, Montana, Texas, and Vermont. I expect the supply of housing to increase in these states and for their housing prices to moderate over the coming years. State and local officials who are concerned about housing affordability should follow the science and reform their land-use policies.

The idea that new construction raises housing prices and rents is a cherished NIMBY myth.  The growing weight of scientific evidence is shifting public perceptions.

LA needs more density to promote affordability and make transit work.  UCLA professor Michael Lens speaks to the need for more density in Los Angeles.  The strict single-family zoning that dominates most of the city is driving up home prices (now approaching a million dollars citywide) and rents.  And the limits on apartment construction undercut the region's investment in transit.  He argues that housing would be more affordable, and transit more successful, if we allowed more density:

[Zoning limitations] mean too few people can live close enough to generate the ridership necessary to justify the immense investment it took to build the E Line. Further, the neighborhood’s single-family housing is so astronomically expensive — a million dollars will get you … nothing — that the people most likely to rely on the E Line cannot afford to live there. There are several reasons why ridership is down on L.A. Metro’s trains and buses, and too little density near transit is a major reason.

Lens adds that because Los Angeles is already so spread out—with multiple job centers and robust neighborhoods—it needn't concentrate density in one place; rather, it should look to add density particularly where transit service is readily available.

Why ODOT should fund the billion bollards club.  Strong Towns has a provocative infrastructure idea:  Americans would be far safer if we invested in a billion or so bollards to protect pedestrians and cyclists from cars leaving the roadway.  Chuck Marohn wrote:

America needs a billion bollards. There is no coherent argument against lining every street in America with them. This is the minimum level of protection needed to keep people safe from violence. It is the least we can do to correct the massive asymmetry of risk experienced on our nation’s streets by people outside of a vehicle.

A local Portland group has taken up the challenge, beginning by mapping logical locations for bollards.  To date, Portland, like most cities, seems primarily interested in using bollards to protect signal boxes and other inanimate objects, rather than vulnerable road users.

Tragically, the Oregon Department of Transportation’s response to safety is closing crosswalks, not building bollards.

The Week Observed, August 25, 2023

What City Observatory did this week

Metro's Climate-Denial Regional Transportation Plan.  Portland's regional government, Metro, has published a draft Regional Transportation Plan, outlining priorities for transportation investments for the next two decades and, ostensibly, aiming to deal with transportation greenhouse gas emissions, the largest source of climate pollution in the region.  But unfortunately, the RTP does nothing to prioritize projects and expenditures that reduce greenhouse gases, and instead creates a fictional case for more road widening.

  • Metro falsely asserts that because its overall plan will be on a path to reduce GHGs (it won't), it can simply ignore the greenhouse gas emissions of spending billions to widen freeways
  • The RTP's climate policies don't apply to individual project selection; projects are prioritized on whether they reduce vehicle delay—a failed metric used to rationalize capacity expansions that simply induce additional travel and pollution
  • The RTP's environmental analysis falsely assumes that ODOT will impose aggressive state charges on car travel—including carbon taxes, a mileage fee and congestion fees that have not been implemented, and may never be—to reduce vehicle miles traveled
  • The RTP's traffic modeling fails to incorporate the effect of expected pricing on the need for additional capacity; modeling done by ODOT shows that pricing would eliminate the need for capacity expansion, saving billions and reducing greenhouse gases

Must Read

TXDOT marches ahead with I-35 freeway widening.  The Texas Department of Transportation has published its final environmental report and record of decision for its multi-billion dollar I-35 widening project in Central Austin.

Kevin DeGood of the Center for American Progress has a devastating analysis of the project’s absurd environmental claims, which assert that adding multiple lanes to the roadway will have an essentially trivial impact on vehicle miles traveled and greenhouse gas emissions.

The TXDOT model essentially predicts that traffic on I-35 will increase by the same amount whether or not the roadway is widened—notwithstanding that the existing roadway is essentially at capacity much of the day.  (We’ve unrolled DeGood’s social media thread for easy reading).  The underlying reason for this is that TXDOT lives in a fictional world where traffic increases 1.5 percent per year, regardless of roadway capacity.  TXDOT is simply in denial about the induced travel effects of building additional road capacity.  The induced travel effect means that widening I-35 will increase driving and greenhouse gas emissions, and won’t solve congestion.  It’s a great way to squander billions of dollars and further blight the middle of Austin.

Why speed enforcement cameras are equitable.  Writing at CalMatters, Darnell Grigsby challenges the claim that automated traffic enforcement will unfairly affect low income households and persons of color.

As a Black man who has worked at the intersection of transportation and social justice for years, I know firsthand that calls for enforcement are complicated. High-profile police brutality cases stemming from traffic stops that spread on social media are not only a tragedy, but a reminder of the disparate impact and fear Black and brown drivers experience every day.

Grigsby points out that unlike discretionary police enforcement, automated cameras are color-blind, and perhaps more importantly, low-income communities and communities of color are much more affected by the traffic violence and quality-of-life impacts of speeding vehicles.

. . . communities of color bear the brunt of the pain from the increase in lawlessness on our roads. In fact, in California, Black pedestrians are 62% more likely to die than white Californians. Latinos are 31% more likely.

A transportation system designed by “a cabal of sentient cement-mixers.”  Bella Chu (@bellachu10) is one of the most incisive and clear voices on transportation on social media.  This week, the American Society of Civil Engineers dispensed its awards for highway construction and deemed the massive Jane Byrne freeway interchange in Chicago “an outstanding engineering achievement and improvement to the community!”

Bella’s take is spot-on:

OMG. It’s the @aashtospeaks “QoL & Community” awards all over again. If a cabal of sentient cement mixers whose only goal was to eliminate the human population via blunt force trauma, asthma and climate change had taken over the planet, they could not come up with a better plan.

 

New Knowledge

A profile of work from home.  The Covid-19 pandemic greatly accelerated the adoption of work-at-home in the US economy, and though the pandemic has faded, work-from-home is far more common now than prior to the pandemic.  A new paper from Barrero, Bloom and Davis describes the evolution of work from home.

While work from home is common for many professional service and information occupations, it’s still the case that 60 percent of US workers perform their jobs only at their workplace.  About 11 percent of all workers work exclusively from home and the remaining 30 percent of workers are “hybrid”, splitting their working hours between workplaces and home.

Industry, occupation, and geography all influence whether and how much people work at home. Some industries (retailing, food services and manufacturing) require all or nearly all of their employees to be in the workplace; other industries (software, information) have much more remote work.

Remote work also varies by geography.  Work from home is more common in denser locations than in more rural locations.  In part, this reflects the mix of industries and occupations in different places.  But it also reflects the advantages urban locations provide in terms of amenities near workers’ homes.

 

Jose Maria Barrero, Nicholas Bloom, and Steven J. Davis, “The Evolution of Working from Home,” https://wfhresearch.com, July 2023

In the news

The Portland Oregonian published Joe Cortright’s opinion piece on the Interstate Bridge:  “Oregon’s funding plan for the Interstate Bridge is a generational crime.”

Thanks to Streetsblog California for pointing its readers to City Observatory’s analysis of climate fraud in the Portland Metro Regional Transportation Plan.

The Week Observed, August 18, 2023

What City Observatory did this week

Climate fraud in the Portland Metro RTP.  Metro’s Regional Transportation Plan rationalizes spending billions on freeway expansion by publishing false estimates and projections of greenhouse gas emissions. Transportation is the number one source of greenhouse gases in Portland. For nearly a decade, our regional government, Metro, has said it is planning to meet a state law calling for reducing greenhouse gas emissions 75 percent by 2050. But the latest Metro Regional Transportation Plan (RTP) has simply stopped counting actual greenhouse gas emissions from transportation.

Inventories compiled by the state, the city of Portland and the federal government all show the region’s transportation emissions are going up, not down as called for in our plan.  In place of actual data, Metro and other agencies are substituting fictitious estimates from models; these estimates incorrectly assume that we are driving smaller cars and fewer trucks and SUVs, and rapidly replacing older cars. None of those assumptions are true.

As a result, greenhouse gases are going up; our plans are failing; and Metro’s Regional Transportation Plan, the blueprint for spending billions over the next several decades, will only make our climate problems worse.

Must Read

 

The futility of freeway expansion, in three pictures. Darren Givens (@atlurbanist) has a succinct and powerful social media post visually demonstrating Atlanta’s futile efforts to reduce congestion by widening freeways through the city.  To paraphrase Don Shoup, wider freeways have been a fertility drug for traffic.

How freeways destroy urban housing wealth. Bloomberg’s CityLab reports on a new study from Transportation for America documenting the destruction of housing wealth by freeway construction in US cities.  Taking a close look at the housing demolished to make space for freeways in Washington, DC, and Atlanta, the authors conclude that freeway construction destroyed billions of dollars in home value, based on today’s value of urban housing.

In DC, those two [the Southeast and Southwest Freeways] interstate sections eliminated at least $1.4 billion in home value based on today’s market.  They also cost the city at least $7.6 million in annual property taxes (based on the 2023 residential property tax rate of 0.54%).
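The arithmetic behind those figures is straightforward: annual property tax foregone is simply the lost home value times the tax rate. A rough sanity check, using only the numbers quoted above (this sketch ignores assessment caps, exemptions, and commercial rates):

```python
# Rough check of the property tax figure cited in the T4America report.
# Inputs are the report's quoted numbers: $1.4 billion in eliminated home
# value and DC's 2023 residential property tax rate of 0.54%.
lost_home_value = 1.4e9   # dollars of home value eliminated by the freeways
tax_rate = 0.0054         # 2023 DC residential rate (0.54%)

annual_tax_loss = lost_home_value * tax_rate
print(f"Annual property tax loss: ${annual_tax_loss / 1e6:.1f} million")
# Consistent with the roughly $7.6 million per year cited in the report.
```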

Other research has shown that the direct demolition of houses in the highway right-of-way is just the tip of the proverbial iceberg.  Neighborhoods divided by urban freeways and inundated by traffic tend to go into terminal decline.  Researchers at the Philadelphia Federal Reserve Bank found that the urban neighborhoods closest to freeways suffered the biggest population declines of any urban neighborhoods.  Continued spending on highway capacity in urban areas is a value-destroying activity.

New Knowledge

Zoning and educational inequality.  The geography of American cities both typifies and reinforces economic inequality.  Higher and lower income people live apart from one another in most metropolitan areas, and the balkanization of locally provided services, especially education, amplifies these inequalities.  People who live in low income communities get worse public services, and have fewer opportunities to get the education or jobs that would allow them to advance.

A new paper from Richard Kahlenberg draws a direct line between exclusionary single-family zoning and the geography of inequality.  Requiring large-lot, single-family housing, and mostly (or entirely) banning apartments, in many suburban jurisdictions both precludes many lower- and moderate-income families from living in such places, and enables the residents of those places to enjoy a higher level of public services, especially schools, while restricting the access of lower-income households to amenities, jobs and education.

Kahlenberg looks closely at Westchester County, a suburb of New York City, and compares the zoning, schools and demographics of two of its cities, Scarsdale and Port Chester.   Scarsdale is wealthier, whiter and more expensive; Port Chester is poorer, less white and less expensive.  Scarsdale has little multi-family land, and large minimum lot sizes, as much as two acres, even near commuter rail stations.  Little wonder that fewer than 10 percent of Scarsdale residents are renters, while a majority of Port Chester residents rent.  Statistically, the contrast is striking:

It’s little surprise, then, that places like Scarsdale are bastions of NIMBY opposition, both to denser housing and to statewide efforts to reform zoning, like Governor Kathy Hochul’s recently defeated housing plan.  While their arguments are cloaked in an ostensible desire to protect unique local character, it’s clear that zoning serves as a tool of exclusion and opportunity hoarding.

As xxxx points out, Brown v. Board of Education outlawed the use of race as a basis for segregation, but since then local policies have enshrined economic status in its place.

In the news

The Oregonian published an Op-Ed—”Oregon’s funding plan for the I-5 bridge is a generational crime”—by City Observatory Director Joe Cortright, about the state’s plan to subsidize the I-5 Bridge Replacement Project with $1 billion of general obligation bonds.

KGW-TV featured City Observatory Director Joe Cortright in its reporting about the reasons for the $6 billion cost of the Interstate Bridge Replacement project (really a 12-lane wide, 5-mile long freeway that involves rebuilding seven different interchanges).

The Week Observed, August 11, 2023

Must Read

Some Texas-sized greenwashing for highway widening in Austin.  TXDOT is aiming to spend close to $5 billion to widen I-35 through downtown Austin, and to sweeten the deal, they’re producing project renderings showing lengthy caps over portions of the widened freeway.  One hitch though:  while TXDOT will pay to build the highway, it isn’t going to contribute a dime to building the caps, which are expected to cost a cool billion.

Somebody else will have to come up with the money: but that doesn’t stop TXDOT from shamelessly greenwashing the project with illustrations that show great swaths of green and a barely legible notation that “surface level enhancements to be provided by others.”

Rhetorical blame-shifting and denial in the face of automobile carnage.  Highway deaths, which have been rising for the past several years, are spiking this summer, with a number of particularly painful losses.  A driver killed a seventeen-year-old aspiring professional cyclist; the sport’s national governing body described his death as a “training accident.”  That produced a visceral reaction from many, including Bella Chu, who has some practical advice for spotting the blame-shifting and denial built into standard auto-centric accounts of this ongoing epidemic of traffic violence:

Clip out and save this list and see how many of these misleading messages you can find in press accounts of traffic violence in your city.

The outlook for cities, post-Covid.  Rani Molla has an excellent survey article looking at the future of cities in a post-covid, more work-from-home world.  Molla touches base with a wide range of experts, getting their views on everything from the decline in office employment due to remote work, to the opportunities for cities to emphasize livability and amenities as economic advantages.  Molla writes:

To compete with each other and truly thrive, cities must work on becoming more attractive places to live.  Just because big cities are probably going to be fine doesn’t mean they don’t have a lot of challenges or that they couldn’t be a lot better. They do and they can. That means maintaining what already makes cities good and adding things that could make their constituents’ lives better — that’s something that will keep people there and make others come.

As our friend Mary Rowe points out in the article, cities are dynamic and adaptive:  they’ve dealt with plenty of changes over the centuries, and the best will change and evolve to succeed in this new environment.

New Knowledge

Migration by income after the pandemic.  The Economic Innovation Group has a new analysis, based on county level tax return data showing the difference in incomes of in-migrants and out-migrants.

EIG has a terrific nationwide map that easily lets you drill down to county level data and see changes in income levels, as well as changes in the number of tax returns filed in each year. The data compares tax year 2021 to tax year 2020.  On this map, blue areas saw higher incomes for in-migrants than out-migrants; orange areas had higher incomes for out-migrants than in-migrants.

 

While the report emphasizes that these are “post-pandemic” changes, there’s a missing baseline here:  Specifically, what were the patterns of migration prior to the pandemic, and how did they change?  Much of what we observe in intra-metropolitan and inter-state moves is a well-established historical pattern:  Higher income households (often older people) move from urban centers to their suburbs, while central cities tend to attract younger (and lower income) households just starting out.  To some extent, this is influenced by marriage:  Two single people moving into a city file separate tax returns; a married couple leaving the city files a joint return with a higher income.  Similarly, many patterns of interstate migration reflect long established trends (like high income movers leaving New York for Florida, or migrating out of San Francisco).  The real question is whether the pandemic accelerated these trends from their historic pattern, and whether this is a temporary blip or a sustained change.  As tantalizing as these data are, they don’t answer this question.

Conor O’Brien, “Tax Data Reveals Large Flight of High Earners from Major Cities During the Pandemic,” Economic Innovation Group, August 8, 2023

In the news

The Washington Post cited City Observatory’s study of the persistence and spread of concentrated poverty in their article “Income Ladder is Difficult to Climb in US Metros.”

The Week Observed, August 4, 2023

What City Observatory did this week

Eating local:  Why independent, local restaurants are a key indicator of city vitality.  Jane Jacobs noted decades ago that “The greatest asset a city can have is something that is different from every other place.”  While much of our food scene is dominated by national chains, some cities have many, many more locally owned independent restaurants than the norm.  And independent restaurants get overwhelmingly higher ratings than chains.  We’ve used Yelp data to estimate the share of independent, local restaurants in large US metro areas, and come up with this ranking.

Unsurprisingly, New York and San Francisco have the highest fraction of local independent businesses.  You can scan this list to see which cities have a strong local food scene, and which are mostly driven by the national chains.

Must Read

Are we against traffic congestion or more traffic?  There’s a provocative essay on Substack that questions some basic assumptions about transportation advocacy.  The author argues that while it’s tempting for transit and active transportation advocates to make common cause with other road users over the supposed scourge of traffic congestion, it’s ultimately self-defeating.

For those who are concerned about reducing impacts on climate and improving public health in the US, we need to understand that more traffic is bad and congestion is not our problem, and is really our ally. This is sensible only if we distinguish traffic from traffic congestion. If we care about climate and public health, we need to reduce traffic. Traffic congestion might frustrate drivers enough to consider other options.

The underlying problem is our old friend induced travel:  steps we take to reduce congestion and improve car travel times inevitably lead to more and longer car trips.  And it’s the volume of traffic, not its sometimes slow speed, that is the source of the real negative externalities, including crashes and pollution.

States siphoned millions of dollars in climate money into road building projects.  One of the headline features of the Bipartisan Infrastructure Law was a modest allocation of transportation funds to reduce greenhouse gases and mitigate effects of climate change.  But the Washington Post reports that thanks to the “flexibility” in federal funding, state highway departments have reallocated millions of this money to other projects, including highway expansions and road-building.

A legal provision predating the infrastructure law allows states to shift up to half of their federal transportation funds among several programs — a provision that also applies to transportation money from the new law. Kevin DeGood, director of the infrastructure program at the left-leaning Center for American Progress, said Congress clearly intended for money to be allocated to projects that would reduce emissions or protect against extreme weather.

“It’s an absolute failure that this is allowed to happen,” he said.

Researchers have also found that unclogging roads tends to induce people to drive more, spurring more emissions. Alex Bigazzi, a civil engineering professor at the University of British Columbia who has studied the effects of congestion-reducing technology, said it’s not the most efficient way to cut emissions.

“It’s a stretch to say this is a carbon-reduction strategy,” he said. “Traffic volumes are really the major driver of emissions from transportation systems.”

The reallocation of these funds is part two of an “Empire Strikes Back” effort by the highway lobby. Part one was getting the Federal Highway Administration to roll back administrative guidance directing states to fix existing highways before building new ones.

New Knowledge

Working remotely lowers productivity.  The Covid pandemic produced a sea-change in the adoption of work-from-home, and workers, businesses (and housing and office markets) are still digesting the implications.  One of the big and largely unsettled questions is whether people working at home are as productive as those working in office settings.

As a practical matter, it’s difficult to accurately measure worker productivity, and there are many confounding factors that make it hard to reach reliable conclusions.  One key issue is selection bias:  it may be that those who are more productive working at home tend to self-select for these opportunities, so while some workers are as productive, or more productive, remotely, that isn’t necessarily true on average, or for everyone.

A new study from India uses a sophisticated random assignment design to overcome this selection bias issue.  It studied workers handling a series of easily measured, routine functions which are done similarly in office and remote work settings.

The key finding:  On average, those working remotely were 18 percent less productive than their in-office peers.  About two-thirds of this difference in productivity was present from the outset of assignment; the rest emerged over time, attributed to better/faster learning of skills in an office environment.

While in-office workers were more productive overall, the study noted that selection effects tended to worsen the home/office productivity gap:  workers who preferred to work at home tended to be less productive at home than those who preferred office locations.
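The decomposition described above can be sketched numerically. This is a back-of-the-envelope illustration using the rounded figures summarized here (an 18 percent gap, two-thirds of it present at the outset), not the paper’s own estimates:

```python
# Back-of-the-envelope split of the reported remote-work productivity gap.
# Figures are the rounded numbers summarized above, not the paper's estimates.
total_gap = 0.18          # remote workers ~18% less productive overall
immediate_share = 2 / 3   # ~two-thirds of the gap present from the outset

immediate_gap = total_gap * immediate_share        # gap visible at assignment
learning_gap = total_gap * (1 - immediate_share)   # gap from slower skill learning
print(f"Present at outset: ~{immediate_gap:.0%}; from slower learning: ~{learning_gap:.0%}")
```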

David Atkin, Antoinette Schoar & Sumit Shinde, Working from Home, Worker Sorting and Development, NBER Working Paper 31515, July 2023, DOI 10.3386/w31515.

In the news

City Observatory Director Joe Cortright was named one of Planetizen’s 100 most influential contemporary urbanists, clocking in at #56!

The Week Observed, July 28, 2023

What City Observatory did this week

Myth-busting:  Idling and greenhouse gas emissions.  Highway boosters are fond of claiming that they can help fight climate change by widening highways so that cars don’t have to spend so much time idling. It’s a comforting illusion to think that helping you drive faster is the solution to climate change.  But as careful research shows, widening roadways to reduce congestion simply backfires, and leads to more driving, and more emissions.

This particular myth was busted by research from Alex Bigazzi and Miguel Figliozzi of Portland State University. They showed that induced demand means that faster roadways generate more vehicle miles of travel (and more carbon emissions), and that metro areas with higher travel speeds and less vehicle delay have higher rates of greenhouse gas emissions than other metro areas.

Must Read

Krugman on congestion pricing: an act of vehicular NIMBYism.  Economist Paul Krugman weighs in on New Jersey’s lawsuit against congestion pricing in Manhattan.  Congestion pricing makes sense on both efficiency and equity grounds.  As Krugman points out, each additional auto trip to Manhattan likely imposes costs of $100 (or more) on travelers and residents.  The proposed fee will likely reduce traffic and lower the burden of those costs.  But some auto commuters, especially from New Jersey, will pay more.  That’s where Krugman points out the equity issue:

What’s really striking is how few people stand to benefit from New Jersey’s attempt to block or delay congestion charges. Fewer than 60,000 New Jersey residents, out of a state labor force of almost five million, commute into New York City by car. They are also, as it happens, relatively affluent, with a median annual income of more than $100,000, relatively well able to handle the extra cost. For this, New Jersey is trying to sabotage crucial policy in a neighboring state?

New Jersey politicians are sticking up for a relative handful of car commuters, who are outnumbered by the several hundred thousand New Jersey residents who commute to New York via transit.  Those transit riders stand to gain handsomely from much-improved travel times when pricing unclogs streets and roadways.

Why congestion pricing is good for cities.  At Planetizen, Michael Lewyn takes on claims made by columnist and economist Tyler Cowen that Manhattan’s proposed congestion pricing would somehow be bad for the city’s economy and health.  Cowen reasons that the congestion charge will function as a barrier to entry to the city, discouraging people from traveling there and reducing activity, spending, and jobs.  But as Lewyn argues, those cities that catered most to the car were the ones most devastated by disinvestment and sprawl; places like New York that avoided car dependence have seen less decline, and more resurgence.

The argument rests on an assumption that a city is most appealing when automobile commuting is easy. But this assumption seems inconsistent with historical reality. . . . a “drivable” city is not necessarily a prosperous one. So if making cities like Manhattan easier to drive through doesn’t make them more appealing, it logically follows that making them harder to drive through won’t make them less appealing.

While the congestion charge may make the city less attractive to cars, it is likely to make it more attractive to people, and easier to access and travel through for the people who value being there.  Ironically, Cowen’s argument mimics the early and now long-forgotten arguments against parking meters in cities.  Charging people for urban space causes them to use it more efficiently, and tends to allocate it to users who value it highly (and who contribute greatly to the local economy).

New Knowledge

Traffic calming and traffic evaporation.  One of the chief concerns with road diets, low traffic neighborhoods and other systems of bike/pedestrian improvements is that such measures will divert and concentrate traffic on other nearby streets. A new literature review from Germany summarizes the results of evaluations of such projects in several European cities and concludes that, contrary to these oft-raised concerns, traffic largely evaporates, rather than being displaced.

The authors conclude:

Overall, the empirical findings from Germany and abroad invalidate the fears that consistent traffic calming measures will merely shift the problem to the adjacent road network. On the contrary: Almost all surveys confirm the phenomenon of “traffic evaporation”– the volume of traffic does not just flow away one-to-one like a liquid elsewhere, but is reduced as a whole.

The authors report that studies show traffic tends to decline by between 15 and 28 percent in extensive traffic calming projects, by between 25 and 69 percent within entire inner cities, and by between 4 and 52 percent in the areas surrounding individual redesigned streets.

The effect can be explained by a change in traffic behavior: the more attractive footpaths and cycle paths are, the more people use them.  Although the measurements certainly show displacement effects in adjoining streets, these are usually moderate.

Uta Bauer, Sonja Bettge & Thomas Stein, Traffic calming: discharge instead of collapse! Measures and their effects in German and European cities. German Institute for Urban Studies (Difu), Berlin, 2023.  (Text in German). (https://www.sciencedirect.com/science/article/abs/pii/S0965856421000872#!). (Quotations translated from German via Google Translate and abridged)

The Week Observed, July 21, 2023

What City Observatory did this week

Few highway construction dollars for Black-owned firms in Oregon.  The Oregon Department of Transportation (ODOT) is falling short of its own goals of contracting with disadvantaged business enterprises. One-tenth of one percent of construction contracts for the I-205 Abernethy Bridge, ODOT’s largest current project, went to Black construction firms.

ODOT professed a strong interest in helping Black contractors as a selling point for the I-5 Rose Quarter project, but instead advanced the I-205 Abernethy Bridge project, which has provided very few opportunities for Black-owned firms.  Apparently, ODOT is only interested in helping Black-owned businesses if it helps the agency sell a project in Northeast Portland (home to Portland’s historically Black Albina neighborhood).

Must Read

How the mighty have fallen.  Bloomberg Businessweek looks back at the history of urban economic success (and decline) in a data-fueled column.  The columnist’s thesis is that despite its current challenges, San Francisco is unlikely to become an economic basket case.  The case is buttressed by a look back at the nation’s highest-income metropolitan areas from 1949.  It’s a fascinating list:  Some are still at the top; but others have fallen.  Surprisingly, from today’s perspective, the richest-metro list is populated mostly by industrial powerhouses that have struggled in the intervening seven decades.  Detroit was number one, and no fewer than six Ohio metros ranked in the top fifteen.

Nearly all of these older industrial centers have fallen out of the top rank of US metros.  As Fox points out, their places have been taken chiefly by emerging tech centers in other parts of the nation.  The cities that have consistently been in the top rank include New York, Chicago, San Francisco and Washington DC.  These large, diverse, knowledge-driven urban economies are the ones that have adapted best to economic transition, and are likely to do so in the future.

A new look at the destruction of Tulsa’s Black Wall Street.  The folks at Segregation by Design have a powerful new animation showing the demolition of Tulsa’s Greenwood neighborhood.  The neighborhood, often called “Black Wall Street,” was famously destroyed in the Tulsa Massacre of 1921, but was then rebuilt and actually reached its zenith years later, only to receive its final coup de grace courtesy of the Oklahoma highway department, which plowed two interstate freeways through the heart of the neighborhood.

Via twitter, Segregation by Design has a short animation showing how highway construction obliterated hundreds of homes and businesses, producing scars that have yet to heal.

New Knowledge

Low wage workers have seen significant wage gains since the pandemic.  Labor economist Arin Dube tweeted the most recent data on wage changes since the pandemic.  Strikingly, the gains in wages have been proportionately higher for front-line workers (the production and non-supervisory workers who make up 80 percent of the private sector workforce), compared to other workers.

Wages for both categories of workers surged during the early days of the pandemic (due primarily to a composition effect: part-time and low-wage workers had fewer hours of work during the pandemic).  As the dashed line on the chart shows, the latest data show that wages for front-line workers are significantly outperforming wages for all workers, and that they are about back on their pre-pandemic trend.  Wages for all workers are rising more slowly and are still below trend.

The data are evidence of “wage compression”–i.e. a falling difference in wages between the highest paid and lowest paid workers.  The continued tight labor markets that have persisted since the pandemic appear to be bolstering the wages of the lowest paid workers.


The Week Observed, July 14, 2023

What City Observatory did this week

We have an in-depth series of reports on the Oregon Department of Transportation’s imploding I-5 Rose Quarter freeway widening project.

The cost of the I-5 Rose Quarter project has now quadrupled to $1.9 billion—it was a mere $450 million when it was sold to the Legislature in 2017.

ODOT hasn’t admitted its Rose Quarter project is actually dead, even though it has a $1.7 billion funding shortfall.  Instead, the agency is playing a game called “Extend and pretend“—spending $40 to $60 million over the next two years to advance design work on the main part of the project to 30 percent completion, in hopes that someone will magically provide the needed funding.

Despite the project’s ballooning price tag, ODOT officials claim it’s too late to do anything to reduce the size, scope or cost of the project, claiming they’ve reached a “pens down” moment.  That’s factually untrue:  the agency is still in the throes of designing the project, and just released yet another previously unseen design concept, including a new flyover ramp, last month.  Nothing in federal regulations precludes “right-sizing” the project even after funding and initial environmental approvals are received:  that’s exactly what FHWA allowed with Cincinnati’s Brent Spence Bridge, which was just cut by about 40 percent in size and cost.  Finally, federal regulations prohibit environmental approvals for projects that don’t have “reasonably available funding,” and the Rose Quarter is nowhere close to that standard, by ODOT’s own admission.

The apparent collapse of the Rose Quarter project has angered many in Portland’s Black community, who were promised the project would somehow help restore the neighborhood devastated by the original freeway construction, as well as provide a bonanza of jobs and contracting opportunities for the community.  Instead, ODOT took $450 million originally earmarked for the Rose Quarter project and diverted it to pay for the I-205 Abernethy Bridge, in suburban (and vastly whiter) West Linn, Oregon.  ODOT has even signed contracts and started construction on this project, and its cost has now increased to $622 million.

Must Read

Freeways and the damage done:  don’t just acknowledge, change.  Transportation for America has a powerful new report—Divided by Design—on the damage that freeways have done to the nation’s cities, and the policy changes that need to be made to repair that harm and prevent further destruction.  Case studies of Washington and Atlanta show how urban freeways have decimated neighborhoods, displaced people–especially people of color–and harmed local budgets.  Lately, it’s become fashionable for highway agencies to publicly acknowledge this harm; but most of that acknowledgement is purely rhetorical and performative, because the same agencies, including the US DOT, are doubling down on their historic mistakes by pumping more billions into the same failed policies and projects.  The Transportation for America report decries this duplicity:

 [Federal and state leaders] intent to do things differently or better than their publicly racist forebears in the 1950s and 1960s is irrelevant when many of those previous practices are still deeply embedded in the transportation policies and standards of today. Today’s leaders must understand how the past is still shaping current practices. They must reevaluate how their decisions are made and who their decisions serve.  Congress and the federal government are right in part: it is time to fix the harms of our transportation system, but creating tiny new programs will fail to address the damage.

Transportation for America has a series of specific, substantive recommendations for addressing and correcting the inequities woven into the current structure of transportation policy in the US.

Utah’s smart road usage charge experiment.  Kea Wilson at Streetsblog reports on Utah’s application of the latest technology to road use charging.  It’s increasingly apparent that the gas tax is in terminal decline as a way to pay for roads, and state and federal governments are scrambling to come up with an alternative.  The choice is likely to come down to some kind of VMT (vehicle miles traveled) fee, but the big policy question is whether it will be a smart fee or a dumb one.  The simple-minded approach is to charge a flat fee per mile, but not all vehicles and not all miles are created equal.  Providing capacity for travel at peak hours is much more expensive, as is accommodating heavier, more dangerous and more highly polluting vehicles.  Utah’s monitoring system, provided by technology startup ClearRoad, allows road user fees to vary by vehicle, by time of day, and by road traveled. Utah is testing

a new GPS-equipped dongle that will not just monitor how much motorists drive, but exactly where they do it, down the specific lane on the highway they choose. Then, communities can consolidate that data into a single road usage charge that can be automatically adjusted based on their unique priorities — think extra tolls for using the express lane, or graduated fees for low-income motorists who truly have to drive — and automatically distributed to the agencies that manage the exact roadways those drivers’ actually traveled.

We get one chance a century to change the way we pay for transportation:  we need to get this right.  Smart technology can enable a smart choice.
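
The difference between a flat per-mile fee and a smart one is easy to see in a toy calculation. Here is a minimal sketch; the rates, vehicle classes and peak windows below are invented for illustration, and are not drawn from Utah’s or ClearRoad’s actual fee schedule:

```python
from dataclasses import dataclass

# Illustrative rates only -- not Utah's actual fee schedule.
BASE_RATE = 0.015          # dollars per mile, light vehicle, off-peak
WEIGHT_MULTIPLIER = {"light": 1.0, "suv": 1.3, "heavy_truck": 2.5}
PEAK_MULTIPLIER = 1.5      # surcharge for travel during congested hours
PEAK_HOURS = set(range(7, 10)) | set(range(16, 19))   # 7-9am, 4-6pm

@dataclass
class Trip:
    miles: float
    hour: int          # hour of day the trip started, 0-23
    vehicle: str       # key into WEIGHT_MULTIPLIER

def road_usage_charge(trip: Trip) -> float:
    """Per-trip fee that varies by vehicle class and time of day."""
    rate = BASE_RATE * WEIGHT_MULTIPLIER[trip.vehicle]
    if trip.hour in PEAK_HOURS:
        rate *= PEAK_MULTIPLIER
    return round(trip.miles * rate, 2)

# A flat fee would charge these two 10-mile trips identically.
off_peak = Trip(miles=10, hour=13, vehicle="light")
rush_hour_suv = Trip(miles=10, hour=8, vehicle="suv")
print(road_usage_charge(off_peak))       # 0.15
print(road_usage_charge(rush_hour_suv))  # 0.29
```

A flat VMT fee bills both trips the same; the variable fee nearly doubles the charge for the heavier vehicle driven at rush hour, which is exactly the kind of price signal a “smart” charge can send.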

In the news

City Observatory Director Joe Cortright was named one of the 100 Most Influential Urbanists, Past and Present in a poll conducted by Planetizen.

 

The Week Observed, July 7, 2023

What City Observatory did this week

Yet another exploding whale:  One of the Internet’s most popular videos shows employees of the Oregon Department of Transportation blowing up a dead whale carcass stranded on an ocean beach, with bystanders running in terror from a rain of blubber.  ODOT’s latest fiasco is the exploding price tag of its 1.5 mile I-5 Rose Quarter freeway project.  This week the agency announced the price of the freeway widening project in Northeast Portland had increased yet again, to as much as $1.9 billion, more than four times the price quoted to the Oregon Legislature when it voted to approve $450 million in funding six years ago.

In addition to a ballooning price tag, ODOT spent the money the Legislature originally appropriated for the project on another freeway (in Portland’s suburbs) and is, according to OTC Member Sharon Smith “out of money.”  It looks like the Rose Quarter is a dead freeway walking.

Must Read

Portland’s “Missing Middle” reforms start bearing fruit.  Just a year ago, the Portland City Council approved the city’s “Residential Infill Policy” to allow a greater variety of housing, including multi-plex units, to be built in the city’s previously exclusively single-family zones.  It’s still early days for the policy, but an interim report from the Portland Planning Bureau shows more housing is in the pipeline, on its way to getting built.

The city reports that in its first year, the new policy produced permits for 271 units of middle housing (duplexes, triplexes, fourplexes and sixplexes) in what were formerly single-family zones. Three-quarters of these were in fourplexes.  And over the past year, permits for “middle housing” in these residential zones well outpaced permits for conventional single-family homes (just 78 were permitted).  This appears to represent strong demand for additional density.

Chart from the RIP Year-One report showing permit activity by housing type in the single-dwelling zones.

Overall, the policy resulted in approving the construction of about 3.4 units on each lot.  Middle housing provides considerably more housing per lot than the city’s permitting of accessory dwelling units (ADUs): the 126 houses and accessory dwelling units permitted together were built on 102 lots (about 1.2 units per lot).
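
As a quick check on the report’s per-lot arithmetic, the figures can be reproduced directly. In the sketch below, the middle-housing lot count is inferred from the reported average of about 3.4 units per lot; the report’s published figures are the unit and lot totals, not this inference:

```python
# Figures from the RIP year-one report summary above.
middle_housing_units = 271        # duplex-through-sixplex units permitted
reported_units_per_lot = 3.4

# Inferred, not stated in the report: implied number of middle-housing lots.
implied_lots = round(middle_housing_units / reported_units_per_lot)
print(implied_lots)  # 80

# ADU comparison: 126 houses and ADUs permitted together on 102 lots.
adu_units, adu_lots = 126, 102
print(round(adu_units / adu_lots, 1))  # 1.2
```

The roughly threefold gap between 3.4 and 1.2 units per lot is the report’s point: multi-plexes add housing on scarce residential land far faster than house-plus-ADU combinations do.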

Americans prefer walkable neighborhoods and will pay a premium for them.  A new survey from the National Association of Realtors shows that Americans are looking for places that are walkable and have an abundance of amenities and destinations close by.  And importantly, most signal that they are willing to pay more to live in these walkable places.

The realtor group’s findings confirm a growing body of research, including our own City Observatory analysis, showing that houses with higher walk scores  command higher prices than other, less walkable houses, even after controlling for a wide array of characteristics that influence home values.  The premium for walkability is an indication that we haven’t built enough walkable neighborhoods in the US, and that we haven’t built enough housing in those places with high levels of walkability.

New Knowledge

Accessibility:  Proximity to Activity Centers.  Adie Tomer and Caroline George of the Brookings Institution’s Metro program have a new research report looking at the connection between activity centers and travel in the nation’s large metropolitan areas.

The report studies how distance to activity centers (concentrations of workplaces, stores, institutions and other common destinations) influences the travel behavior of households.  The study relies on detailed mapping of large US metro areas, and anonymized data from mobile devices to measure travel patterns.  Here’s a map showing person miles of travel for the Portland metropolitan area.

The key takeaway is that living in a more accessible location, in this case defined as being within three miles of at least five defined activity centers, produces a significant reduction in daily travel.  In Portland, for example, households living within three miles of five or more activity centers travel about a third less than households living seven miles or more from five activity centers.  As Brookings points out, less driving translates into lower travel expense for households–and reduced carbon emissions.

The Brookings report is a helpful reminder that land use patterns–particularly accessibility to a rich set of destinations–is a key to reducing automobile dependence.  Essentially what we have to do is reverse the pattern of the past several decades, where people and activities became more dispersed, leading to greater driving, travel cost and pollution.

Metropolitan America has now spent nearly a century undercutting the importance of proximity. Urban land area grew 1.4 times faster than the population between 1960 and 2020, leading to drops in overall population density, significant increases in roadway lane miles, and longer distances between where people live and where they need to go. The result is a multi-decade expansion of the average mileage each person travels per day, or what’s known as PMT: from 19.5 miles in 1969 to 40 miles in 2017. Covering those kinds of distances overwhelmingly favors car usage, so it is little surprise that 92% of American households now have access—often by necessity—to a private vehicle.

Adie Tomer & Caroline George, Building for proximity: The role of activity centers in reducing total miles traveled, Brookings Institution, June 29, 2023.


Note:  City Observatory reviewed and commented on a prepublication draft of this report; City Observatory was not compensated for its review.

In the news

Axios Portland quoted City Observatory Director Joe Cortright in its story on Oregon lawmakers voting funding for the Interstate Bridge Replacement Project.

The Week Observed, June 23, 2023

What City Observatory did this week

We took the week off to celebrate the Summer Solstice and gorge on Hood strawberries!

We’ll be back next week.

Must Read

The amazing non-appearance of Carmageddon.  Echoing the point we made at City Observatory in the days after the I-95 freeway closure in Philadelphia (“Carmageddon does a no-show in Philly”), Aaron Gordon of Vice points out that traffic quickly adapted to the decline in freeway capacity.  Contrary to folk beliefs (cherished and nurtured by highway engineers), there was no chaos, gridlock or mayhem.  Commuters, through traffic, and freight movement quickly adjusted travel patterns and times, and life proceeded just as before.

Gordon cites detailed analyses by traffic monitoring firms confirming what our early look at Google traffic maps showed:  Travel times and delays post-shutdown were almost identical to pre-crash conditions.  As Gordon writes:

Almost two weeks after the collapse, the bridge still isn’t fixed—although it will supposedly re-open this weekend in record time—but it is clear that those dire predictions did not come to pass. Initial surges in traffic in the Philadelphia area eased by the end of the week, according to data collected by HERE Technologies and Inrix, two transportation and mapping companies that use vehicle data to measure traffic flows.

Apocalyptic gridlock did not ensnare the Philadelphia area. Truck traffic did not come to a halt. In fact, more people rode the train or took alternate routes and life marched on. A few key highway junctions are a bit slower than they were two weeks ago and traffic in the immediate area of the collapse remains thick. . . . whether it’s people using public transit, workers working remotely, drivers figuring out better alternate routes, the congestion in the entire region returned to pretty close to normal by the end of this past week.

As we wrote a week ago:  this “traffic evaporation” reflects the rapid and resilient behavioral response to changes in road capacity, a fact denied by the premises of the traffic models that are routinely used to justify expensive and ineffective road widening projects.

Work from Home increased the demand for housing.  At Axios, Felix Salmon makes an important point:  Now that we’re spending less time in commercial office space, people are likely to be demanding more “home office” space.  As Salmon explains:

Take New York City as an example. If you look at square feet per resident and per employee in the Big Apple, the average New York household fits about 2.5 people into about 1,000 square feet. If one of those people has access to an external office, that provides another 150 square feet of space for working in.  When that person works from home, the household is going to feel more cramped than usual unless it expands by about 150 square feet. If the family demands 150 more square feet, that’s a substantial increase in demand, at 15%.

This is likely to have important implications for housing demand in cities.  If an urban household needs room for both its domestic/residential needs, plus additional space for a separate home office, we can expect people to spend more on housing (i.e. shifting their commuter savings to buying more residential space).  This will likely reduce average occupancy–for example, two people renting a three bedroom apartment to keep one room as an office.  This higher housing demand, in turn, would show up as both higher rents and lower total population.
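
Salmon’s back-of-the-envelope numbers are easy to verify. The figures below are his illustrative values for New York (about 1,000 square feet per household, and a 150-square-foot home office), not data of our own:

```python
# Salmon's illustrative New York figures.
household_space_sqft = 1_000   # average ~2.5-person household
home_office_sqft = 150         # space an external office used to provide

# Share of additional space the household needs when one member
# starts working from home full time:
added_demand = home_office_sqft / household_space_sqft
print(f"{added_demand:.0%}")   # 15%
```

A 15 percent increase in per-household space demand, multiplied across the many households with a newly remote worker, is a substantial shift in aggregate housing demand.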

New Knowledge

Unsafe speeds for pedestrians.  New data shows that 2022 was the deadliest year for pedestrians in the US in decades. Speed is a leading factor in pedestrian deaths, and an analysis from Streetlight Data shows a stark contrast in the speeds that pedestrians are exposed to in different cities across the US.

Streetlight has calculated the average speeds on roadways with more than 200 pedestrians per day for each of the nation’s 30 largest cities. The contrast between the best and worst performing cities is dramatic.  In New York, average speeds exceed 35 miles per hour on only 1 percent of these pedestrian streets.  In Phoenix, 65 percent of these pedestrian-frequented streets had average speeds of 35 miles per hour or more.

In all, nine cities have less than 10 percent of their pedestrian-frequented roads with average speeds of 35 miles per hour or more, while sixteen of the top 30 have a third or more of all such roads with average speeds over 35 miles per hour.  In three cities (Phoenix, Jacksonville and Las Vegas), half or more of pedestrian-frequented streets have average speeds of 35 miles per hour or more.

The data are a clear reflection of urbanity and street design.  Cities with dense vibrant centers (New York, Washington, Boston, San Francisco, Chicago) have relatively few pedestrians exposed to the most dangerous streets.  In contrast, the sprawling cities of the sunbelt, where many streets are wide, multi-lane arterials, expose more pedestrians to dangerous speeding traffic.

There’s also a strong correlation between cities that score well on Streetlight’s “Safe Speed” Index, and the Pedestrian Death Rate, tabulated by Transportation for America.

Cities that optimize their land use and transportation networks to speed automobiles create the most dangerous conditions for pedestrians, cyclists and other vulnerable road users.

In the news

Thanks to Streets.MN for pointing their readers to our commentary on the amazing non-appearance of Carmageddon in Philadelphia.

The Week Observed, June 16, 2023

What City Observatory did this week

Carmageddon does a no-show in Philly.  A tanker truck caught fire and the ensuing blaze caused a section of I-95 in Philadelphia to collapse.  This key roadway may be out of commission for months, and predictably, this led to predictions of “commuter chaos.”  But on Monday morning, traffic in Philadelphia was surprisingly . . . normal.

It’s yet another instance of traffic evaporation–when roadway capacity is suddenly reduced (by a disaster or construction project), travelers quickly adapt to the newly constrained roadway system.  This evaporation is the mirror image of “induced demand”–the tendency of newly widened roadways to fill up and become just as traffic clogged as before.  The lesson of these false carmageddons is that travel is much more dynamic and flexible than we imagine.

Must Read

Block that metaphor:  Stop saying micromobility.  In an incisive essay for Streetsblog, Sarah Risser questions our use of the term “micromobility” to describe bikes, e-bikes, scooters and other similarly sized modes of transport. The use of the diminutive effectively diminishes the seriousness of these modes of transport, and reinforces the notion that there are “regular” or “normal” modes of transport (cars, trucks, SUVs), and that smaller, safer, greener ways of getting around are just toys, or marginal and substandard modes.

Right-sized or human scale: Not “Micro”.

This kind of subtle bias pervades much transportation policy debate (think of calling all crashes “accidents,” or of the term “jay-walking”).  As Risser says:

SUVs, pick-up trucks, and passenger cars should not be the benchmark by which we judge the size of other forms of transit, and the term ‘micromobility’ encourages us to believe that they are. Instead, we should intentionally drop the preface “micro” in micromobility and start referring to bikes, scooters, and the human body simply as “mobility,” And we should also add an appropriately descriptive prefixes to ever-larger cars, SUVs and pick-up trucks: Maybe we could call them “oversized-mobility;” “space-hogging-mobility”; or better yet, “deadly-mobility.”

The words we use matter:  We shouldn’t inadvertently embrace and reinforce automobile dominance by using a term that marginalizes sustainable, human-scaled transport.

Will Minnesota live up to the promises of its new transport legislation? There’s some good news out of Minnesota, where the legislature adopted language injecting some sound climate protection policies into the state’s transportation investment laws.  Writing at StreetsMN, Alex Burns takes a closer look at the fine print and points out that, while there’s progress, there’s a lot more work to do here.  Although the law requires an emissions analysis for new highway expansion projects, the bill essentially grandfathers in everything that’s already in an adopted highway plan, which is bad enough, and could get worse if highway engineers amend those plans before the new law takes effect in 2025. Burns writes:

. . . the bill did add some important protections against highway expansion but they do not go far enough [it] requires MnDOT to assess the impacts of a project on state climate and vehicle miles traveled (VMT) goals and perform mitigation measures only if the project adds travel lanes. A project that reconstructs a freeway with the same number of lanes is exempt, even though it will likely mean 60-plus more years of greenhouse gas emissions (not to mention air pollution, noise and human carnage) over that project’s lifespan.

More to the point, the legislation still provides the bulk of state funding for roads and highways:  $7 billion of the $9 billion appropriated for transportation primarily or exclusively serves car and truck traffic.  As Brent Toderian frequently says:  show me your budget and I’ll show you your priorities.

Downtown offices may be struggling, but dense, mixed-use neighborhoods are flourishing.  Tracy Hadden Loh of the Brookings Institution’s Metro program has a thoughtful op-ed in the Los Angeles Times pushing back against the anti-urban “doom-loop” messaging that seems to dominate the media.  Citing Brookings’ analysis of Census data, she notes that residential neighborhoods in cities are seeing renewed growth.

people are enjoying walkable, mixed-use neighborhoods where they can both live and work, in contrast to the 20th century mode of cities and suburbs that rigidly separates work zones from other activities.

Pointing to a series of Los Angeles neighborhoods that are flourishing, Loh says

Why are some neighborhoods doing extraordinarily well? These are not the richest parts of L.A. Rather, they gather big, diverse collections of economic, social, physical, and civic assets in close proximity

The common narrative about cities implies that the only reason people live in or near cities is to be close to places of work.  The resilience of urban neighborhoods that offer a wide range of social, cultural, economic and consumption opportunities is evidence that cities are about much more than access to jobs.

In the news

StreetsblogUSA re-published our commentary on the “meh” of Carmageddon in Philadelphia after the collapse of a section of I-95.

The Week Observed, June 9, 2023

What City Observatory did this week

Guest contributor Miriam Pinski observes that getting the prices right could produce dramatic improvements in how US transportation systems perform.  New York is on the verge of implementing congestion pricing, and other US cities are strongly considering similar policies.   Pricing turns out to be the cornerstone of encouraging widespread adoption of non-auto travel modes.

Around the world, the cities that we want to emulate in transportation have done a far better job of reflecting back to road users the social and environmental costs of their decisions.  That’s a key reason why leading cities, like Copenhagen, are so successful at moving away from automobile dependence.

 

Must Read

Worthwhile Canadian perspective:  Electric cars are still cars, with almost all their attendant problems.  Just as in the US, Canada is subsidizing car makers to make electric cars, and subsidizing households to buy them.  The Globe and Mail‘s Eric Reguly points out that notwithstanding the emission reduction benefits of moving away from internal combustion, this policy doesn’t make a lot of sense if we’re trying to build more livable and sustainable communities.

EVs, and hybrid cars to a lesser extent, enjoy a global image that is entirely unjustified. The pitch – buy an EV and save the planet – is just nonsense.  Never mind that EVs are still cars that need to be parked. Their presence will still disfigure cities, pushing politicians and developers to build new parking lots, roads and highways to gratify the endless swarms of drivers.
Subsidizing electric cars is a choice, and in many respects a bad one.  In Canada, the subsidies being given to car makers for battery plants approximate the rough cost of high-speed rail linking Toronto, Montreal and Quebec, an investment that would dramatically change transportation patterns and reinforce urban areas.  But the car-centric view of transportation is allowing none of that.

The Biden Administration’s disappointing record on fixing racist highways.  The Biden Administration has done plenty to acknowledge the damage done by freeway construction in urban neighborhoods, and has talked a good game about rectifying the situation, but at critical junctures its support has wavered.  Writing at Fast Company, Benjamin Schneider relates some of the problems that have arisen.

While the Biden Administration included $1 billion for a “Reconnecting Communities” program in highway funding, this represents just a drop in the bucket compared to the need, and the program is already enormously oversubscribed.  The funding is hardly enough to offset six decades of damage from highway building, and it turns out that some states are using the program to “greenwash” otherwise harmful projects, including a massive freeway expansion in Austin that won a Reconnecting Communities grant to plan for a cap over a small portion of the road.

In the News

Strong Towns republished our commentary lampooning the obsession with technical fixes for transportation safety.

The Week Observed, June 2, 2023

What City Observatory did this week

What computer renderings really show about the Interstate Bridge Replacement Project:  It’s in trouble. The Interstate Bridge Project has released—after years of delay—computer graphic renderings showing possible designs for a new I-5 bridge between Vancouver and Portland.  But what they show is a project in real trouble.  And they also conceal significant flaws, including a likely violation of the National Environmental Policy Act.  Here’s what they really show:

  • IBR is on the verge of junking the “double-decker” design it’s pursued for years.
  • It is reviving a single-deck design that will be 100 feet wider than the “locally preferred alternative” it got approved a year ago.
  • The single-deck design is an admission that critics were right about the IBR design having excessively steep grades.
  • The single-deck design has significant environmental impacts that haven’t been addressed in the current review process; the two states ruled out a single-deck design 15 years ago because it had greater impacts on the river and adjacent property.
  • IBR’s renderings are carefully edited to conceal the true scale of the bridge, and hide impacts on downtown Vancouver and Hayden Island.
  • IBR has blocked public access to the 3D models used to produce these renderings, and refused to produce the “CEVP” document that addressed the problems with the excessive grades caused by the double-deck design.
  • The fact that IBR is totally changing the bridge design shows there’s no obstacle to making major changes to the project at this point.

Must Read

Economic development hinges on the ability to attract talent.  Richard Florida and his colleagues have prepared a new economic analysis for the state of Michigan.  It highlights many of the state’s strengths (its strong intellectual capacity and established manufacturing base in transportation), but importantly stresses that the state’s long-term economic success hinges on doing a better job of attracting talented workers.  The report–Michigan’s Great Inflection–contains some insightful information of interest to all states.

In particular, the report highlights which states are–and aren’t–doing a good job of attracting talented young workers.  And, although Michigan does a reasonably good job of hanging on to its own youth, it does very poorly in attracting young, well-educated workers from elsewhere.

The study acknowledges the importance of bolstering current industries and institutions, but emphasizes growing, retaining and attracting talent:

To ensure the long-run prosperity of its industries, communities, and people, Michigan must focus its economic development strategy on bolstering and aligning the capabilities of its leading corporations, universities, and startups in critical transformational technologies. As importantly, if not more so, the state must enhance its strategies for generating, retaining, and attracting the talent required to compete in this new economic environment.

A newspaper’s mea culpa for supporting a racist freeway project.  The Portland Oregonian has a fascinating and detailed retrospective on how its coverage of freeway construction and urban renewal in the 1950s and 1960s reinforced and amplified the destruction of the city’s segregated Black Albina neighborhood.

The narrative relates how the newspaper paid scant attention to the dislocation of families and businesses by the construction of Portland’s Memorial Coliseum, the expansion of Emmanuel Hospital, and three major state highway construction projects, which collectively led to the demolition of hundreds of homes, and prolonged population and economic decline in Albina.

The state highway department, now the Oregon Department of Transportation, purchased it and other homes along the route and then offered them at auction, allowing bidders to salvage sinks, furnaces and fixtures before tearing down the houses.  The Oregonian covered one auction in 1961, noting that the state highway department sold 175 condemned homes for between $50 and $155 apiece. “Even the shrubbery is included,” The Oregonian wrote.  Former residents’ identities were not.

Ironically, even as this coverage appears in the local newspaper, the Oregon Department of Transportation is proposing to spend $1.45 billion to widen Interstate 5 through the Albina neighborhood, further increasing traffic and pollution.  ODOT’s plan calls for widened overpasses it describes as freeway covers, but beyond building the overpasses, ODOT is proposing to spend nothing to restore the damage done to the neighborhood, much less replace the hundreds of houses it demolished.

New Knowledge

The big compression:  Why a tight labor market is a powerful force for equity.  The big economic lament for most of the past half century has been widening wage and income inequality.  But a funny thing has happened in the past few years:  The big gains in income have come at the low end of the labor market.  Wage increases have been highest for those in the lowest wage jobs.

A new paper from David Autor, Arindrajit Dube and Annie McGrew quantifies this sea change in labor markets.

Wages went up for all workers following the onset of the pandemic, but went up most, and stayed higher, for the lowest wage workers.  Another key development in the market was increased job changing among lower wage workers.  Low unemployment rates prompted firms to bid up wages and gave workers the opportunity to move to other, better-paying jobs. As the authors conclude:

Labor market tightness following the height of the Covid-19 pandemic led to an unexpected compression in the US wage distribution that reflects, in part, an increase in labor market competition. . . the pandemic increased the elasticity of labor supply to firms in the low-wage labor market, reducing employer market power and spurring rapid relative wage growth among young non-college workers who disproportionately moved from lower-paying to higher-paying and potentially more-productive jobs

The changes are large:  what’s happened in the past few years has erased about a quarter of the increase in the college wage premium since 1980.  Tight labor markets give employers strong incentives to hire and train workers, and improve their productivity.  And they give workers the opportunity to go somewhere else if they’re not happy or feel they aren’t being adequately rewarded for their work.  To be sure, we see lots of complaints from employers that they can’t find the workers they need, but this is actually an indication that the labor market is working better for workers, and that’s turned out to benefit low wage workers the most.

The Covid-19 pandemic was a tragedy, to be sure.  But its economic aftermath, a very strong fiscal stimulus that produced tight labor markets, has generated substantial economic benefits for low wage workers, whose economic plight had been worsening for decades.

David Autor, Arindrajit Dube and Annie McGrew, The Unexpected Compression: Competition at work in the low wage labor market, Working Paper 31010, National Bureau of Economic Research, http://www.nber.org/papers/w31010

 

 

The Week Observed, May 26, 2023

What City Observatory did this week

Pricing is a better, cheaper fix for congestion at the I-5 Rose Quarter.  The Oregon Department of Transportation is proposing to squander $1.45 billion to widen about a mile and a half of I-5 in Portland—that’s right about $1 billion per mile.  But a new analysis prepared by ODOT shows that pricing I-5 would do a better job of reducing congestion and improving traffic flow than widening the freeway—and would save more than a billion dollars.

ODOT study: Pricing, instead of widening, would save $1.45 billion and reduce congestion more.

ODOT has steadfastly (and falsely) maintained that pricing is “unforeseeable” and has excluded any analysis of the impact of pricing from the project’s Environmental Assessment (EA).  But an ODOT technical memorandum shows that pricing I-5 would significantly reduce traffic congestion and speed traffic flow, without widening the I-5 roadway.  The analysis also flatly contradicts EA claims that widening the highway won’t induce more traffic and also disproves claims made that the project will improve safety.

Must Read

Time for New York to stop giving away valuable street space for free.  Don Shoup, described as the O.G. of parking policy, has pointed advice for the City of New York.  The problem, Shoup notes, is that the city is awash in cars because it literally gives parking away for free.  Most cars sit parked most of the day, or for days on end, and much of the traffic on city streets is cars searching for the elusive open parking space.  All that could end if the city simply charged those who park on city streets for the privilege.  Shoup has worked out that a fee that would keep about 15 percent of parking spaces open at any time would work wonders for the city’s transportation mess:  less traffic, easier parking for those who pay, and plenty of revenue to improve local streets and subsidize mass transit.  He writes:

Demand-based prices for curb parking resemble urban acupuncture: a simple touch at a critical point — in this case, the curb lane — can benefit the whole city. In another medical metaphor, streets resemble a city’s blood vessels, and overcrowded free curb parking resembles plaque on the vessel walls, leading to a stroke. Market prices for curb parking prevent urban plaque.

In New York, a minority of residents own cars, and they tend to have higher incomes than those who don’t own vehicles.  Giving away parking hurts a majority of NYC residents, especially those of limited means.  As Shoup says, it’s time to turn parked cars from freeloaders into paying guests.
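Shoup’s prescription is essentially a feedback loop: watch curb occupancy, and nudge the meter rate until roughly 15 percent of spaces sit open. Here is a minimal sketch of that logic; the demand curve is entirely hypothetical (the six-percent-per-dollar response is invented for illustration, not drawn from Shoup’s work):

```python
# Sketch of a demand-responsive curb pricing rule in the spirit of
# Shoup's proposal: raise the hourly rate when too few spaces are open,
# lower it when too many are. Demand response is hypothetical.

TARGET_OPEN_SHARE = 0.15   # Shoup's ~15 percent of spaces open
STEP = 0.25                # price adjustment per review period, dollars

def open_share(price):
    """Hypothetical demand response: each $1/hr frees about 6% of spaces."""
    return min(1.0, 0.02 + 0.06 * price)

def adjust_price(price):
    """One review-period adjustment toward the occupancy target."""
    if open_share(price) < TARGET_OPEN_SHARE:
        return price + STEP        # curb too full: raise the rate
    elif open_share(price) > TARGET_OPEN_SHARE + 0.05:
        return price - STEP        # curb too empty: lower the rate
    return price                   # within the target band: hold

price = 0.0  # free parking, the status quo Shoup criticizes
for _ in range(20):
    price = adjust_price(price)

print(f"price ${price:.2f}/hr, open share {open_share(price):.0%}")
```

Real-world programs such as San Francisco’s SFpark pilot worked this way, adjusting rates block by block and by time of day; the core idea is the same small periodic price change steering occupancy toward a target.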

False Equivalence:  Bike- and bus-lanes are not the neighborhood destroyers that freeways were.  Los Angeles Metro’s Chief Innovation Officer Seleta Reynolds maintained in a recent interview that failing to get enough local assent for bike lanes or bus priority on urban streets was functionally a repetition of the arrogant destruction of neighborhoods by highway builders in the 1950s and 1960s (and in many places to this very day).  Aaron Gordon of Vice strongly disagrees.

While Reynolds frets that we are repeating the arrogant, top-down approach of the past, Gordon points out that the public involvement process has been subverted, and really represents the interests of older, wealthier car-owning households, creating obstacles to more just forms of transportation.

While humility is a fine trait for a public official to have, it too often crosses over into policy nihilism, which is precisely the wrong lesson to take from the mistakes from the past. The result of this is well-intentioned, dedicated public officials comparing building bike lanes to urban highways. Not all bike or bus lane projects are perfect at their initial conception, but it is, in fact, possible to know if something is good or bad without hearing everyone’s opinion on it.

As Gordon points out, more than a million people were displaced by urban highway construction; there’s no evidence that anyone was displaced by a bike lane or bus lane.  It wasn’t simply the process that was flawed, it was that highways were infinitely more destructive of the urban fabric.  It’s useful to get public input to improve projects, but that shouldn’t dissuade public officials from acknowledging some modes are inherently more just and accessible than others.

Driving less is possible.  The Frontier Group’s Elizabeth Ridlington has a great data analysis looking at state-level trends in vehicle miles traveled per person.  Transportation models (and much public discourse) seem to assume that daily travel is a fixed, irreducible quantity, but in reality, how far we travel (and by what means) can and does change over time.  Critically, it reflects the way we build our communities and how much we subsidize car travel.

The policy takeaway here is that trends in vehicle miles traveled are not unchanging forces of nature.  They are, in fact, amenable to policy changes. Places that encourage more urbanization, and provide more options, tend to have much lower growth in VMT than sprawling, auto-dependent states.

New Knowledge

Missing Middle:  Still MIA.  One of the heartening developments in the housing debate has been increasing support for local and state policies that encourage so-called “Missing Middle” housing:  duplexes, triplexes, four-plexes and other small scale, multi-family housing that fall between the traditional single-family home and larger apartment buildings.

While there is definite progress on the policy front (Minneapolis, Oregon, Washington, and now California have greatly liberalized the kinds of housing that can be built in many urban single-family zones), the needle doesn’t appear to be moving much in terms of national housing supply.

Nationally, builders have been starting between about 2,000 and 4,000 new two- to four-unit structures each quarter since 2010.  These levels of production are well below the average for the previous two decades.

Missing middle housing policies are a promising and positive step in the right direction, but ultimately the problem of housing production is one of scale, and so far missing middle policies have been too small to have much impact.

Robert Dietz, “Multifamily Missing Middle Flat at Start of 2023,” National Association of Home Builders:  Eye on Housing, May 24, 2023.

H/t to: Luca Gattoni-Celli (@TheGattoniCelli)

In the News

Joe Cortright’s analysis of ODOT was cited by the Sisters Nugget:  Issuing debt for major highway expansions could jeopardize re-opening Cascade pass highways each Spring.

 

The Week Observed, May 19, 2023

What City Observatory did this week

Rose Quarter tolls:  Available, but not foreseeable?  There’s a glaring–and illegal–contradiction in the planning for the Oregon Department of Transportation’s $1.45 billion Rose Quarter project.  While ODOT’s financial plan claims that needed funds for the project will come from tolling I-5, the project’s environmental analysis claims that there’s no need to consider the effect of tolling on traffic or the environment, because tolling isn’t “reasonably foreseeable.”

  • Tolls can’t be both “available” and “not reasonably foreseeable.”
  • Tolling would eliminate the need to widen the I-5 freeway.
  • Without tolls, ODOT can’t pay for the $1.45 billion Rose Quarter project.
  • FHWA can’t approve the Rose Quarter project because of these errors and contradictions.

No technical fix for road safety.  It’s tempting—but wrong—to assume that the growing death rate on the nation’s roads can be solved with another dose of technology.  A viral video showing a self-driving car blowing through an occupied crosswalk shows autonomous vehicles can be just as indifferent to humans on foot as human drivers.  And at the same time, university researchers are flogging a phone app as a cure for the supposed problem of “distracted pedestrians.”  There’s no shortage of silly ideas as to how technology could be applied to this problem—we have a few ourselves.  But the point is that technology is a diversion, rather than a solution.

The belief that just one more technological advance will reduce road carnage is wrong, and it illustrates our obsession with gizmos, and the car-blindedness that relentlessly shifts blame and responsibility to vulnerable road users.

Must Read

Cars are getting older.  Axios reports new data show that the average age of a car or truck in the United States is now 12 and a half years, the longest ever recorded. That matters because climate policies are counting on people replacing their internal combustion engines with electric vehicles.  But as people hold on to fossil fueled vehicles longer, we make slower progress toward reducing greenhouse gases.

The transition from gas to electric cars will take decades.  It’ll likely take until at least 2050 — and possibly longer — before most gas-powered cars are off the road, [S&P Global Mobility researcher Todd] Campau says.

The report also shows that, so far, electric cars are less durable than their internal combustion counterparts.  About x.x percent of the EVs sold in the last decade have been scrapped, compared with about x.y percent of internal combustion vehicles.  The shorter average lifespan of EVs means it will take longer for EVs to replace internal combustion vehicles.

Time for a national “roads review.”  The Frontier Group’s Tony Dutzik says it’s time to take a considered national look at our highway transportation system.  We really don’t have a national transportation plan:  what we have is a series of state-level plans (mostly to expand highways), coupled to a generous and indiscriminate federal firehose of funding.  When added up, over the next several decades, does our spending of hundreds of billions on roads take us in the direction of meeting our climate and safety goals?

A similar “roads review” in Wales reached the conclusion that the status quo ante of incrementally expanding the road system was fundamentally out of step with the national interest, both for economic and environmental reasons.

Dutzik argues that the transportation fraternity is incapable of reforming itself (as evidenced by a 2019 report published by the Transportation Research Board) and that what we need is a broad-based “people’s roads review” that integrates the diverse critiques of our current car-centric system from urbanists, public health experts, displaced communities, climate change advocates, and fiscal experts.  As he says:

The climate crisis and the safety crisis on our roads – together with the countless other big and small problems driven by our continued emphasis on cars and road-building – demand something bigger than what transportation advocacy has yet been able to deliver. We need a rallying point that knits small movements into a bigger one, that enables the whole to become greater than the sum of the parts, and that provides the public, the media and decision-makers with a common focal point for education, engagement and action.

The Devilish Details of Missing Middle Housing.  Many states are making progress in ending the hegemony of single family zoning, opening up residential zones for duplexes, triplexes and other missing middle housing.  Washington State has recently passed HB 1110, which legalizes multiplex homes and accessory dwelling units in most single family zones.

But as with many things in housing, the complex web of regulations and guidelines that shape what can be built mean that zoning is just one of many obstacles.  Building codes, setbacks, height limits and other provisions can be equally daunting.  Matt Hutchins looks at a new infill development manual from Washington State’s commerce department, which, in theory, is designed as a “how to” guide to promote missing middle housing, but in practice may turn out to squelch real change.

The publication is replete with gauzy watercolor illustrations of hypothetical missing middle developments.  But if you look more closely at what’s being prescribed, especially in the proposed “overlay standards”, the guide does more to limit missing middle housing than to enable it:

 The underlying premise of these overlay development standards is to make middle housing as palatable to neighbors as possible (low densities, smaller footprints, bigger setbacks, lower heights) rather than embracing the challenge we face with the housing crisis and climate change. The standards do not match up with the reality of many neighborhoods (funky cul-de-sacs, small narrow and deep lots, parcels with buildings already on them). At its very worst, this toolkit will give slow-growth municipalities cover for downzoning by providing the option to select overlays that are less intense for the majority of their currently residentially zoned land.

The complex and interconnected web of regulations regarding housing all reflect and reinforce the dominance of exclusively single-family neighborhoods.  Dismantling this structure will require considerable effort.

New Knowledge

Remote work, urban location and household formation.  Adam Ozimek and Eric Carlson of the Economic Innovation Group have a new paper looking at how work from home has influenced housing demand.  There are some glib assertions that the ability to work at a distance is leading to decentralization and the demise of cities, but this paper points out that the effects are much more nuanced.

While there has been some population movement to suburbs since the pandemic, a striking fact is that urban housing markets remain very robust–prices for centrally located apartments have surged in most cities.

The study shows that households that work from home tend to spend more on housing (rents or mortgage payments) than households that don’t work from home.  That makes sense:  If you’re working at home, you need more space to serve as your “office”—and likely are spending more on the amenity of your home, reflecting the increased amount of time you spend there.  In addition, if you’re working at home, you’re most likely spending less on commuting, which gives you relatively more income to spend on housing.

This finding—higher housing expenditures by work at home households—holds for 2020, but the size of the effect appears attenuated in 2021 (the additional amount spent on housing by work at home households is smaller). The authors also find that work at home was associated with increased household formation—think of adult children moving out of the house, couples splitting up, roommates sharing a space moving to separate housing units, etc.  Even though there was some population movement to suburbs, this increased household formation helped maintain robust housing demand in denser neighborhoods.  As the authors conclude:

In general, exposure to remote work led to increases in housing demand as shown, for example, through gross monthly rental payments and home values. However, this effect was smaller for PUMAs with high population densities and expensive housing markets. Indeed, we find a negative effect of remote work exposure on population growth in the most dense and expensive PUMAs, suggesting that working from home led to some level of out-migration from these areas. However, in these dense and expensive PUMAs, the positive effect on household formation helped offset the population loss. In short, one reason places that lost population nevertheless saw robust housing markets was that they had stronger household formation.
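The authors’ last point, that household formation can offset population loss, is easy to see with back-of-the-envelope arithmetic. The numbers below are purely hypothetical, not taken from the paper; they simply show that housing demand tracks households, not people:

```python
# Illustrative arithmetic (hypothetical numbers): a dense area can lose
# population yet gain households, because remote work pushes average
# household size down, and housing demand tracks households, not people.

def households(population, avg_household_size):
    """Number of households implied by a population and mean household size."""
    return population / avg_household_size

before = households(100_000, 2.40)  # roughly 41,667 households
after = households(98_000, 2.28)    # 2% population loss, smaller households

print(f"before: {before:,.0f} households")
print(f"after:  {after:,.0f} households")
# Household count rises about 3 percent despite the 2 percent
# population decline, so housing demand stays robust.
```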

Adam Ozimek, Eric Carlson, Economic Innovation Group, Remote Work and Household Formation, April 11, 2023
https://eig.org/wp-content/uploads/2023/04/Remote-Work-and-Household-Formation.pdf

In the News

 

 

The Week Observed, May 12, 2023

What City Observatory did this week

There’s plenty of time to fix the Interstate Bridge Project. Contrary to claims made by OregonDOT and WSDOT officials, the federal government allows considerable flexibility in funding and re-designing, especially shrinking costly and damaging highway widening projects.

In Cincinnati, the $3.6 billion Brent Spence Bridge Project

  • Was downsized 40 percent without causing delays due to environmental reviews
  • Got $1.6 billion in Federal grants, with only about $250 million in state funding plus vague promises to pay more
  • Is still actively looking to re-design ramps and approaches to free up 30 acres of downtown land

Within the past year, the Ohio and Kentucky transportation departments pared the size of the bridge by almost half, without prolonging the environmental review process or sacrificing federal funds.

For years, the managers of the Interstate Bridge Project have been telling local officials that if they so much as changed a single bit of the proposed IBR project, it would jeopardize funding and produce impossible delays. Ask whether it’s possible to change the design, and they frown, and gravely intone that “our federal partners” would be displeased, and would not allow even the most minor change. It’s a calculated conversation stopper—and it’s just not true.

Must Read

The Playground City?  The persistence of work-at-home and the surge in office vacancies in downtowns has people worried about the future of cities.  One of the nation’s leading urban economists, Ed Glaeser, penned an op ed in the New York Times with his MIT colleague Carlo Ratti, predicting the emergence of the “playground city.”  They argue that New York’s economy is shifting from office work to entertainment:

New York is undergoing a metamorphosis from a city dedicated to productivity to one built around pleasure. . . The economic future of the city that never sleeps depends on embracing this shift from vocation to recreation and ensuring that New Yorkers with a wide range of talents want to spend their nights downtown, even if they are spending their days on Zoom. We are witnessing the dawn of a new kind of urban area: the Playground City.

Glaeser and Ratti make the point that, throughout history, cities like New York have successively re-invented themselves in the face of economic and technological changes.  New York’s economy was once fueled by trade and manufacturing, but in recent decades has been propelled by financial and professional services.  They suggest that the city can, and will, reinvent itself again.

Oddly, the column makes no reference to Glaeser’s own seminal article “The consumer city” published two decades ago.  And the term “playground city”  is an echo of Terry Nichols Clark’s 2003 book “The City as an Entertainment Machine.”  These works—and our own City Advantage (2007)— make the case that people don’t live in cities simply to be close to jobs, but that cities offer compelling advantages as places to live, interact with others, and access a diverse array of goods, services and experiences.  In cities, more different and varied goods, services and people are close at hand, and it’s easier to discover and explore new things.

Cities aren’t merely playgrounds:  they’re places we can grow and lead richer and fuller lives.  That’s why urban centers will persist and flourish, even if office employment never returns to its pre-pandemic peak.

Want better transit?  Stop squandering money widening freeways.  Too often, transportation policy is co-opted by “all of the above” multi-modalism:  we can only improve transit if we have a “balanced” program that includes more money to move cars faster.  The Transit Center pushes back against this bankrupt thinking, calling on New York State leaders to cancel two big (and counterproductive) freeway widening projects, and instead spend the money on improving transit.  Under the new federal infrastructure law, states and localities have a choice, and there is a trade-off:  most federal formula funds can be “flexed” from highways to transit, if state and local leaders agree.

These phony “balanced” investments inevitably encourage more car traffic, more sprawling development, and undercut the effectiveness and raise the cost of operating transit.  In the face of a climate and road safety crisis, we have to set priorities.

Must Listen.  San Diego Public Radio (KPBS) has a new program, “Freeway Exit,” exploring the intimate connection between freeways and the nation’s cities. Hosted by KPBS metro reporter Andrew Bowen, “Freeway Exit” relates the forgotten history of San Diego’s urban freeway network, and how freeways divided communities and created inequities that still exist today. The series also shows how freeway culture has contributed to the climate catastrophe we’re now facing, and how reimagining our freeways could be a key part of the solution.  Episode two features some vintage audio from the 1950s that captures the sense of unquestioned progress that enveloped cars and freeway building–before the social and environmental costs of car dominance became evident.

In the News

The Portland Mercury quotes City Observatory’s Joe Cortright, in an article entitled: “Build It and They Will Pay: ODOT’s Plans for Tolling Generate Broad Backlash.”

 

The Week Observed, May 5, 2023

What City Observatory did this week

Why can’t Oregon DOT tell the truth?  Oregon legislators asked the state transportation department a simple question:  How wide is the proposed $7.5 billion Interstate Bridge Replacement they want to build?  Seems like a simple question for an engineer.  But in testimony submitted to the Legislature, Oregon DOT officials went to great lengths to conceal and misrepresent the true size of the massive freeway bridge they’re planning.  The simple fact is ODOT plans to replace a 77′ wide bridge with one that’s more than twice as wide (164 feet), wider than a football field and wide enough for twelve traffic lanes.

Lying with pictures is nothing new for the IBR project.  As we’ve noted before, despite spending tens of millions of dollars on planning, and more than $1.5 million to build an extremely detailed “digital twin” of the proposed bridge, IBR has never released any renderings showing what the bridge and its mile long approaches will look like to human beings standing on the ground in Vancouver or on Hayden Island.  And the IBR also released similar misleading and not-to-scale drawings that intentionally made the height and navigation clearance of their proposed bridge look smaller than it actually is.  And for the earlier version of this same project, the Columbia River Crossing, OregonDOT claimed to reduce the width of the freeway from 12 lanes to 10, but instead simply erased all the width measurements from the project’s Final Environmental Impact Statement, while keeping the project plans exactly the same.

Must Read

How freeways kill cities.  Economists have long recognized that highway construction leads to sprawling and decentralized development by making it easier and cheaper to travel further. But that’s not all:  as Street MN’s Zak Yudhishthu summarizes the latest economic findings about highways and cities, highways and cars also damaged the urban fabric, making city living less desirable.  Yudhishthu cites Federal Reserve economists Jeffrey Brinkman and Jeffrey Lin, who show that freeways also reshaped our cities by creating disamenities for center-city residents.

In other words, freeways do more than just link us together. While previous research assumed that freeways drove suburbanization solely by making suburban life better, Brinkman and Lin find that freeways drove suburban flight by making urban life worse. Freeways can serve as connectors, but they also make it more difficult for center-city residents to access amenities and jobs in their cities, alongside other pollutive reductions in quality of life.

Too often, discussions about the impact of freeways just look at the direct (and devastating) effects of initial freeway construction.  But this is just the first level of harm:  the flood of traffic carried by freeways, coupled with noise and air pollution, is caustic to livable neighborhoods, and leads to population loss and business closures, as places near freeways become hostile, car-dominated spaces.

Visit your Nearest National Parking Lot!  Streetsblog’s Kea Wilson has a cutting parody of our old school national park posters that highlights how much of our nation’s urban centers has been given over to parking.  Building on an illuminating set of maps of parking lots and structures created by the Parking Reform Network, Civicgraphics has created a series of posters highlighting the glories of abundant parking that dominates so much of the urban core.

As Wilson points out, public subsidies to parking (as much as $300 billion annually) dwarf the $3 billion we spend on national parks; so in reality parking lots are more central to our national identity than Yellowstone or Yosemite. It’s a pointed and painful parody to be sure, and it speaks vastly more truth than the fictionalized pedestrian- and greenery-heavy renderings that are being peddled by highway departments to greenwash road widening projects.

Lessons from California’s first-time home-buyer credit.  Housing affordability is famously a California malady.  In an effort to blunt the problems that first-time homebuyers face, the California Legislature enacted “Dream for All,” a $288 million loan program to provide down-payment assistance to new homebuyers.  While the impulse is understandable, the policy is questionable.  The first issue has to do with scale:  “for all” is rather grand, and while a quarter of a billion dollars is a lot of money, the program provided loans to fewer than 2,600 California households.  In a state with 40 million residents, those are lottery-winner odds.  Little wonder the whole program was exhausted days after starting. The second issue has to do with who got the loans.  Sharp-eyed reporters at CalMatters noted that a disproportionate share of the loans went to applicants in the Sacramento area.  Apparently, people who worked in and around state government were much more aware and prepared to apply for the program, according to local loan officers:

. . . news of the program spread by word-of-mouth throughout the capital community in the days before the state officially launched the program on March 27. The regional rumor mill may have been churning especially quickly given how much more plugged-in locals are to matters of state bureaucracy. “Sacramento and the surrounding area’s loan officers and Realtors probably got a jump start,” he said.

A final problem has to do with the economics of supply and demand:  while the down payment loans ease the burden of ownership for the relative handful of lucky households that get one, they likely increase the number of prospective bidders for the limited supply of homes for sale.  While 2,500 more qualified buyers in California might not make much difference, expanding the program would likely create even more upward pressure on home prices, aggravating the affordability problem the program aims to solve.

Building more housing helps hold down rents.  There’s a growing body of academic literature showing how building more housing–including new market rate housing–helps hold down rents and address affordability challenges.  The problem is that academic literature seldom filters down to the public.  Seattle public radio KUOW has a terrific and non-technical explanation of this literature, featuring an interview with UCLA professor Michael Lens, one of the authors of this research.  They frame the question in common-sense terms:  Does building townhomes help hold down rents?  Part of the answer is that townhomes require less land and less public infrastructure (like street frontage) than single family homes, which lowers construction and development costs.  But the more important issue is that by providing more homes, and thereby increasing supply, townhouses help moderate rents.  The KUOW report concludes:

. . . here’s what the science and research is telling us so far: Housing density does bring down the cost to build housing. And most studies seem to suggest that yes, this pattern is repeating over and over in cities that reform their zoning to allow more housing.

New Knowledge

America the lonely and isolated.  Surgeon General Vivek Murthy has released a new report decrying a growing epidemic of loneliness in the US.  In simplest terms, we spend more time apart from one another now than ever before.  Across a broad array of indicators, we’re spending less time with others in social settings.

The Surgeon General’s report echoes many of the themes highlighted in City Observatory’s 2015 report “Less in Common”–emphasizing the growing isolation and declining social interaction in daily American life.

What’s striking about this report is that it clearly links isolation and growing loneliness to a range of negative health outcomes.  As Surgeon General Murthy writes:

Loneliness is far more than just a bad feeling—it harms both individual and societal health. It is associated with a greater risk of cardiovascular disease, dementia, stroke, depression, anxiety, and premature death. The mortality impact of being socially disconnected is similar to that caused by smoking up to 15 cigarettes a day, and even greater than that associated with obesity and physical inactivity.

To any urbanist, the principal causes of growing isolation will be evident in our landscape.  The sprawling, low density development patterns that define our metropolitan areas put us physically further from one another (living largely in single family houses), and require that we spend an inordinate amount of time alone in automobiles as we travel.  The failure to connect these social trends to our built environment is a major shortcoming of the Surgeon General’s report.  Even though its principal recommendation is to “cultivate a culture of connection,” it doesn’t talk about how the physical environment impedes (or promotes) connections.  You won’t find any mention of sprawl, density, commuting or car-dependence in the report.

Half a century ago, the Surgeon General’s report on the health consequences of tobacco helped trigger a major social and policy shift in the way Americans related to smoking.  We’ve banned smoking on planes and in most indoor public places, and these bans, coupled with economic incentives and changes in social attitudes about smoking, have reduced deaths and respiratory disease.  We can only hope that this Surgeon General’s report will lead to similarly helpful changes in our communities, promoting greater social interaction–and expanding the scope of that concern to the built environment.

 

The Week Observed, April 21, 2023

What City Observatory did this week

Why should Oregonians subsidize suburban commuters from another state? Oregon is being asked to pay for half of the cost of widening the I-5 Interstate Bridge. Eighty percent of daily commuters, and two-thirds of all traffic on the bridge are Washington residents. On average, these commuters earn more than Portland residents.

The 80/20 rule: When it comes to the I-5 bridge replacement, users will pay for only 20 percent of the cost of the project through tolls. Meanwhile, for the I-205 project in Clackamas County, users—overwhelmingly Oregonians—will pay 80 percent (or more) of the cost in tolls.  At the same time, state legislators are looking—for the first time—to raid the state’s General Fund (which is used to pay for schools, health care, and housing) to pay for roads by subsidizing the Interstate Bridge Replacement Project to the tune of $1 billion. The proposal for Oregon to fund half of the cost of the Interstate Bridge Replacement is a huge subsidy to Washington State commuters and suburban sprawl.

A blank check for the highway lobby.  The HB 2098 “-2” amendments are perhaps the most fiscally irresponsible legislation ever to be considered by the Oregon Legislature. They constitute an open-ended promise by the Oregon Legislature to pay however much money it costs to build the $7.5 billion Interstate Bridge Replacement and $1.45 billion Rose Quarter freeway widenings—projects that have experienced multi-billion dollar cost overruns in the past few years, before even a single shovel of dirt has been turned.  HB 2098-2 amendments would:

  • Raid the Oregon General Fund of $1 billion for road projects
  • Give ODOT a blank check for billions of dollars of road spending
  • Allow unfettered ODOT borrowing to preclude future Legislatures from changing these projects and forcing their funding
  • Eliminate protective sideboards enacted by the Legislature a decade ago
  • Enact a meaningless and unenforceable cap on project expenses.

Must Read

Tokyo as an anti-car paradise.  In a shortened chapter lifted from his forthcoming book Carmageddon, Economist writer Daniel Knowles digs deep into the reasons why Tokyo excels as a metropolis with high density, affordable rents, great public transit, walkable neighborhoods and relatively few cars.  It’s more or less the inverse of the United States:  driving and parking are expensive, while density is lightly regulated.  Before you can legally register a car in Japan, you have to show that you have an off-street parking space in which to store it, and–shock!–overnight on-street parking is largely illegal.  Meanwhile, single family zoning is virtually unknown, and land owners are free to build to pretty much whatever density they’d like in residential areas, which is exactly what the nation’s mostly privately owned railroads have done, building dense housing around new or expanded transit stations.  And when it came to building expressways, Japan relied heavily on the free market, letting private companies build and operate tollways, and the tolls, like the limits on parking, make driving an unattractive and expensive alternative.  And as Knowles stresses, all this reinforces the rich, fine-grained walkable urbanism of Tokyo neighborhoods.  By insisting that cars pay their way, rather than pampering them, people take precedence.

Another freeway widening fail:  Just south of San Francisco, authorities have spent $600 million adding more capacity to 14 miles of Highway 101, with predictably disappointing results.  Roger Rudick, writing for Streetsblog San Francisco, relates the all-too-common tale of a freeway widening project that hasn’t done anything to reduce congestion.  With the project complete, one of its private engineering consultants conceded:

“California in general has that problem: that as soon as we start to build something we’re immediately over capacity by the time we finish building it, so it’s almost impossible for us to keep up with the amount of demand,” admitted Monique Fuhrman, Deputy Policy Program Manager with HNTB, which worked on the project, when questioned by Swire at the CAC meeting. She added that traffic levels are already “getting back to the way we were before.”

Local transportation advocate Mike Swire pushed back against the consultants’ self-congratulatory talking points, challenging Fuhrman to explain why freeway expansion wasn’t simply a never-ending cycle of expansion, induced demand and recurring congestion.  Swire asked:

“At what point do we stop doing something we know isn’t working?”

To which Fuhrman replied:

“That’s like an existential question. I do not know.”

Which is where, for the moment, we leave this absurd situation, though we can be sure there will be another engineer, another widening project, and another sense of existential bafflement (and indifference) when it, too, fails to relieve congestion.

New apartment construction in Seattle is flat-lining:  Is mandatory affordable housing to blame?  Seattle has been one of the epicenters of rapid urban growth over the past few decades, and growing demand for city living has pushed up housing prices–and apartment rents.  A couple of years ago, the city implemented its “Mandatory Housing Affordability” (MHA) program, requiring apartment developers to either set aside units for low or moderate income households, or make payments into a city housing affordability fund.  In effect, these requirements act like a tax on new multi-family housing, and not surprisingly, may discourage development.  New statistics published by the Urbanist show a worrying collapse of apartment permits in Seattle:

Since March 2021, shortly after the MHA program was mostly phased in, permit applications dropped sharply, from an average of more than 1,500 per month to fewer than 500 per month.  We think the city should be watching this closely.  If fewer apartments are built now, that’s likely to lead to a tighter housing market in the years ahead, and further rent increases.

The State of Play:  How highway agencies co-opt and debase progressive policies

In last week’s Week Observed, we highlighted a provocative article from Chuck Marohn of Strong Towns, illustrating how a highway project in Maryland had effectively co-opted the “Complete Streets” moniker for a project that remained a dangerous (even deadly) car-dominated “stroad.”  Chuck made the very good point that engineers and highway departments will deftly borrow and trade on progressive-sounding terminology.  That part of Marohn’s critique is valid.  But Marohn—and, by dint of repetition, City Observatory—may have conveyed the impression that the originators of the complete streets concept are somehow to blame.  Our friends Stephen Davis and Beth Osborne of Smart Growth America have offered a spirited defense of the efforts of the National Complete Streets Coalition.  They point out that the Coalition doesn’t merely promote the concept, but also evaluates the effectiveness of adopted policies, develops champions, highlights best practices and generates constant national attention through reports like Dangerous by Design.  Ultimately, Davis and Osborne agree with Marohn that there’s an entrenched status quo that’s adept at appropriating and diluting progressive concepts.  They argue that means the diverse advocates for change need to make common cause against this co-opting:

Going forward, we all need to keep our attention squarely focused on the transportation agencies and engineers who prioritize speed above all else and call it safety, who do the same old thing and call it something new, who build dangerous streets and call them complete.

That’s going to be a never-ending task:  There’s no shortage of consultants, public relations, marketing and branding types who will–for a generous fee–help legacy highway agencies re-brand their work as green, equitable, safety-conscious and net-zero carbon.  It’s vastly easier to feign those values in awards and press releases than it is to accomplish meaningful change in the real world.  Ultimately, we have to insist on measurable results—fewer people killed and injured, fewer greenhouse gases emitted, more people living in walkable, bikeable communities, with a full range of transportation choices.

 

The Week Observed, April 28, 2023

What City Observatory did this week

Testifying on Oregon transportation finance.  City Observatory director Joe Cortright testified to the Oregon Legislature on HB 2098, a bill being proposed to fund bloated freeway widening projects in the Portland metropolitan area.  As we’ve previously reported at City Observatory, proposed amendments to this bill would give the Oregon Department of Transportation a virtual blank check for the construction of the multi-billion dollar Interstate Bridge Replacement Project and the I-5 Rose Quarter freeway widening.  The draft amendments establish a vague and legally dubious statement of legislative “intent” to fund both projects, which on its face seems harmless.  But the Oregon Transportation Commission could use those statements of intent, coupled with its broad discretion to borrow funds, to launch both of these multi-billion dollar boondoggles, and present subsequent Legislatures with a fait accompli—a reprise of the classic Robert Moses strategy of “driving stakes and selling bonds” to lock in freeway construction.

General funds for this massive bridge would take precedence over funding for education, health care and reducing homelessness.

The HB 2098 amendments also propose raiding the state general fund to the tune of $1 billion–breaking down a decades old fiscal firewall that separated road finance from other public expenditures, and is the backbone of the state’s “user pays” policy for transportation.

Links to written testimony submitted to the Joint Transportation Committee are here.

Must Read

Lessons from a century of transit decline. The heyday of American transit was nearly a century ago, when virtually every American city had a robust streetcar network. Bloomberg’s David Zipper interviews Nicholas Dagen Bloom, a professor and author of the new book The Great American Transit Disaster, which chronicles that decline.  Bloom argues that the ascendancy of the car wasn’t so much the result of conspiracies or some technological inevitability as of conscious (and, at the time, widely supported) policy choices. In the interview, Bloom offers some insights from history that might help us now.  In particular, he emphasizes maintaining transit service, and making transit more competitive by catering less to cars.

. . . we should be thinking about maintaining the quality of service for what we now have. Current levels of service could be shredded and ridership shattered as the post-pandemic financial realities sink in. Given that, I don’t know that we should be building any new transit lines at this point.

. . . the compelling case for greater ridership is the aggravation of driving. For that reason, the most positive things might be rezonings, the multifamily boom, and the end of parking minimums. If we remove highways in certain areas, there’s an opportunity for transit to be competitive. But barring that, it’s very hard to know what an agency on its own can do because they’re now in survival mode.

Bad Models, Bad Decisions.  The irreplaceable Todd Litman takes a look at the scientific basis of traffic modeling and traffic impact statements, and finds them wanting, with devastating effects.  Whether it’s assumptions about inexorable and unending traffic growth, or fixed (and exaggeratedly high) estimates of traffic generation from new development, planning and engineering decisions are dictated by some dubious statistics.  Total vehicle miles traveled in the US have been routinely over-predicted, leading to overbuilt highways.

Litman points to this chart from the Frontier Group showing how Department of Transportation and Department of Energy forecasts have consistently over-estimated the growth in driving in the US (forecasts are colored lines; actual is black).  With this as their future worldview, it’s little wonder that planning gets biased in favor of accommodating an assumed surge in automobile travel.  As Litman writes:

. . . practitioners use demonstrably inaccurate models that often result in inefficient, unfair, and environmentally harmful decisions. In the past, practitioners hid behind their technical expertise, but they are starting to face legal challenges.

Why do transportation agencies spend so much more on automobile infrastructure than other modes, despite their lower total costs and greater benefits? It is partly the fault of practitioners who perpetuate biased planning practices. It is time for reform. Either we create more accurate, comprehensive and multimodal planning practices ourselves, or we will be forced to, by litigation.

Like Litman, we share the hope that policymakers (and courts) will insist on data, models and regulations that are scientifically based and free from implicit biases.

A worthwhile Canadian Initiative:  Cancelling a highway tunnel, building a transit tunnel.  For decades, Quebec has been contemplating a third highway crossing of the St. Lawrence River.  Though it has long been a campaign promise of the incumbent provincial government, in a surprising turnaround Premier Francois Legault’s government, after a careful scientific review, has ditched plans for the highway tunnel and is now proposing to move forward with a transit-only tunnel as the third crossing. The government’s review concluded that, with increased work-from-home in the wake of the Covid-19 pandemic, traffic growth had attenuated, eliminating the need for the $6.5 billion highway tunnel.  Bloated, polluting road projects have always done more to damage urban economies than to bolster them.  We can only hope that leaders of US cities will similarly reconsider the wisdom of such projects.

New Knowledge

Has the Covid-19 effect run its course in urban neighborhoods?  The Covid-19 pandemic produced some abrupt shifts in population migration patterns within and across US cities.  The big question is whether these are short-lived changes that will quickly give way to underlying trends, or whether a post-pandemic world will be fundamentally different.  Brookings Institution demographer Bill Frey has a readout of overall county-level population trends from the latest US Census data.

Frey’s work shows that at least some of the decline in the densest, most central US counties has abated or reversed in the latest data.  (Brookings uses a typology that classifies metropolitan US counties as either “urban core” or “suburban.”  It’s a useful, rough-and-ready way of contrasting urban-suburban trends, but due to the variability of county boundaries, as we’ve noted, it’s an imperfect way of characterizing what’s happening in city neighborhoods.)

Frey’s key finding is that, aggregated across large metro areas, the decline in population in central counties has attenuated. Urban Core counties racked up impressive population gains early in the last decade, but their growth slowed, and was negative in the pandemic year of 2020.  The Census data show that decline attenuated sharply in 2021-22.

Frey’s work helpfully digs into the components of population change across the different county types.  Population changes for a variety of reasons:  natural increase (the difference between births and deaths), migration to and from other locations in the US, and international immigration.  While much of the narrative surrounding the pandemic focused on a supposed increase in out-migration from urban core areas, a bigger factor in many cases may have been the decline in international immigration (a trend very evident during the Trump years), which was particularly sharp during the pandemic.  Urban core counties have traditionally been the biggest magnets for international immigrants, and so they were particularly affected by its decline.

The data show that international immigration (the orange line) bounced back in 2022 in every county classification, but was most significant for urban core counties.

William H. Frey, “Pandemic-driven population declines in large urban areas are slowing or reversing, latest census data shows,”
Brookings Institution, April 19, 2023

In the News

BikePortland featured our analysis of proposed legislation in Oregon providing a virtual blank check for two huge freeway expansion projects.

The Week Observed, March 3, 2023

What City Observatory did this week

More induced travel denial.  Highway advocates deny or minimize the science of induced travel. We offer our rebuttal to a Reason column posted at Planetizen that attempts to minimize the importance of induced demand for highways.

Induced travel is a well-established scientific fact:  any increase in roadway capacity in a metropolitan area is likely to produce a proportional increase in vehicle miles traveled. Highway advocates like to pretend that more capacity improves mobility, but at best this is a short-lived illusion.  More mobility generates more travel, sprawl and costs.  In theory, highway planners could accurately model induced travel; in practice, they ignore, deny or systematically under-estimate its effects.  Models are wielded as proprietary and technocratic weapons to sell highway expansions.
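
The proportionality claim can be made concrete with a constant-elasticity sketch. The function and parameter names below are ours, not from any cited study; the unit elasticity is the central estimate in the induced-travel literature.

```python
def projected_vmt(base_vmt: float, capacity_ratio: float,
                  elasticity: float = 1.0) -> float:
    """Long-run vehicle miles traveled after a capacity change.

    With a unit elasticity, a 10 percent increase in lane miles
    yields roughly a 10 percent increase in VMT, erasing any
    congestion relief the expansion was supposed to deliver.
    """
    return base_vmt * capacity_ratio ** elasticity

# A corridor carrying 1,000,000 daily vehicle miles, widened by 10 percent:
new_vmt = projected_vmt(1_000_000, 1.10)
print(round(new_vmt))  # → 1100000
```

The point of the exercise: under unit elasticity, the extra capacity is fully absorbed by new driving, so per-lane congestion returns to roughly where it started.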

Must Read

Consultants gone wild. There’s little dispute that the US has some of the highest urban rail construction costs on the planet.  The big question surrounding this unfortunate American exceptionalism is, “Why?”  Writing at Slate, Henry Grabar summarizes a recent study that poses a provocative answer:  It’s our excessive reliance on consultants.  The study comes from Eric Goldwyn, Alon Levy, Elif Ensari, and Marco Chitti of New York University’s Transit Costs Project, who’ve taken a close look at construction projects in the US and around the world.

Not only are consultants themselves more expensive than permanent staff; there are two related problems:  incentives and capacity.  Consultants don’t get paid for building subways–they get paid for studying things and offering advice.  That gives them strong financial incentives to prescribe further study and more advice.  The related problem is capacity:  the agencies commissioning all this consultant work have to have the capacity to read, digest and act on the expertise, and reductions in staff and diminished expertise in the public sector mean that they may not be able to do so–or critically, to ask hard questions.  And for the record, the problem of excessive consultant costs isn’t limited to transit projects:  The Oregon and Washington highway departments spent nearly $200 million on planning and engineering for the never-built Columbia River Crossing a decade ago, and are on track to spend more than $200 million for planning and engineering of its star-crossed successor, the Interstate Bridge Replacement–with most of this money going to years upon years of consulting contracts.

They dare not call it zoning.  Houston’s au courant NIMBYs want to authorize conservation districts.  It sounds innocent enough: the City of Houston wants to allow neighborhoods to form “conservation districts” in which a majority of existing landowners in a particular area can impose restrictions on new development, including minimum lot sizes, building setbacks, roof pitches and the like.

Ostensibly, the purpose of the conservation districts is to protect neighborhoods from untoward change.  Advocates point to the city’s famous Fourth Ward, which lost most of its historic housing, and has seen a recent wave of redevelopment that many residents consider “out-of-character” with its history.  If this all sounds familiar, it’s because it’s awfully similar to traditional exclusionary zoning.  Recall that zoning was originally designed to protect established neighborhoods from scourges like slaughterhouses, tanneries, and, of course, apartments.  In some cities, historic designations have become the more fashionable way to cloak exclusion with the color of law.  And in Houston, which purportedly has no zoning–at least not by that name–these conservation districts could easily be applied almost anywhere.  It’s an open invitation for NIMBYs to craft their own bespoke zoning.  There’s apparently no limit to the number of such districts that can be created, or to their size: so long as 51 percent of property owners agree, and the City Council approves, you can have your very own district.  Because wealthier neighborhoods likely have the political capital and wherewithal to navigate the process, it’s probable that, just like single-family zoning, this will become a tool of exclusion.

Downtown Chicago gains population, despite the pandemic.  The received wisdom about the pandemic is how bad it was for city centers, especially due to a decline in office occupancy because of increasing work at home.  But a new survey from downtown Chicago underscores the changing economic role of city centers as prime residential locations, especially for well-educated young adults.  As reported in Bloomberg CityLab, downtown Chicago’s population grew by xx,xxx since before the pandemic.  The study looked at population in Chicago’s Loop (the area bordered by the elevated train that circles downtown).

Most of the Loop’s population is 25 to 34 years old, with more than 80% living alone or with one person. Almost half don’t own a car and the majority cite the ability to walk to places, the central location and proximity to work as top reasons for living downtown.  The future of the Loop will also be more residential. Another 5,000 housing units are expected to be added by 2028, bringing the district’s total population to 54,000, according to the report.

New Knowledge

A hidden climate time bomb in US home values.  A new study published in Nature estimates that home values in flood prone areas across the US are significantly over-priced because markets are ignoring likely damage from climate change.

When buyers pay for homes, they may not be aware of the likely damage from climate change, especially due to the increased risk of flooding.  To some extent, buyers already recognize that homes located in flood-prone areas may be less valuable than otherwise similar homes in drier places.  Overall, homes in the 100-year floodplain sell at about a 3 percent discount to otherwise similar homes elsewhere.  But there are systematic patterns in how well markets accurately reflect flood risk. The authors conclude:

. . . residential properties exposed to flood risk are overvalued by US$121–US$237 billion, depending on the discount rate. In general, highly overvalued properties are concentrated in counties along the coast with no flood risk disclosure laws and where there is less concern about climate change.

This study uses home value data and new data on increasing flood risks to identify the discrepancy between current market values, and values that would reflect a more realistic appraisal of likely climate risk.
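
The study’s headline range ($121 billion at a high discount rate, $237 billion at a low one) reflects a basic present-value relationship. The sketch below uses illustrative numbers of our own; the paper’s actual model is far more detailed.

```python
def unpriced_flood_risk(expected_annual_loss: float,
                        discount_rate: float,
                        horizon_years: int = 30) -> float:
    """Present value of the stream of flood damages a sale price ignores.

    A home's overvaluation, in this simplified view, is the discounted
    sum of expected flood losses the market fails to price in.
    """
    return sum(expected_annual_loss / (1 + discount_rate) ** t
               for t in range(1, horizon_years + 1))

# The same expected losses imply a larger overvaluation when
# discounted at 3 percent than at 7 percent:
pv_at_3pct = unpriced_flood_risk(2_000, 0.03)
pv_at_7pct = unpriced_flood_risk(2_000, 0.07)
print(f"${pv_at_3pct:,.0f} vs ${pv_at_7pct:,.0f}")
```

This is why the choice of discount rate alone moves the aggregate estimate by roughly a factor of two.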

There are distinctly regional patterns to climate risk.  The greatest concentration of losses (relative to the total fair market value of properties) is along the Gulf Coast, in Appalachia, and in the Pacific Northwest.

Property overvaluation as a proportion of the total fair market value of all properties.

A big policy question going forward is how we incentivize people to avoid building in high-risk areas and allocate the costs of climate related damages.  Historically, the National Flood Insurance Program has paid out much more in benefits than it has collected in premiums, and still fails to fully account for growing climate risk.  Over time, higher premiums will shift some of these climate costs to property owners in risky areas, but Congress has capped premium increases, which slows the adjustment. In theory, mortgage lenders should be less willing to lend for housing in risky areas, but as the authors point out, many use federal loan securitization to disproportionately shift the riskiest properties into the portfolios of federally guaranteed home lending programs.

Jesse D. Gourevitch, Carolyn Kousky, Yanjun (Penny) Liao, Christoph Nolte, Adam B. Pollack, Jeremy R. Porter & Joakim A. Weill, “Unpriced climate risk and the potential consequences of overvaluation in US housing markets,” Nature Climate Change (2023).
https://www.nature.com/articles/s41558-023-01594-8

In the News

The Urbanist and Investor Minute republished our commentary on the Oregon Department of Transportation’s plans for tolls of as much as $15 to travel between Wilsonville and Vancouver under the title “How to finance a highway spending spree”

Thanks to Streetsblog for their shout out on our commentary about induced demand; as they put it: “How is it that, despite all evidence to the contrary, anyone still believes widening roads will reduce congestion?”

The Week Observed, February 24, 2023

What City Observatory did this week

IBR admits its bridge is too steep.  After 15 years of telling the region that the only feasible alternative for crossing the Columbia River was a pair of side-by-side double-decker bridges, the IBR project let slip that it was now thinking about a single level crossing, ostensibly because it provided better “aesthetic” options.  This comes after telling anyone who proposed an alternative (such as a bascule bridge or tunnel) that even thinking about an alternative design would impose unmanageable delays.  But the decision to consider a single level crossing is an admission that one persistent IBR critic has been right all along.
Engineer Bob Ortblad has noted that the proposed bridge would require nearly a 4 percent grade, making it one of the steepest interstate bridges.  The real reason for a single-level crossing is to lop 30-35 feet off the bridge’s height and reduce the roadway grade (and, not incidentally, reduce the cost of connecting the very high bridge to adjacent roads).
Until recently, the only other alternative the project showed was an even taller “stacked” design that put the two roadway sections on separate levels of a single bridge.

ODOT’s planned I-205 tolls will cost the average local household $600 annually.  The Oregon Department of Transportation just released its Environmental Assessment for the proposed I-205 highway widening project.  The project would be paid for by tolling traffic on the roadway as much as $4.40 at the peak hour.

The project’s economic report concludes that the typical household in Clackamas County will pay about $600 per year in tolls.  Regular commuters on I-205—those who drive daily at morning and afternoon peak hours—will have to pay $2,200 per year in tolls under the ODOT plan.  Why are the tolls so high? ODOT’s tolls are set high enough to generate the revenue needed to pay for the roughly $1 billion project (plus interest costs) over the next 20 to 30 years. While the purpose of the project is to reduce congestion, it’s likely these toll levels are so high that traffic will fall dramatically–meaning that the expensive freeway expansion isn’t needed.
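
The $2,200 figure follows directly from the peak toll. In the back-of-envelope check below, the 250 working days per year is our assumption, not ODOT’s.

```python
# Annual cost for a daily peak-hour commuter on I-205 under the
# proposed tolls. The work-day count is an illustrative assumption.
peak_toll = 4.40   # dollars per tolled crossing at the peak hour
daily_trips = 2    # morning and evening commutes
work_days = 250    # assumed commuting days per year

annual_cost = peak_toll * daily_trips * work_days
print(f"${annual_cost:,.0f} per year")  # → $2,200 per year
```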

Must Read

Even an unbuilt highway can destroy a neighborhood.  Writing at Bloomberg CityLab, Megan Kimble profiles the impacts of the proposed I-49 freeway expansion project in Shreveport, Louisiana.  It’s well understood that previous highway construction projects have systematically devastated urban neighborhoods, but Kimble reports that the mere prospect of this freeway widening through the Allendale neighborhood of Shreveport has already done its damage.  Once a highway department announces its construction plans, it has a chilling effect on private investment in the area:  who wants to buy a home in the path of, or worse yet, right next to, a new highway project?  That’s exactly what’s happened in Allendale, where highway planners have proposed an “Inner City Connector” which aims to speed traffic by cutting a swath through disinvested Black neighborhoods.

Allendale’s freeway fight sets the stage for a reprise of a long-established urban drama, with freeways plowing through minority neighborhoods.  It’s something that the current administration has acknowledged and says it wants to correct, but it’s very much a live issue in Shreveport.  Kimble writes:

[Transportation Secretary Pete] Buttigieg and President Joe Biden have committed to repairing this harm, which is usually referred to in the past tense. But in Allendale, many residents see the Inner-City Connector as a contemporary variation on this pattern — a low-income Black neighborhood threatened with destruction to save drivers a few minutes of travel time. And they built a coalition of their own to resist.

Economic development hits a new nadir:  subsidies for In-N-Out Burger.  Cities and states, it seems, will subsidize almost any business, as long as it’s clothed in the language of economic development and job creation.  The latest–and one of the most egregious–examples comes from Tennessee, where the state has awarded $2.75 million in economic development assistance to the California drive-thru hamburger chain In-N-Out Burger to build an office and restaurants in the Volunteer State.  Restaurants are a particularly good example of a “local-serving” business, meaning most, if not all, of the spending (and jobs) for such businesses will be driven by the appetites of Tennesseans.  Subsidizing In-N-Out Burger won’t lead to any more eating out, or spending, in Tennessee; but what it will do is advantage a large, out-of-state business in competition against every Mom and Pop restaurant in the state.

It’s tragic to think that you can build your economy by subsidizing the expansion of one fast-food chain to the likely competitive detriment of all the others, but that’s the current state of the art in what passes for professional economic development in much of the United States.

Wales puts an end to highway expansions.  Regular readers of City Observatory will be familiar with the science of induced travel:  building more roadway capacity does little to reduce congestion, and instead simply stimulates more driving, pollution and traffic.  That realization has dawned full force on policymakers in Wales, who, after a careful review of the scientific literature, have largely pulled the plug on further roadway expansions.

The Welsh government said all future roads must pass strict criteria which means they must not increase carbon emissions, they must not increase the number of cars on the road, they must not lead to higher speeds and higher emissions, and they must not negatively impact the environment.

The background is this:  In 2021, the national government put highway projects on hold, pending a policy review.  The conclusion, based on a careful examination of the results of past expansion projects, led the government to cancel or greatly revise 44 of 59 pending road projects.  Those that are being allowed to proceed are mostly smaller scale improvements.

New Knowledge

The illusion of travel time “savings.”  The typical rationalization for many highway expansion programs is that they will produce big economic benefits for travelers by reducing travel times.  If you can drive faster, the reasoning goes, that will knock a few minutes off your trip, and the time that’s freed up is something you can use (and will value).  There are a lot of problems with this theory, chief among them the reality of induced travel (which means that if a road becomes faster or gains capacity, more people will use it, quickly erasing the travel time savings).  But even if a road is faster, that doesn’t equate to actual time “savings,” according to a thoughtful analysis published in Transportation Research Interdisciplinary Perspectives.

The trouble with the idea of travel time savings is the partial equilibrium or “ceteris paribus” (all other things equal) assumption that’s lurking in this calculation.  If all trip taking remains exactly the same, and all development remains just as it is, in theory, faster travel ought to translate into less time spent traveling.  But in practice, and over time, other things don’t remain the same, they change, in response to the faster travel time.  When we can drive faster, we travel farther.  And recognizing that, over time, activities and people spread out further.  The famous Marchetti’s constant is the observation that for centuries, cities have been arranged so that most daily travel is about thirty minutes in one direction.  Faster travel translated into larger, less dense cities, but not shorter travel times.
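
A back-of-envelope illustration of Marchetti’s constant: with a roughly fixed half-hour one-way travel budget, faster modes expand a city’s reach rather than shorten its travel times. The mode speeds below are illustrative round numbers of our own, not figures from the paper.

```python
BUDGET_HOURS = 0.5  # roughly constant one-way travel budget

# Approximate door-to-door speeds in miles per hour; illustrative only.
modes = {"walking": 3, "streetcar": 10, "automobile": 30}

for mode, mph in modes.items():
    # The radius a traveler can cover within the fixed budget.
    reach = mph * BUDGET_HOURS
    print(f"{mode:10s} ~{reach:.1f} miles")
```

A tenfold increase in speed buys a tenfold larger radius, and roughly a hundredfold larger accessible area, which is exactly how faster travel produces bigger, more spread-out cities instead of shorter trips.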

This effect plainly shows up when we look at historical trends in travel speeds and travel behavior.  As travel has become faster, trips have become longer, and people (and activities) have become more widespread.  National travel surveys from Britain show that between 1972 and 2003, the speed and distance of trips increased in tandem, while the number of trips and the total amount of time spent traveling remained essentially flat.

The fact that faster travel speeds don’t translate into realized time savings suggests that the claims highway advocates regularly make about the economic value of highway expansions are largely false.  That’s not to say that highways don’t have economic effects; rather, what they do is decentralize economic activity, and longer trips carry significant private costs, as well as large social and environmental costs, that are seldom fully considered.

Cornelis Dirk van Goeverden, “The value of travel speed,” Transportation Research Interdisciplinary Perspectives, Volume 13, March 2022, 100530.

In the News

StreetsblogUSA cited City Observatory’s research on the connection between education and urban prosperity in its story describing how weak public transit systems restrict access to education opportunities.

The Week Observed, February 17, 2023

What City Observatory did this week

Driving between Vancouver and Wilsonville at 5PM? ODOT plans to charge you $15.  Under ODOT’s toll plans, driving from Wilsonville to Vancouver will cost you as much as $15, each way, at the peak hour.  Drive from Vancouver to a job in Wilsonville? Get ready to shell out as much as $30 per day.

Tolls don’t need to be nearly this high to better manage traffic flow and assure faster travel times. The higher tolls are necessitated by the need to finance ODOT’s multi-billion dollar highway spending spree. ODOT is telling everyone to get ready for tolls, but they’re being close-mouthed about how much tolls will be. Here’s what they’re planning, according to documents obtained by City Observatory.

  • Tolls on the I-205 Abernethy and Tualatin River Bridges will be $2.20 each at the peak hour. (Orange)
  • Tolls on the I-5 Interstate Bridge will be up to $5.69 (Green)
  • In addition to these tolls, drivers on I-5 and I-205 will pay tolls of 17 cents to 38 cents per mile during peak hours. Twenty miles of driving on I-5 or I-205 will cost you between $3.40 and $7.60. (Blue)
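
The per-mile figures in the last bullet multiply out as stated; a quick check, using the rates from the documents:

```python
# Cost of a 20-mile peak-hour drive at ODOT's proposed per-mile rates.
miles = 20
low_rate, high_rate = 0.17, 0.38   # dollars per mile at peak hours

low_cost = miles * low_rate
high_cost = miles * high_rate
print(f"${low_cost:.2f} to ${high_cost:.2f}")  # → $3.40 to $7.60
```

Note that these per-mile charges come on top of the fixed bridge tolls, which is how a single Wilsonville-to-Vancouver trip can reach $15.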

Road pricing could be an effective means to reduce congestion.  But the toll levels ODOT is proposing are really designed to maximize revenue, not to manage congestion.  Tolls could be vastly lower if set just high enough to allow a free flow of traffic.  And there’s no need to do congestion pricing during off-peak hours when there’s plenty of under-used roadway space.  But ODOT isn’t interested in managing traffic, it just wants more money to build things.

Must Read

Time to sue the bleeping engineers for malpractice.  Jeff Speck has had enough.  The author of Walkable City has (through years of work, public speaking and two editions of his book) exposed the profound biases in the dominant transportation planning paradigm.  While increasingly large and dangerous sport utility vehicles and our sprawling, auto-dependent development patterns deserve considerable blame, the way we design our roads is a major contributor to America’s high (and rising) traffic fatality rate.

Put simply, the roadway design standards enshrined by our nation’s professional civil engineers are unnecessarily deadly to the point of criminal negligence. It’s time to place blame and demand change. . . . Engineers routinely design streets to support (and therefore invite) speeds well above the posted speed limit. Then, when speeding is observed on these streets, the manual published by the Federal Highway Administration requires that the speed limit be raised.

Highway design emphasizes speed and throughput over safety, particularly for those walking and biking.  While there are plenty of specific policies that can be changed, Speck argues it’s time to assign legal liability to highway engineers for their dangerous designs.  Perversely, they’re now frequently fearful to make changes that might improve safety (like unique paint designs for crosswalks) because such innovations aren’t approved in the auto-centric highway design manuals.  It’s a deeply flawed system, desperately in need of radical change.

Right on cue, the Oregon Department of Transportation has decided the best way to improve pedestrian safety is to . . . close lots of crosswalks.  Faced with a growing epidemic of traffic violence, OregonDOT has hit on a new strategy.  As Bike Portland reports, it plans to close crosswalks on more than 180 Portland area roadways that it controls.  Most egregious, the agency plans to close xx crosswalks on dangerous Powell Boulevard, a roadway that claimed the life of Portland chef Sarah Pilner just a few months ago.

OregonDOT: Vision Zero . . . Zero pedestrians. (Bike Portland)

Of course, closing the crosswalks doesn’t make the roadway any safer, it just effectively eliminates the criminal penalties (and bureaucratic responsibility) when motorists end up hitting pedestrians.  And it’s a clever way for the agency to avoid making the crossing actually safer (or comply with the Americans with Disabilities Act).  If it’s not a crosswalk, ODOT doesn’t have to fix it, or bring it up to ADA standards.

The ironic insult added to injury of this policy, of course, is that all of the crosswalk closures will substantially lengthen the distances that pedestrians have to walk to legally cross these roadways.  Many closures will add several minutes to the amount of time needed to reach an otherwise nearby destination:  and OregonDOT would never, ever tolerate a safety measure that forced a motorist to experience a similar level of delay.  As Jeff Speck observed, these policies make it clear that highway departments value motorists’ time above pedestrians’ lives and limbs.

In the News

The Oregonian quoted City Observatory’s analysis of planned tolls on Portland area roadways in its article: “Tolls are coming to Portland-area freeways, and even tolling fans worry they’ll stack up.”

Willamette Week quoted City Observatory’s Joe Cortright on the Interstate Bridge Project’s plans to dramatically redesign a proposed Columbia River Bridge, an implicit admission that critics have been right all along.

The Week Observed, February 10, 2023

What City Observatory did this week

CEVP: Non-existent cost controls for the $7.5 billion IBR project.  Oregon DOT has a history of enormous cost overruns, and just told the Oregon and Washington Legislatures that the cost of the I-5 Bridge Replacement Program (IBR) had ballooned 54 percent, to as much as $7.5 billion.

To allay fears of poor management and further cost overruns, IBR officials testified they had completed a “Cost Estimate Validation Process” (CEVP). They assured legislators they had consulted independent subject matter experts and assessed more than 100 risks.

But asked for copies of the CEVP under the public records law, agency officials reported “no records exist” of the CEVP.

And the supposedly “nationally recognized” CEVP process has been around for more than a decade, was judged inadequate and error-filled for the Columbia River Crossing, and failed to detect key cost and schedule risks.

ODOT and WSDOT are more interested in deflecting criticism than in being accountable for—and correcting—runaway project costs.

Must Read

Lower parking requirements = More housing.  For decades now, scholars, led by the estimable Donald Shoup, have pointed out that parking requirements drive up the price of housing, which means that less new housing gets built.  And now, from Oregon, comes evidence that this process works in reverse:  lessening parking requirements leads to more housing getting built.  The Sightline Institute’s Catie Gould documents three case studies of projects that moved forward—or got bigger—because of eased parking requirements.

One of the frustrating cognitive obstacles in housing policy debate is that no one can see the housing that isn’t built because of bad policies.  That’s what’s great about Catie Gould’s story:  it shows that when you get rid of bad policies, like excessive parking requirements, you get new housing that you can actually see.  One hopes policy makers and the public will learn from object lessons like this one.

Another refrain on the science of induced demand.  There’s a huge and growing body of literature documenting the phenomenon of induced travel:  widening highways doesn’t reduce congestion, it simply prompts more travel, longer commutes and more pollution.  That message is gradually making its way to a wider audience.  Business Insider offers its take:

With billions of federal dollars available under the IIJA, the decisions states make will have a huge impact on our ability to meet our climate goals.  As The Rocky Mountain Institute’s Ben Holland relates:

“This is a make-or-break moment; How the states use those highway funds will basically determine whether we meet our transportation emissions goals.”

A rising death toll from taller trucks.  The latest fashion trend in American trucks is massive, menacing front grills.  Longer, lower, wider has given way to taller, bolder . . . and deadlier.  Tall front grills impair driver visibility; small children and even some short adults are simply invisible in the blind spots created by tall front ends.  The result is a massive increase in crashes called “front-overs,” where a driver runs over a person immediately in front of their vehicle.

The term itself is adapted from the awful, but historically much more common “back-over” crash, where a driver doesn’t see someone behind their car and simply backs up over them.  For obvious reasons, front-over victims, like back-over victims, are disproportionately children.  The data show that front-over crashes have increased from 15 in the 1990s to nearly 600 in the last decade.

Front-over deaths up from 15 to 575. (Scale error in original).

New Knowledge

Remote-working professionals gravitate toward dense neighborhoods.  If you’ve even casually followed the post-pandemic prognostications about the future of cities in the world of remote work, you’ll know that the dominant hypothesis is that, once freed from the need to commute to downtown jobs, workers will flee cities for cheaper and more bucolic suburbs and rural hamlets. After all, why would anyone want to live in a dense and expensive city if they didn’t have to work nearby?

What that hypothesis ignores is that cities don’t just provide places to work; city residents value the opportunities and amenities of urban life for many other reasons.

A new study from Leah Brooks and colleagues at the American Enterprise Institute looks at the relationship between neighborhood density and where remote workers tend to live. Brooks, et al examine block group level data on demographic characteristics (particularly income) and the industry of employment of block group residents. We know from other work that there is systematic variation in the degree to which jobs in different industries are amenable to telework. Finance, professional service and management service jobs are much more likely to allow work-at-home arrangements.

What the study finds, from data prior to the pandemic, is that, controlling for income, the workers most likely to be able to work from home were more likely to choose to live in denser neighborhoods.

Brooks, et al offer three possible reasons why higher income workers in “teleworkable” jobs are more likely to live in denser neighborhoods.

First, if workers in industries with greater telework potential enjoy more leisure time in equilibrium, their willingness to pay for amenities that complement leisure increases, and such amenities may not be available in lower-density areas.

Second, if workers value social interactions, and interactions at work are less frequent, they may seek out social interaction in non-work settings. Non-work social interactions are more readily found in population-dense areas.

Third, and similarly, if in-person contact drives agglomeration effects, a shift to remote work makes such contact outside the workplace more valuable. Again, in-person contact is easier in more population-dense areas. All of these explanations point toward increased telework leading to a greater willingness to pay for housing in high density places.

What this research suggests is that people don’t live in and near cities just to be close to jobs, but—as scholars as diverse as Jane Jacobs and Robert Lucas have said—to be near other people.  And paradoxically, if people aren’t getting as much social interaction by going to work every day, it is likely they will value the density and variety of opportunities for social interaction in cities even more, rather than less.

Leah Brooks, Philip G. Hoxie, Stan Veuger, Working from Density, AEI Economics Working Paper 2023-01, January 2023

In the News

Transit Center cited City Observatory’s analysis of Oregon’s broken system of road finance, and its fundamental incompatibility with stated climate objectives.

The Week Observed, January 20, 2023

What City Observatory did this week

Dr. King: Socialism for the rich and rugged free enterprise capitalism for the poor.  We’re reminded this year of Dr. Martin Luther King’s observation that our cities, and the public policies that shape them, are deeply enmeshed in our history of racism.

Whenever the government provides opportunities and privileges for white people and rich people they call it “subsidized”; when they do it for Negro and poor people they call it “welfare.” The fact is that everybody in this country lives on welfare. Suburbia was built with federally subsidized credit. And highways that take our white brothers out to the suburbs were built with federally subsidized money to the tune of 90 percent. Everybody is on welfare in this country. The problem is that we all too often have socialism for the rich and rugged free enterprise capitalism for the poor.

Must Read

Once again, transportation is the leading source of greenhouse gas emissions.  StreetsblogUSA reports the latest data on climate change, and as for the past few years, the leading source of greenhouse gases in the US is transportation.  Here are the data gathered by the Rhodium Group:

While we continue to make progress, slowly, in reducing emissions from industry, electricity production and buildings, emissions from transportation are not noticeably lower than they were a decade ago (about 2 billion metric tons).  In spite of improved fuel economy and vehicle electrification, transportation emissions are contributing little toward achieving greenhouse gas reduction goals.  We obviously need to do much more to reduce transportation emissions if we’re to address climate change.

It’s time to stop coddling cars.  For too long, we’ve pursued a banal “all of the above” strategy, thinking that just adding a few more alternatives to automobile transportation will fundamentally shift our travel patterns (and reduce our carbon emissions).  In a powerful essay at Dezeen, Phineas Harper says we have to be more forthright about discouraging car travel if we’re to make any progress.  Cities, he writes, should not just build green transport, but actively dismantle car infrastructure.  Car transportation is so privileged and so subsidized that simply creating “alternatives” does little to undermine the automobile’s hegemony over urban space.  One of the best ways to break with the past is to simply stop trying to fight automobile congestion; though it may seem paradoxical to some, Harper argues:

Managed strategically, congestion is critical in supporting the transition to safe, sustainable transport.

Frustration with slow and congested automobile traffic creates demand for real alternatives, like walkable urban spaces.  In contrast, building more roads to relieve congestion simply prompts more driving and pollution.  And there’s no reason to believe in a technical fix for this problem:

Moreover, car-based urbanism, electric or not, is inherently unsustainable, creating low-density, inefficient and dangerous cities. A grieving parent will find little comfort in learning their child was run over by a Tesla Cybertruck rather than a diesel 4×4.

Harper’s advice challenges the widely repeated slogan that we can tackle our transportation problems with “multi-modalism,” which is usually just a marketing gimmick to sanitize a giant highway project with a few token bike-lanes or sidewalks.

New Knowledge

Rent growth in 100 US cities over the past five years in one chart.  The housing market is both national and local.  There are big national trends that affect every place, as when we have a pandemic, a recession, or a sudden surge in inflation.  But every local market is different from the nation, with some leading and others lagging national trends.  You can see the whole pattern of rent price changes in the US in a comprehensive new visualization created by Rob Warnock at Apartment List.

This chart shows the month-by-month price changes in the 100 largest US markets from 2018 through 2022.  Price increases are shown in red, declines in blue; markets are ordered from largest to smallest on the vertical axis.  You can immediately see the surge in rents nationwide in 2021 as a red band.  You can also see a cooling of rental price inflation in the last quarter of 2022 in nearly every US market.  It’s also obvious that there is an underlying seasonal pattern to rent price fluctuations, with rents generally increasing in the first half of the year and declining in the second half.

The Apartment List website has a detailed description of the methodology and explores and explains many of the key trends in the data, breaking down the year-by-year developments.  It’s hard to find such a compact and detailed overview of the US rental market.  If you want to get some perspective that goes beyond the last month’s or last year’s inflation data, and visualize trends in city rental markets at a glance, this is the place to go.

In the News

Streetsblog re-published our “Reporter’s Guide to Congestion Cost Studies,” exposing the flawed premise underlying the claims that we lose billions of dollars each year to congestion.

The Week Observed, January 13, 2023

What City Observatory did this week

A reporter’s guide to congestion cost studies.  For more than a decade, we and others have been taking a close, hard and critical look at congestion cost reports generated by groups like the Texas Transportation Institute, Tom-Tom, and Inrix.  The reports all follow a common pattern, generating seemingly alarming, but simply ginned-up pseudo-statistics about how much congestion supposedly “costs” us. As we point out, you could calculate even higher estimates of time lost in travel because we don’t all have flying cars.  The number would be vastly higher than the imagined “losses” due to not being able to drive fast all the time, and just as realistic.

Here we’ve published a reporter’s guide to these reports, something we hope those in the media will consider before they publish yet another extreme telephoto image of freeway traffic along with an uncritical recitation of these highly questionable statistics.

Another flawed Inrix congestion cost report.  It’s back, and it’s still bad: the 2023 version of the Inrix traffic scorecard.  Like previous editions, it uses a fundamentally flawed concept (the travel time index) to construct absurd estimates of the supposed “cost” of road congestion.  The report’s estimates assume there’s some feasible (and cost-effective) way to enable all travelers to travel at “free flow” speeds every hour of every day.  We know (see our leading Must Read article below) that widening roads doesn’t produce faster traffic speeds.

This is just more myth and misdirection from highly numerate charlatans. Plus, this year’s report buries a pretty startling lede:  Traffic congestion is now lower than it was in 2019, and congestion declined twice as much as vehicle travel did.  That factoid alone should prompt policy makers to look seriously at demand management, via congestion pricing, if we’re serious about reducing congestion.

The case against the I-5 Rose Quarter Freeway widening project.  Portland is weighing whether to spend as much as $1.45 billion widening a mile-long stretch of the I-5 freeway at the Rose Quarter near downtown. We’ve dug deeply into this idea at City Observatory, publishing more than 50 commentaries addressing various aspects of the project over the past four years.  From massive cost overruns, to flawed traffic projections, to phony claims that this is a “safety project,” to a new freeway off-ramp that endangers bikes, pedestrians and other users with a dangerous hairpin turn, there’s plenty not to like about this project.  Here we summarize those commentaries.

Must Read

Yet more media coverage of the fundamental law of road congestion.  The New York Times has a powerful and visual summary of the growing evidence for the role of induced travel in erasing the promised travel benefits of freeway expansion projects.

Across the country, in Los Angeles, New Jersey and Houston, freeway widening projects have utterly failed to reduce congestion.  Writer Eden Weingart offers a simple explanation:

When a congested road is widened, travel times go down — at first. But then people change their behaviors. After hearing a highway is less busy, commuters might switch from transit to driving or change the route they take to work. Some may even choose to move farther away.

And this story is bolstered by an impressive and growing body of scientific evidence.

“It’s a pretty basic economic principle that if you reduce the price of a good then people will consume more of it,” Susan Handy, a professor of environmental science and policy at the University of California, Davis, said. “That’s essentially what we’re doing when we expand freeways.”

The article helpfully cites several of the research papers on the subject, and quotes Matt Turner, co-author of the definitive “fundamental law of road congestion,” on the willful ignorance of those who still deny the reality of induced travel:

“If you keep adding lanes because you want to reduce traffic congestion, you have to be really determined not to learn from history,”

But state highway departments, flush with billions of dollars in federal infrastructure funds, are primed to squander them in another futile and counterproductive round of freeway construction.

New Knowledge

European cities are getting denser.  For many decades, cities around the world have generally been growing less dense:  housing and population have tended to sprawl outward from city centers.  A new article looks at patterns in European densification, and finds a surprising reversal of this trend in many, though not all, European cities.

The paper’s key findings are summarized in maps showing the European cities that became more dense (blue) and those that became less dense or sprawled (red) in two successive time periods (2006-2012 and 2012-2018).  In the early period, most European cities were becoming less dense.  In the latter period, cities in much of the center of Europe became more dense.

The exceptions in the latter period were in Iberia and Eastern Europe–which got less dense over time.  Strikingly, density gains were recorded in much of the UK, France, Germany and Italy.  This recent, and now relatively widespread increase in densities may signal an urban resurgence.

The authors point out that housing supply plays a key role in facilitating a quick reversal to densification. The decline in density in the previous decade (and perhaps earlier) provided sufficient housing supply for many cities to grow more dense with population growth.  As the authors explain:

By densifying, many cities in the sample could accommodate a great population increase in a short period of time with only little expansion of residential areas. This means that a buffer capacity of unused housing stock was available to satisfy the new demand. In cities with long-lasting densification trends, specific policies, e.g. on urban regeneration, may increase the availability of housing units within the urban boundaries.

In the US, low rates of housing vacancy, especially in superstar cities, make it hard to densify quickly, because adding population requires building more housing.

Chiara Cortinovis, Davide Geneletti & Dagmar Haase, “Higher immigration and lower land take rates are driving a new densification wave in European cities,” Urban Sustainability, Volume 2, Article 19 (2022).

In the News

Our analysis showing that widening the Katy Freeway did nothing to reduce average travel times in Houston was quoted in this week’s New York Times article on the futility of road widening to lessen congestion.

Streetsblog featured comments from City Observatory’s Joe Cortright in its analysis of the latest Inrix congestion cost report.

The Week Observed, January 6, 2023

What City Observatory did this week

The case against the I-5 Rose Quarter freeway widening.  This week marked the end of public comment on the Supplemental Environmental Assessment for the Oregon Department of Transportation’s proposed $1.45 billion I-5 Rose Quarter freeway widening project.  At a billion dollars a mile, it’s one of the world’s most expensive and anachronistic freeway projects.  It proposes to double down on the damage done to Portland’s historically Black Albina neighborhood, flooding the area with even more traffic, and making local streets less safe and desirable for residents and people biking and walking. If ever there were a project that needed a full Environmental Impact Statement, this is it. Instead, ODOT is asserting that the project has “no significant environmental impact.” Here we summarize more than 50 City Observatory commentaries published over the past five years explaining the project’s true economic, environmental and social impacts.

Traffic is declining at the Rose Quarter: ODOT growth projections are fiction.   ODOT’s own traffic data show that average daily traffic (ADT) has been declining for 25 years, by 0.55 percent per year. The ODOT modeling inexplicably predicts that traffic will suddenly start growing through 2045, by 0.68 percent per year.

ODOT’s modeling falsely claims that traffic will be the same regardless of whether the I-5 freeway is expanded, contrary to the established science of induced travel. These ADT statistics aren’t contained in the project’s traffic reports, but can be calculated from data contained in its safety analysis. ODOT has violated its own standards for documenting traffic projections, and violated national standards for maintaining integrity of traffic projections.
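The gap between those two growth rates compounds quickly. A rough sketch of the divergence, assuming (for illustration only) a 25-year forecast horizon running roughly 2020-2045:

```python
# Compound the observed decline and ODOT's projected growth over an
# assumed 25-year forecast horizon (roughly 2020-2045).
observed_rate = -0.0055    # ADT declining 0.55 percent per year (ODOT's own data)
projected_rate = 0.0068    # ODOT model's assumed growth per year

years = 25
trend = (1 + observed_rate) ** years      # traffic if the historical decline continues
forecast = (1 + projected_rate) ** years  # traffic under ODOT's projection
gap = forecast / trend                    # how far apart the two scenarios end up

print(f"trend {trend:.2f}x, forecast {forecast:.2f}x, gap {gap:.2f}x")
```

Under these assumptions, the model’s 2045 traffic ends up roughly a third higher than a simple continuation of the 25-year trend.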

The truth about ODOT’s proposed $1.45 billion freeway widening.  It’s really an 8 or 10-lane highway expansion, not just the addition of so-called “auxiliary” lanes.  The project is engineered to be vastly wider than needed so it can be re-striped after it is constructed.  ODOT has failed to reveal these facts in its Environmental Assessment.

  • To avoid intruding on the Eastbank Esplanade, ODOT has dropped its plans to widen the viaduct overhanging the pathway, but will squeeze in another lane of traffic by re-striping the existing 83′ wide viaduct.
  • At the project’s key pinch point, the Weidler overpass, ODOT is engineering a crazy-wide 160-foot roadway ostensibly for just six travel lanes.  But this roadway is more than enough to fit 8 or even 10 lanes of traffic, just by striping it as they are now planning to stripe the viaduct section of the project.
  • ODOT’s environmental analysis has violated NEPA by failing to consider this “reasonably foreseeable” eventuality that ODOT will re-stripe the project to 8 or more lanes, which would produce even more traffic, air pollution and greenhouse gases, impacts that are required to be disclosed.

  • ODOT has also violated NEPA by failing to consider a narrower right of way: 96 feet would be sufficient to accommodate its added “auxiliary lanes” and would have fewer environmental impacts and lower costs.
  • ODOT’s safety analysis for its narrowing of the viaduct section shows that narrower lanes and shoulders make almost no difference to the crash rate, and further shows that the project’s claims that it would reduce crashes on I-5 by 50 percent are exaggerated by a factor of at least seven. The analysis also shows that the project has a safety benefit-cost ratio of about 1 to 200, meaning it costs ODOT $2 for 1 cent of traffic crash reductions, about 2,000 times less cost-effective than typical safety projects.
  • The safety analysis also confirms that ODOT made no allowance for the effects of induced demand:  It makes clear that the project assumed traffic levels would be exactly the same in 2045 regardless of whether the freeway was expanded or not.
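The benefit-cost arithmetic behind the safety bullet can be sketched directly; note that the typical-project benchmark below is inferred from the “2,000 times” comparison, not stated in the analysis itself:

```python
# ODOT's implied safety return: 1 cent of crash reduction per $2 spent
benefit = 0.01   # dollars of crash reduction
cost = 2.00      # dollars of project spending

bcr = benefit / cost   # 0.005, i.e., a benefit-cost ratio of roughly 1 to 200

# If the project is ~2,000x less cost-effective than a typical safety
# project, the implied typical benefit-cost ratio is about 10 to 1.
typical_bcr = bcr * 2000

print(f"project BCR {bcr:.3f} (about 1:{1/bcr:.0f}); implied typical BCR {typical_bcr:.0f}:1")
```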

The IBR project: Too much money for too many interchanges.   The real expense of the $5 billion I-5 bridge replacement project isn’t actually building a new bridge over the Columbia River:  It’s widening miles of freeway and rebuilding every interchange north and south of the river.  A decade ago, an independent panel of experts convened by the Oregon and Washington governors strongly recommended that ODOT and WSDOT eliminate one or more interchanges. The panel concluded that 70 percent of the cost of the project was rebuilding 7 interchanges in five miles. The experts told ODOT and WSDOT that the project’s interchange spacing violates both federal and state design standards. The expert panel concluded that eliminating interchanges would reduce project cost, improve safety, and improve traffic flow.

Failing to look at removing or simplifying interchanges after getting this expert advice is arbitrary and capricious; ODOT and WSDOT are violating the National Environmental Policy Act’s requirement that they take a hard look at reasonable alternatives.

Must Read

Jerusalem Demsas thinks that promoting homeownership as a national investment policy was a mistake.  Writing at The Atlantic, Jerusalem Demsas has a provocative essay that takes on one of the most cherished myths in America: the importance of homeownership as a broadly shared wealth-building strategy.  She starts with an observation we’ve long stressed at City Observatory:  housing policy is based on a fundamental contradiction.  We want housing to be affordable, and we want it to be a great investment.  You simply can’t have it both ways.  And in practice, a whole raft of policies, from the federal tax code to local zoning, strongly favors making housing unaffordable (especially for the young, poor, and renters) at the expense of making it a great investment for older, wealthier long-time homeowners.

While homeownership is touted as a universal wealth-building strategy, whether you make a return on investment depends heavily on when, where and how you buy.  Homeownership has tended to increase rather than reduce wealth inequality because lower income households tend to buy at the wrong time, in the wrong place, and end up paying higher prices and interest rates, which cut into their financial returns.  And, over time, most of the accumulated wealth from housing investment has ended up in the hands of older Americans who had the tax-favored good luck to buy when (and where) they did.

The policy lesson from this is clear:  we ought to stop selling (and subsidizing) homeownership, and put our policy emphasis plainly on promoting affordability.  Demsas writes:

I should be explicit here: Policy makers should completely abandon trying to preserve or improve property values and instead make their focus a housing market abundant with cheap and diverse housing types able to satisfy the needs of people at every income level and stage of life. As such, people would move between homes as their circumstances necessitate. Housing would stop being scarce and thus its attractiveness as an investment would diminish greatly, for both homeowners and larger entities. The government should encourage and aid low-wealth households to save through diversified index funds as it eliminates the tax benefits that pull people into homeownership regardless of the consequences.

Perhaps growing awareness of the long-brewing housing crisis in the US will lead more people to give some thought to this idea.

New Knowledge

Insights into 15-minute living.  Economist Ed Glaeser and a team of researchers have used mobile phone and amenity location data to look at the relationship between accessibility and “15-minute” travel behavior.

The authors gather data on 11 billion (!) 2019 trips to “points of interest” (a grab bag term for destinations ranging from schools and parks to grocery stores and restaurants).  They then examine differences in trip-making patterns between locations with abundant local destinations and those with fewer destinations nearby.  The data provide a useful test-bed for seeing how living in a “15-minute neighborhood”–a place with lots of common destinations within a 15 minute walk–influences travel behavior.

The core conclusion of the study is that people who live in neighborhoods with higher levels of access (more nearby destinations) tend, on average, to have shorter trips. There is considerable variation across metropolitan areas in the share of “15 minute” trips, ranging from as few as 10 percent in sprawling Atlanta to as many as 40 percent in New York.  Metros with more accessible destinations, on average, had a higher fraction of these 15-minute trips.
The authors conclude:
In a statistical sense, 15-minute access can explain eighty percent of the variation in 15-minute usage across metropolitan areas and 74 percent of the variation in usage within metropolitan areas. A one percentile increase in access is associated with a .8 percentage point increase in the share of trips that are within 15 minutes walking distance. This coefficient is not particularly sensitive to other controls, including population density, income and share of the population in the block group that owns a car. This correlation does not imply that encouraging more mixed-use development within residential areas will reduce average trip times, but it is certainly compatible with that hypothesis

Timur Abbiasov, Cate Heine, Edward L. Glaeser, Carlo Ratti, Sadegh Sabouri, Arianna Salazar-Miranda, and Paolo Santi, “The 15-Minute City Quantified Using Mobility Data,” NBER Working Paper 30752, National Bureau of Economic Research. http://www.nber.org/papers/w30752

In the News

Commentaries by City Observatory’s Daniel Kay Hertz and Joe Cortright were featured prominently in Jerusalem Demsas’ Atlantic article on the problems with homeownership policy.  Demsas quoted Hertz’s observation that housing is a sustained intergenerational transfer of wealth from the young to the old, and Cortright’s analysis of why, in practice, homeownership turns out to be a wealth-destroying strategy for the poor and people of color.

The Week Observed, December 16, 2022

Editor’s Note:  Public Comment on the I-5 Rose Quarter Freeway Project

Between now and January 4, 2023, the public will be asked to weigh in with its comments on the proposed I-5 Rose Quarter Freeway Widening project.  If you’re interested, you can make your voice heard.  For more information on how to comment, we urge you to visit No More Freeways’ website.  For the rest of this month, City Observatory will be presenting a synopsis of its independent research on the project.

What City Observatory did this week

Blame inflation now: Lying about the latest IBR cost overrun.  The price of the I-5 “bridge replacement” project just increased by more than 50 percent, from $4.8 billion to $7.5 billion.  ODOT and WSDOT are blaming “higher inflation” for the IBR cost overruns, but the data don’t support their claim:  their own agencies’ official projections of future construction price inflation show a negligible change from 2020 levels.  As we’ve noted, the Oregon Department of Transportation has a long string of 100 percent cost overruns on its major projects.  Almost every large project the agency has undertaken in the past 20 years has ended up costing at least double–and sometimes triple–its original cost estimate.

Higher construction cost inflation accounts for only $300 million of a $2.7 billion cost increase.  
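The arithmetic behind those figures is straightforward; a quick sanity check of the dollar amounts cited above might look like:

```python
# Sanity-check the cost figures cited above: a $4.8B estimate rising to
# $7.5B, with roughly $300 million of the increase attributable to
# construction-cost inflation.
old_cost = 4.8   # billions of dollars (original estimate)
new_cost = 7.5   # billions of dollars (current estimate)

increase = new_cost - old_cost                 # $2.7 billion
pct_increase = 100 * increase / old_cost       # ~56 percent
inflation_share = 100 * 0.3 / increase         # ~11 percent of the overrun

print(f"{pct_increase:.0f}% cost increase")        # 56% cost increase
print(f"{inflation_share:.0f}% due to inflation")  # 11% due to inflation
```

In other words, by the agencies’ own numbers, inflation can account for only about a ninth of the overrun.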

The Oregon Department of Transportation’s Empty Promises for a Rose Quarter Freeway Cover.  ODOT’s Supplemental Environmental Analysis shows it has no plans for doing anything with its vaunted freeway covers. It left the description of the covers’ post-construction use as “XXX facilities” in the final, official Supplemental Environmental Impact Statement.  So maybe a strip club or an adult book store on the freeway cover?

The report makes it clear that “restorative justice” is still just a vapid slogan at the Oregon Department of Transportation. In short, ODOT has no plans to construct covers that will support significant buildings, no plans for any meaningful use of the covers after the highway is complete, and no funding for it (or anyone else) to develop anything on the highway covers.  And if somebody else does have an idea, they’ll have to pursue it with their own money, and they’d better bring lots of lawyers, because it’s not going to be easy.  In the meantime, Albina, enjoy your “XXX facilities”—we’re sure they’ll be special.

A million more miles of traffic on local streets thanks to Rose Quarter Freeway Widening.  ODOT’s proposed relocation of the I-5 Southbound off-ramp at the Rose Quarter will add 1.3 million miles of vehicle travel to local streets each year.  Moving the off-ramp a thousand feet further south creates longer journeys for the 12,000 cars exiting the freeway at this ramp each day.

The new ramp location requires extensive out-of-direction travel for all vehicles connecting to local streets.  With more miles driven on local streets, and more turning movements at local intersections, hazards for all road users, but especially persons biking and walking, increase substantially. More driving on neighborhood streets increases local pollution and greenhouse gas emissions.

In the News

Willamette Week featured comments by Joe Cortright on the 56 percent increase in the cost of the proposed Interstate Bridge Replacement project to $7.5 billion, noting we had predicted the price increase six months ago:
Today’s increase did not surprise Joe Cortright, the Portland economist who has closely followed this iteration and the earlier version of the same effort, the Columbia River Crossing Project. In a May piece for City Observatory, Cortright wrote, “The IBR is likely to be a $5-7 billion project.”

The Week Observed, December 2, 2022

Editor’s Note:  Public Comment on the I-5 Rose Quarter Freeway Project

In the next month, the public will be asked to weigh in with its comments on the proposed I-5 Rose Quarter Freeway Widening project.  If you’re interested, you can make your voice heard.  For more information on how to comment, we urge you to visit No More Freeways’ website.  For the rest of this month, City Observatory will be presenting a synopsis of its independent research on the project.

What City Observatory did this week

Why won’t ODOT tell us how wide their freeway is? After more than three years of public debate, ODOT still won’t tell anyone how wide a freeway it’s planning to build at the Rose Quarter. ODOT’s plans appear to provide for a 160-foot-wide roadway, wide enough to accommodate a ten-lane freeway, not just the two additional “auxiliary” lanes it has acknowledged.  In reality, ODOT is planning a ten-lane freeway.

ODOT is trying to evade NEPA by building a wide roadway now, and then re-striping it for more lanes after it is built. The agency has utterly failed to examine the traffic, pollution and safety effects of the ten-lane roadway it will actually build.

Exposing the black box calculations that highway builders use to sell their projects.  State DOT officials have crafted a Supplemental Environmental Assessment that conceals more than it reveals. The Rose Quarter traffic report contains no data on “average daily traffic,” the most common measure of vehicle travel. Three and a half years on, ODOT’s Rose Quarter traffic modeling is still a closely guarded secret. The new SEA makes no changes to the regional traffic modeling done for the 2019 EA, which in turn relies on modeling done in 2015, seven years ago. The report misleadingly cites “volume to capacity ratios” without revealing either volumes or capacities. ODOT has violated its own standards for documenting traffic projections, and violated national standards for maintaining the integrity of traffic projections.

Oregon Department of Transportation admits its re-design of the I-5 Rose Quarter project will create a dangerous, substandard freeway exit and increase crashes.  Earlier, City Observatory pointed out that a new design of Portland’s $1.45 billion I-5 Rose Quarter freeway widening will construct a dangerous hairpin turn for vehicles exiting the I-5 freeway into Northeast Portland.

The project’s own Supplemental Environmental Assessment confirms our analysis.  This newly revealed ODOT report shows the redesign of the I-5 Rose Quarter project will:

  • create a dangerous hairpin turn on the I-5 Southbound off-ramp
  • increase crashes by 13 percent
  • violate the agency’s own highway design standards
  • result in trucks turning into adjacent lanes and forcing cars onto highway shoulders
  • necessitate a 1,000 foot long “storage area” to handle cars exiting the freeway
  • require even wider, more expensive freeway covers that will be less buildable

Must read

USA, USA!  Number one . . . for traffic deaths.  A terrific data-driven story from the New York Times’ Emily Badger and Alicia Parlapiano points out an unfortunate area of American exceptionalism:  traffic deaths.  The US now has a higher death rate from traffic crashes than almost every other advanced economy.  And while the rest of the world has been steadily reducing crash deaths, US deaths have been going up, especially for vulnerable road users.  The key findings are spelled out in a compelling chart showing the number of traffic deaths per capita for major countries.

There have been some counter-arguments that the US death rate isn’t so high if you compute it on a “per mile traveled” basis, rather than per capita, but that misses two key points.  First, America’s excessive auto dependence is what forces Americans to drive long distances; and every mile traveled is an additional risk, both to the person traveling and to other road users.  And second, even after adjusting for miles traveled, the trend is still in the wrong direction:  the US is getting significantly less safe than other industrialized nations, something we’ve pointed out for years at City Observatory.

Widening that freeway won’t lessen congestion, it will increase pollution.  Austin’s LMT Online responds to the plaintive cries that I-35 is congested with exactly the right tone:  so what?  Highway apologists at the Texas Transportation Institute call the highway one of the busiest in Texas, but time and again, experience has shown that widening roadways doesn’t reduce congestion; it just prompts more people to drive further, fueling more sprawl and car dependence.  TXDOT is proposing a $5 billion widening of I-35 through Austin.  The city’s residents will get a chance to weigh in on that decision when they choose a new mayor soon:  one candidate favors the freeway; his opponent is against it.  We’ll watch to see if Austin is learning, or if it continues to believe that just one more lane will fix things.

In the News

The Portland Mercury quoted Joe Cortright’s analysis of induced demand from the I-5 Rose Quarter Freeway widening project.

The Week Observed, November 4, 2022

What City Observatory did this week

Risky bridges: If you’re going to spend several billion dollars, you might want to get some independent expert advice.  Oregon and Washington are on the verge of committing $5 billion to the construction of the so-called I-5 “Interstate Bridge Replacement” project between Portland and Vancouver.  But they’re doing so largely on the word of the two state transportation departments.

They should have learned from their experience a decade ago, when the same project, then pitched as the Columbia River Crossing, failed despite being subjected to four different independent reviews. This time, there hasn’t been any independent engineering review, like the 2011 panel that showed the planned CRC “open web” design was unbuildable.  And the IBR advocates have steadfastly refused to conduct an independent toll analysis, like the one that showed the CRC would require vastly higher tolls than advertised, and that traffic on the giant new bridge would be permanently lower than on the smaller bridge it replaced.  It isn’t prudent to commit to a multi-billion-dollar project without seeking unbiased expert advice.

Must read

This week we highlight some images that you may have seen on Twitter.  The Missouri Department of Transportation celebrated the 50th Anniversary of Kansas City’s downtown loop.  But is it a loop . . . or a noose?  The kickoff to this conversation was MoDot’s map of the downtown.

 

Urbanist twitter wasn’t as excited.  Hayden Clarkin constructed this map showing how much of the city’s downtown was converted to garages (green) or surface parking lots (yellow).  City government spent several hundred million dollars in tax increment financing subsidizing the construction of downtown parking garages.  The flourishing of parking lots and garages coincided with the construction of area freeways, including the loop.

The net result was that the dense, vibrant Kansas City that existed a century ago has been, in many places, simply eradicated.  The following pictures show approximately the same view of a downtown Kansas City street in 1893 (before the automobile or freeways) and today.


Visualizing the city of the future.  Noah Smith publishes a guest column by Alfred Twu with some inspiring and delightful images of what our cities could be like.  Too much of the debate about housing and urbanism revolves around arcane statistics (floor area ratios), inscrutable zoning regulations, or tribal posturing by NIMBYs and YIMBYs.  It’s refreshing to see what kind of future our cities might have.  Here’s just one such illustration.

As Smith emphasizes, Twu offers an evolutionary vision of how our current cities could change and adapt to better meet the needs of a changing population and the challenges of climate change and economic transformation.  Twu writes:

The city of tomorrow has many familiar streets and sights of the city of today – after all, it’s a remodel, not a rebuild. While other countries have the option of building new cities or neighborhoods beyond the edge of existing ones, the US already used up most land within commuting distance in the 20th century on low density suburbs. Most of us will live in places that already exist today, but with changes.

Being explicit about the images of the kind of places we want to have is something that needs to play a much larger part in debates about planning, housing, and urbanism.

New Knowledge

Micro-mobility reduces traffic congestion and speeds travel times.  The advent of shared micro-mobility (bikes, e-bikes and e-scooters) has added to the range of urban transportation options.  Most see micro-mobility as a green transportation mode, but skeptics worry that micro-mobility may be largely attracting riders from among existing transit, bike and walking trips, rather than reducing car travel.

Teasing out the net effect of micro-mobility on urban transportation typically relies on surveys asking scooter or bike users how they would travel if these options weren’t available, but such studies are often inconclusive.  Some relatively abrupt policy changes to micro-mobility regulation in Atlanta created natural experiments that researchers were able to harness to estimate the impact of micro-mobility on traffic.  Atlanta officials adopted restricted hours for some e-scooters, banning their use between 9 pm and 4 am.  Using traffic speed data provided by Uber, the researchers tracked how travel times in key neighborhoods changed after the ban took effect.  Here are two charts illustrating their findings.

The left panel shows the increase in travel times (in additional minutes per mile) for the curfew in Atlanta’s midtown area, while the right panel shows the effect of the same policy in areas around MARTA transit hubs. In both cases, travel times in the affected areas (measured in minutes per mile traveled) increased–meaning travel was slower without the availability of micro-mobility devices.  This change represented about a 10 percent increase in typical commute times and a 37 percent increase in travel times during major events.

We find that after about a week, users partially account for the policy change in their travel planning and habits. This behavioural response suggests that as riders pivoted from micro-mobility devices back to personal cars or ridesharing, the congestion effect following the ban stabilizes to a mean treatment effect of 0.25 minutes per mile after five weeks
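To put the quoted 0.25 minutes-per-mile effect in concrete terms (the trip length below is a hypothetical example, not a figure from the study):

```python
# Illustrative use of the study's estimated treatment effect: after the
# scooter ban, traffic slowed by about 0.25 additional minutes per mile.
TREATMENT_EFFECT_MIN_PER_MILE = 0.25

def added_delay_minutes(trip_miles, effect=TREATMENT_EFFECT_MIN_PER_MILE):
    """Extra travel time, in minutes, on a car trip of the given length."""
    return trip_miles * effect

# A hypothetical 4-mile cross-town drive would take about a minute longer.
print(added_delay_minutes(4))  # 1.0
```

Small per-trip delays like this, multiplied across thousands of daily trips, are what produce the aggregate congestion effects the study measures.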

While this study adds strong evidence that micro-mobility does reduce car trips, there’s a deeper issue that is overlooked:  by their nature, micro-mobility trips add to and enrich the accessibility of urban spaces.  By giving people more choices, especially non-car choices, micro-mobility is an enhancement and amenity to urban spaces.  Making urban living, consumption and experiences more accessible and attractive likely helps increase density and lower overall trip-making and vehicle miles traveled.  The “substitution” argument is implicitly a “lump of travel” fallacy:  it assumes that there is a fixed and pre-ordained amount of trips to be made, and that providing additional modes only re-distributes travel among them.  The reality is more nuanced and complex, especially in the long run:  if the availability of micro-mobility makes urban living more attractive, more people are likely to live in cities, and ultimately own fewer cars and drive less.  It’s these long-term, second- and third-order effects of expanded transportation options that deserve our close attention.

Asensio, O.I., Apablaza, C.Z., Lawson, M.C. et al. Impacts of micro-mobility on car displacement with evidence from a natural experiment and geofencing policy. Nat Energy (2022). https://doi.org/10.1038/s41560-022-01135-1

(Hat tip to the prolific David Zipper!)

The Week Observed, October 28, 2022

What City Observatory did this week

A toll policy primer for Oregon.  The Oregon Department of Transportation is proposing to finance billions in future road expansions with tolling.  While we’re enamored of road pricing as a way to better manage our transportation system, the movement to raise money with tolls, and in particular to borrow against the promise of future toll revenues, raises significant risks.  This policy primer sketches out the nature of the underlying transportation problem we’re trying to fix (underpriced roads are over-used, and chronically under-funded), and describes the risks of ODOT’s proposed “borrow and build first, toll later” strategy, which promises to saddle the state with billions of dollars in debt for roadways that are unlikely to be used once we ask drivers to pay for them through tolls. Louisville, Kentucky stands as a cautionary tale:  the state spent a billion dollars widening I-65 over the Ohio River, but after it imposed modest tolls, traffic fell to half of pre-construction levels, saddling the state with enormous debt for a road that is spectacularly under-used:

 

The “borrow, build and toll” scheme runs the risk of putting the state in a position where it has to encourage more driving (contrary to its stated climate goals) in order to pay off construction debt.  State policymakers would be well-advised to fully understand the costs and risks of tolling before they pin the state’s transportation future to this perilous source of funding.

Must read

A reform agenda for US DOT.  We’re a little bit late to this particular party, but the Center for American Progress produced a terrific report in 2020 identifying some key opportunities to overhaul transportation policy in the US.  This comprehensive report addresses transportation finance, climate change, and equity, and is a useful framework for those working to improve transportation at a state and federal level.

. . . current policy and program structures need deep reforms to ensure that federal investments are equitable, sustainable, and targeted to communities facing the greatest need. Simply adding more money to the status quo will not help the United States meet its global climate commitments or redress the harms caused by discriminatory project selection and exclusionary labor practices.

You get what you pay for and housing affordability:  Writing at Market Urbanism, Salim Furth has a nuanced take on an oft-overlooked part of the housing affordability puzzle.  A big part of our problem is an overall shortage of housing relative to demand.  But the problem isn’t just one of aggregate supply:  there’s a mismatch between the houses that are available and the characteristics of housing we most value and want to buy.  Housing comes bundled with lots of characteristics and amenities, some of which we value highly, and some not so much.  One of the effects of zoning regulations is to force people to pay for things they don’t value highly.  Minimum lot sizes require people to “buy” a minimum amount of land in order to have a place to live, and in places where land is expensive, that may make housing too costly for many to afford.  Exclusive single-family zoning in some neighborhoods means you have to buy a single-family home to live in that neighborhood, even though you might prefer a townhouse or apartment (and be able to avoid lawn maintenance).  More subtly, zoning and building requirements can create mismatches between the kind or size of housing people want, value or can afford and what’s available.  So the more flexibility we create for different kinds of housing, the more likely we are to have better “matches” between supply and demand, and reasonable price points.

Developers push big highway package in Sacramento.  California public finance has been hamstrung by super-majority voting requirements for many tax increases, but the state Supreme Court recently carved out a provision allowing citizen-initiated taxes to be adopted with a simple majority.  In Sacramento, a developer-backed group has seized on this provision to package up an $8.5 billion road widening measure that would facilitate further sprawl in and around California’s capital.  There’s a lot not to like about this proposal:  it would not only undercut the region’s and state’s climate strategy, but it is also funded from the sales tax, which means there’s no connection between how much you travel or drive and who pays for the road expansion.  The Sacramento Bee has an in-depth story on the political maneuvering going on.

New Knowledge

Mapping activity centers in the nation’s metropolitan areas.  The spatial structure of metropolitan economies is an under-studied topic.  Our friends at the Brookings Institution’s Metro Center have a new publication that sheds an interesting light on the clustering of economic activity within metro areas.  They use a wealth of data on the location of businesses and institutions within cities to identify what they call “activity centers.”  Their approach identifies five different kinds of assets or amenities (community, tourism, consumption, institutional, and economic). They then gather data on the concentrations of those assets in different locations and identify three different kinds of centers.  Places with a concentration of just one asset are called mono-centers.  Places with some concentration of at least two different assets are secondary centers, and places with relatively large concentrations of two or more kinds of assets are defined as “primary centers.”

This classification system enables Brookings to map the size and location of activity centers in more than 100 of the nation’s largest metropolitan areas.   These maps illustrate the concentration of activity in particular locations.  Here is the map for Portland, Oregon:

 

The largest concentration of activity centers is in and around the city’s downtown (in the center of the map). There is a smattering of other centers in the remainder of the region, with most suburban centers being “mono-centers” (blue).

The Brookings research strategy parallels City Observatory research in developing the “storefront index.”  Our index looks at the concentration of common kinds of retail activity, and identifies clusters of storefronts located 100 meters or less from one another.  Here we’ve overlaid the storefront index on the Brookings Activity Centers for the Portland Metropolitan area.

On this map, the red dots represent clusters of storefronts.  The black circle in the center of the map is a 3-mile radius around the center of the region’s central business district.

The definition of activity centers is just the starting point.  The full report looks at the connections between activity centers and employment concentrations, productivity, walkability and vehicle miles traveled.  Primary activity centers tend to be disproportionately productive economic locations and tend to be more walkable, and people living in or near these centers are less likely to travel by automobile.  In addition, as the report points out, because

. . . low-income and racial minority groups tend to live in areas with higher accessibility to activity centers, efforts to focus development in these areas have strong potential to benefit these groups.

We need to know much more about the fabric and dynamics of urban spatial structure.  This Brookings report helps us visualize how important this perspective can be.

Tracy Hadden Loh, DW Rowlands, Adie Tomer, Joseph Kane, and Jennifer Vey, “Mapping America’s Activity Centers: The Building Blocks of Prosperous, Equitable and Sustainable Regions.”  Brookings Institution, October 2022.

In the News

The Brown Political Review cited our critique of Bruce Katz’s book The New Localism.

The Week Observed, October 21, 2022

What City Observatory did this week

Using phony safety claims to sell a billion dollar freeway widening.  This past week, Sarah Pliner, a promising young Portland chef was killed when she and her bike were crushed by a turning truck at SE Powell Boulevard and 26th Avenue.

Sarah Pliner (Bike Portland)

This intersection is on an Oregon Department of Transportation roadway, which has long been identified as one of the most dangerous in the state.  Oregon DOT has dragged its feet on doing anything to improve safety, instead prioritizing the fast movement of cars and trucks.  While it’s done nothing at a facility that’s repeatedly killed and maimed local residents, it’s peddling its $1.45 billion plan to widen Interstate 5 as a “safety” project–notwithstanding that the only crashes on that facility for years have been largely minor fender-benders.  At City Observatory, we’ve repeatedly shown that the Rose Quarter project would do nothing to address the real safety problems in Portland.  It’s tragic that another person had to die because Oregon DOT continues to use a cynical and self-serving definition of safety to avoid spending money to fix its roads that kill.

Must read

AASHTO:  Your highway department is a climate denier.  The Federal Highway Administration has released draft regulations calling on state highway agencies to set goals and report their progress in reducing greenhouse gas emissions from transportation.  The American Association of State Highway and Transportation Officials (AASHTO) is vehemently opposed to these regulations.  Kevin DeGood of the Center for American Progress has a close, critical review of AASHTO’s comments.  The highway builders argue that they’re really powerless to do anything about greenhouse gas emissions, that it would be prohibitively costly to even monitor their impacts, and that FHWA lacks the authority to do anything.  DeGood comprehensively disposes of all these arguments:

. . .  DOTs think of GHG reductions almost exclusively in terms of EV adoption. Of course, this is ridiculous. DOTs are principally responsible for land use. Low-density sprawl doesn’t happen without highways. Full stop.

State DOTs claim that federal guidance isn’t needed because they’ve been great stewards of federal largesse and have improved the performance and safety of the highway system.  DeGood points out that this simply isn’t true:  when it comes to safety, highway deaths have increased sharply in the past decade, from 34,000 per year to 43,000.  AASHTO is all about rationalizing federal handouts, and the organization is congenitally opposed to any accountability.

Mortgage rates soar; housing market crumbles. Since the beginning of the year, the interest rate charged on a 30-year, fixed rate mortgage has more than doubled, from about 3 percent to nearly 7 percent. This surge is a product of the Federal Reserve’s inflation-fighting policy. The 30-year mortgage rate is now higher than it’s been at any time in the past two decades, and the low interest rate policy that prevailed in the wake of the “Great Recession” is now over.  The likely result is to choke off investment in new housing. Higher interest rates mean that homebuyers are less able to afford homes, and that it’s harder for investors to profitably build new housing.

Higher mortgage rates dampen the incentives for both buyers and sellers:  buyers face higher costs for obtaining mortgages for their home purchases, and prospective sellers with mortgages face the prospect that the mortgage interest rate on their next home may be much higher than the rate on their existing loan; this is likely to throttle “trade up” home purchasing for many current homeowners.  The housing market has benefitted from a decade of 30-year fixed mortgage rates at or below four percent.  Those days appear to be over.

New Knowledge

Cycling:  Cultural, not geographic factors predominate.  Cycling is more difficult and less comfortable in cold or wet weather and in hilly locations.  But this study shows that across the US and within metropolitan areas, these climate and geographic factors play a very minor role in explaining variations in cycling rates.  Instead, a range of demographic and cultural factors seem to be more strongly correlated with cycling.

University of Hawai’i economist Justin Tyndall used US census data to look at the correlations between bicycle commute-to-work mode share and a range of topographic, climatic, and social factors across the nation’s metropolitan areas.  A quick visual summary of this work shows few strong correlations between cycle mode share and temperature, precipitation or snowfall.  It turns out that hilliness actually has a slight positive correlation with cycle commuting (more people bicycle to work in hilly places, on average).  And cultural factors (proxied by presidential vote shares) seem to have a far stronger relationship to cycling:  metros that voted in larger numbers for Donald Trump have lower rates of cycle commuting.

What these data suggest is that cycling is less about weather or topography than it is about demographics and cultural attitudes towards cycling. As Tyndall concludes:

The role of geography in cycling uptake is frequently discussed in relation to the construction of bicycle infrastructure such as bike lanes. Opponents of bicycle infrastructure often point to hills or unsuitable weather as evidence that cycling can not be locally popular. The findings in this study have a potentially important lesson for policy: climatic and topographical endowments are unimportant to the general uptake of cycling. The exogenous cause of spatial heterogeneity in cycling appears to be related to local demographic and cultural idiosyncrasies.

Justin Tyndall, “Cycling Mode Choice Amongst US Commuters: The Role of Climate and Topography,” Urban Studies, 59(1), 2022. https://www.justintyndall.com/uploads/2/8/5/5/28559839/tyndall_cycling.pdf

The Week Observed, October 14, 2022

What City Observatory did this week

Two of the three candidates for Oregon Governor are climate deniers. Oregon will elect a new Governor next month, and two of the three candidates for the job insist on repeating the discredited myth that greenhouse gas emissions can be reduced by widening freeways so that people don’t spend as much time idling in traffic.  As we’ve demonstrated time and again at City Observatory, wider roads lead to more driving and increased emissions.  Pretending that wider roads will reduce greenhouse gas emissions is climate denial, something that independent Betsy Johnson and Republican Christine Drazan are shamelessly repeating. Falsely claiming that reduced idling will lower greenhouse gases is a popular lie among highway advocates.  Current Oregon DOT director Kris Strickler made this claim a centerpiece of his confirmation testimony in 2019.  But repeating this lie doesn’t make it true.  What it does do is further delay action that would really address the climate crisis.

Must read

What’s behind regional variation in housing production.  Nationally, we have a significant shortage of housing, but in reality, there isn’t a single national housing market.  Housing is in many ways a distinctly local set of markets.  Why do some markets seem to produce a lot of new housing, and others not so much?

The Urban Institute’s Yonah Freemark offers a typology and analysis of housing production in US metro areas.  Freemark’s unit of analysis is the municipality, and he classifies cities within a metro area based on their housing costs relative to the regional average–a good indicator of demand and amenities.  Overall, few housing units get built in either the lowest-value or the highest-value cities, but for very different reasons.  In low-value cities, there’s little demand; in high-value cities, local policies often greatly constrain new development.

Freemark’s work points up a consistent pattern in which high value, high demand communities consistently tend to allow less housing to be built than other municipalities in the same region:

. . . of the nation’s most-in-demand municipalities—those where housing values are at least 30 percent higher than their respective metropolitan areas—less than a third added more housing than their encompassing region, despite plentiful developer demand to build there. In contrast, more than 40 percent of such jurisdictions added new housing at 50 percent or less of the rates of their respective metropolitan areas—and many actually lost housing units.

 

It's time to reform streets, not just highways.  While there's considerable push-back against urban freeway expansion projects around the nation, Washington transportation advocate Anna Zivarts says we also need to pay attention to how we use streets.  Writing at Next City, Zivarts, who works for Disability Rights Washington, argues that we prioritize car movement on local streets, and that multi-lane arterials, limited pedestrian crossings, and car-oriented signal timing all make streets difficult or dangerous for non-car road users.  While replacing freeways with boulevards may lower their impact, even boulevards are chiefly designed to move cars.  For city streets, we ought to reverse the priority, putting pedestrians and bikes first, and slowing car traffic.

What if slow speed streets were the default for our cities? Yes, it would take longer to travel far distances by car in a city, but that may be one of the more equitable ways we can incentivize people to choose other modes, especially if roads designed for slower car speeds were paired with investments in faster bus-only lanes.

Maryland highway expansion sued by Sierra Club and NRDC.  Environmental and community groups are challenging plans to widen I-95 and the Capital Beltway outside Washington DC, alleging, in part, that the project's environmental impact was concealed by faulty traffic modeling. The most obvious error in the traffic modeling was that it wildly exaggerated traffic levels, congestion and pollution in the "No Build" scenario, assuming a lemming-like behavior of commuters in the face of ever increasing delays.  The plaintiffs' complaint argues:

Defendants relied on traffic modeling performed by MDOT to predict that adding toll lanes will reduce traffic congestion on the Beltway and I-270. MDOT's model, however, counterfactually assumes that drivers will continue to pile onto these highways no matter how far the backup and no matter how long the delay. In reality, drivers often re-route when faced with extreme congestion. Defendants did not explain how they could reasonably rely upon the model's outputs given this serious limitation. MDOT also refused to release its underlying modeling files, further thwarting public scrutiny of Defendants' justification for the toll lanes project.

By exaggerating traffic congestion and pollution in the No-Build scenario, the EIS makes the build scenario look relatively more benign than it actually is.  And all of the modeling is a closely guarded secret: a black box that's not subject to any external review, and one that conveniently always produces the answers the highway departments want.

New Knowledge

Work from home is shredding office values.  In the wake of the Covid pandemic, there has been a persistence of remote working, particularly among knowledge work occupations.  According to a new study from three economists at New York University and Columbia University, the work from home trend is having a dramatic impact on the value of the nation’s commercial offices.

Spending more time working at home means spending less time working in the office.  With fewer people in the office, there may be less demand for office space in the future.  This study looks at recent trends in office occupancy, their correlation with work from home, and their impact on commercial leasing.  Most office space is leased for a period of several years, and in the short term (i.e., since the advent of the pandemic) there have been limited opportunities for firms to adjust how much space they lease.  But as leases come up for renewal, and as firms seek (or don't seek) new office space, we would expect the declining demand for office space to begin to be reflected in the market.

The study shows that there's been a sharp slowdown in new leasing activity, and in general a shift to shorter-term leases (more leases for fewer than three years; fewer leases for more than seven years).  Prior to the pandemic, the national market saw about 250 million square feet of new office leases signed every six months; since the pandemic, that rate has fallen to less than 100 million square feet.

But while new leasing activity has declined sharply–and not rebounded as the pandemic has eased–the path of prices has been different.  Average rents for new leases did in fact fall during 2020, but have rebounded since then.  As the authors point out, some of this rebound is due to a composition effect, the result of a larger share of leases being signed for the highest quality office spaces; adjusting for those effects shows a much smaller rebound.

Looking forward, a decreased demand for office space could significantly affect both commercial office development and city finances.  Based on their estimated trajectory for persistent work-at-home, the authors estimate that commercial office values in New York, for example, could decline almost 40 percent in the long run.  Because the city relies heavily on property taxes tied to values, that could result in a significant fiscal impact.

Arpit Gupta, Vrinda Mittal, & Stijn Van Nieuwerburgh, "Work from Home and the Office Real Estate Apocalypse," NBER Working Paper 30526, National Bureau of Economic Research. http://www.nber.org/papers/w30526

 

The Week Observed, July 8, 2022

What City Observatory did this week

Building a bridge too low–again.  In their effort to revive the failed Columbia River Crossing (a $5 billion freeway widening project between Portland and Vancouver), the Oregon and Washington transportation departments are repeating each of the mistakes that doomed the project a decade ago.  The latest blunder:  again planning for too low a bridge to accommodate river navigation.  The Columbia River is a major industrial thoroughfare in its own right, and protecting river navigation is the charge of the US Coast Guard under the 1899 Rivers and Harbors Act.  The two state DOTs have planned for a fixed, 116-foot-high span in place of the existing lift bridge, which allows vessels as tall as 178 feet to pass.  Last month, the Coast Guard issued its finding that the "reasonable needs of navigation" require at least 178 feet of vertical clearance.

The Oregon and Washington DOTs are in full denial mode, arguing that building a span with that clearance would be unnecessarily expensive and inconvenient to vehicle travel (with a high fixed span), and that building a new moveable span would require building the largest moveable span in the world.  A quick look around the Portland region shows that isn't true:  The region has at least three bascule bridges with comparable clearances, some built almost a century ago.  But the DOTs seem determined to try to pressure the Coast Guard to allow a lower fixed span, something the agency has signaled it's not willing to do.  Just as a decade ago, the failure to realistically address navigation needs is going to delay, and perhaps defeat, efforts to replace the I-5 bridges.

Must read

What "Ban Cars" really means.  There are few things more incendiary in public discourse than the "War on" framing of an issue.  Just a bit tongue-in-cheek, there's the "War on Cars" podcast, which explores the pervasive costs, externalities and inequity of our car-based transportation system.   While it grabs attention, to some the framing seems needlessly provocative.  Writing at Jalopnik, Doug Gordon provides some background and nuance beyond the battle cry.  As he explains, the opposition to cars is really about creating better choices.

The goal of “the ban cars movement,” as I see it, is not to render cars obsolete. It’s to give people the choice to live a life where car ownership, and car dependence, are unnecessary — regardless of socioeconomic status or physical ability to operate an automobile. It starts in dense urban areas, places where a few policy tweaks could turn public transit, cycling and walking into the lowest-stress, most convenient options.

In short, our car-dependence has become so deeply woven into the fabric of everyday life that we’re unable to recognize or question it.  The “ban cars” rhetoric is one way of shocking people into re-thinking these unquestioned assumptions.

In the News

Willamette Week quotes City Observatory’s Joe Cortright on the problems associated with building a high bridge—with an excessively steep grade—across the Columbia River.

The Week Observed, May 27, 2022

What City Observatory did this week

Our apologies to City Observatory readers for our website outage on 19-22 May. 

More meaningless congestion pseudo science.  A new study from the University of Maryland claims that traffic lights cause 20 percent of all time lost in traffic.  The estimate is the product of big data analysis of anonymized cell phone data, and claims that we collectively lose more than $8.6 billion per year waiting 329 million hours at stop lights.  The computation may be right, but in the real world it is meaningless.

One can't speak of that time as somehow being "lost" unless there were some alternative without traffic lights in which travel times would be less.  The "loss" is purely imaginary:  eliminating all traffic lights by building grade-separated interchanges would cost vastly more than the value of the time saved, and replacing signals with stop signs would likely result in even slower travel (though it might improve safety).  This study is emblematic of the underlying flaw in almost all "congestion cost" studies:  they implicitly assume that there's some way to reduce travel times–usually by widening roads–but that would cost vastly more than the value of time supposedly lost, and is ultimately futile and self-defeating because induced travel quickly erases any time savings from wider roads.
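Taken at face value, the study's totals imply a particular hourly value of time.  A quick back-of-the-envelope check (a sketch; the per-hour figure is our inference from the reported totals, not a number the study states):

```python
# Back-of-the-envelope check of the implied value of time in the
# University of Maryland signal-delay study. The totals are the
# figures reported above; the per-hour rate is inferred, not stated.
total_cost_dollars = 8.6e9   # claimed annual cost of waiting at signals
total_hours = 329e6          # claimed annual hours spent waiting

implied_value_per_hour = total_cost_dollars / total_hours
print(f"${implied_value_per_hour:.2f} per hour")  # $26.14 per hour
```

In other words, the dollar figure is just the hours total multiplied by an assumed hourly rate: the "cost" is only as real as the counterfactual in which those hours are actually avoided.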

Must read

CalTrans acknowledges induced demand.  State highway departments have long been in denial about the reality of induced demand:  wider urban roads cause more travel. In a few states, DOTs are beginning to acknowledge the well-documented science of induced demand.  CalTrans has even produced a helpful non-technical diagram explaining why building more lanes is a futile and self-defeating approach to reducing traffic congestion.

The graphic was prompted by state legislation directing CalTrans to more fully consider the greenhouse gas impacts of its investments.  Perhaps they can share it with other state highway departments. (Hat tip to @EngineerDustin for flagging this graphic).

Higher registration fees for large vehicles.  Trucks and SUVs have been getting both larger and more numerous, and are particularly problematic in cities, where they pollute more and are deadlier in crashes than smaller vehicles.  While the federal tax code subsidizes larger vehicles, some cities are starting to look at levying registration fees that reflect back to large vehicle owners a portion of the cost their vehicles impose on others. David Zipper, writing at Bloomberg, offers the details of a Washington, DC proposal to charge a $500 registration fee for vehicles larger than 6,000 pounds, an amount that's about seven times higher than the standard vehicle registration fee in the district.  While local governments are generally barred from regulating vehicle characteristics, they can use fiscal incentives to discourage excessively large vehicles.  As Zipper concludes:

With traffic fatalities climbing and the effects of climate change growing more dire, passage of the District’s new fee structure could serve as a model. Even if federal officials continue to turn a blind eye, state and local leaders need not stand by while drivers of these massive vehicles impose costs on everyone else around them. Instead, they can send motorists a clear message: If you want to buy a mammoth-sized vehicle, no one is going to stop you — but you’re going to have to pay extra.

New Knowledge

The declining salience of race in emissions exposure.  Many studies have noted strong correlations between race and income and exposure to pollution.  People of color and the poor tend to live in homes and neighborhoods with greater exposure to air pollution and toxics.  That's true for a variety of reasons:  politically, it's often been easier to route highways through poor neighborhoods than rich ones, and there's usually more effective demand to eliminate or preclude pollution in wealthier neighborhoods.  Economically, neighborhoods with high levels of pollution tend to hemorrhage population:  one thing people do when they get more income is to move to places that are less polluted.

A recent study published by the National Bureau of Economic Research looks at recent trends in air pollution to see how the relationships between income, race and pollution have changed in the past couple of decades.  It draws on a kind of natural experiment, a reduction in pollution levels triggered by stricter limits on particulate pollution.  In 2005, the EPA began enforcing a new air quality standard on fine particulate matter (PM 2.5) requiring areas that violated the standard to enforce stricter pollution control requirements.  Over the next 15 years, the new standards led to a reduction in fine particulate pollution overall, but the study showed that the reductions in pollution were greatest in neighborhoods of color (where pollution tended to be higher).  The result of the enforcement of these new standards was to both dramatically reduce overall pollution levels, and to greatly shrink the racial gap in pollution exposure.

The authors conclude:

. . .  racial differences in ambient particulate exposure declined significantly between 2000 and 2015. . . .  We focus on PM2.5 and show that the gap between African Americans and non-Hispanic Whites narrowed from -1.6 µg/m3 in 2000 to -0.5 µg/m3 by 2015. . . . We find that very little of the decline in the gap in mean exposure levels can be accounted for by changes in mobility, individual, or neighborhood-level characteristics. Similarly, we find that racial gaps in exposure have narrowed at each quantile of the PM2.5 distribution, and that little of this narrowing can be explained by the demographic characteristics available in Census data. Instead, we find that virtually all of the closure of the gap is due to falling pollution levels in the areas where African Americans are more likely to live. There is little evidence that movement of African Americans to relatively cleaner neighborhoods or non-Hispanic Whites to relatively dirtier neighborhoods has played a significant role in the observed convergence.

This is a clear example of a scientific and health based standard having substantial and positive equity effects.  The same likely applies to dealing with climate change:  because low income people and people of color are generally at greater risk from climate change than the overall population, measures that reduce greenhouse gas emissions tend to have intrinsically equitable effects.

Janet Currie, John Voorheis, and Reed Walker, "What Caused Racial Disparities in Particulate Exposure to Fall? New Evidence from the Clean Air Act and Satellite-Based Measures of Air Quality," 2020, NBER Working Paper 26659. http://www.nber.org/papers/w26659

The Week Observed, March 4, 2022

What City Observatory did this week

Oregon crosses the road-pricing Rubicon. Starting this spring, motorists will pay a $2 toll to drive Oregon's Historic Columbia River Highway.  Instead of widening the road, ODOT will use pricing to limit demand. This shows Oregon can quickly implement road pricing on existing roads under current law without a cumbersome environmental review or equity analysis.

For more than a decade, the agency has been dragging its feet in implementing road pricing, complaining it didn’t have the authority, or had to undertake lengthy processes first.  Tolling the old Columbia River Highway shows that instead of widening roads, the agency can use pricing to reduce congestion and make the transportation system work better. Perhaps this will be the breakthrough that will lead ODOT to deploy pricing elsewhere.

Must read

Cincinnati will have the best bridge that somebody else will pay for.  For some years, Southern Ohio and Northern Kentucky residents have been complaining about traffic on the Brent Spence Bridge, which carries two interstate highways across the Ohio River.  While many say they want a big new bridge that would almost triple the capacity of the existing structure, neither state has ever found the money.

Now, thanks to the IIJA, the two states are hoping that even though they don't think the bridge is worth the expense, the federal government will foot the bill:  federal grants would allow them to build the multi-billion dollar bridge without having to charge tolls to the people who actually use it. As Kevin DeGood of the Center for American Progress tweeted:  "A bridge project that's so important two states agreed it will only get built if Washington pays for it."

The high price of electric vehicles.  In yet another record-setting round of corporate subsidies, the state of Michigan is paying General Motors to set up electric vehicle manufacturing operations.  The Guardian reports GM is slated to get $1 billion in state subsidies. The company has promised some 4,000 jobs—which works out to a stupendous $310,000 per job, but that's almost certainly an under-estimate of the true cost.  Job creation counts have routinely been over-promised and under-delivered, and even when automakers have failed in the past to deliver promised jobs, they've largely gotten to keep all the subsidies.  It's ironic in a time of record high profits and record low unemployment that states are still easy prey for industrial location blackmail.

New Knowledge

Why going faster doesn’t save us any time.  Central to the economic analysis of transportation projects is the “value of travel time”—an estimate of what potential travel time savings are worth to travelers.  In theory—and thanks to the fundamental law of road congestion, it is way too often just a theory—road investment projects produce economic value because they reduce the amount of time people have to travel to make a particular trip.  But the notion is that a roadway or other transportation system investment that enables us to move faster (more speed) allows us to complete any given trip in less time, and that we can count the value of that time savings as an economic benefit of the transportation project.  (Which is what is commonly done in the benefit-cost studies done to justify roadway improvement projects).

The trouble is there’s no evidence that increased speed gives us additional time that we then devote to other non-transport purposes.  Going faster doesn’t give us more time.  It turns out that we don’t use increased travel speeds to get more time to do other things, we simply end up traveling further and not gaining any additional time for other valuable activities, whether work or leisure:

. . . studies on travel time spending find that the average time persons spend on travelling is rather constant despite widely differing transportation infrastructures, geographies, cultures, and per capita income levels.  The invariance of travel time is observed from both cross-sectional data  and panel data. It suggests that people will not adapt their time allocation when the speed of the transport system changes.

Instead, increased speed of transportation systems is associated with more decentralized land use patterns and longer trips, and all of the supposed “benefit” of increased speed is consumed, in iatrogenic fashion, by the transportation system.  And, importantly, changes in speed trigger changes in land use patterns that can diminish proximity to key destinations:

The higher speed enticed residents of small settlements to shift for their daily shopping from the local shop to the more distant supermarket . . . As a consequence, local shops might not survive, forcing both car owners and others to travel to the city for shopping. The impact of the speed increase is in this case a concentration of shopping facilities, implying a decrease of proximity. The positive impact on accessibility is at least partly reversed by the negative impact on proximity, and the outcome for access is undefined. For those who did not use a car and did not benefit from the speed increase, the result was an unambiguous decrease of the access.

This study suggests that the principal supposed economic benefit of transportation investments–time savings–is an illusion.  Coupled with growing evidence that higher speeds have significant social and environmental costs (lower safety, health effects, and environmental degradation) this suggests that a reappraisal of transport policies is needed.

Cornelis Dirk van Goeverden, “The value of travel speed” Transportation Research Interdisciplinary Perspectives, Volume 13, March 2022.  https://www.sciencedirect.com/science/article/pii/S2590198221002359

Hat tip to SSTI for flagging this research.

In the news

The Portland Business Journal published Joe Cortright’s commentary “$18 for 18 wheelers” detailing the high cost that truckers will pay for the proposed $5 billion I-5 Interstate Bridge Replacement/freeway widening project.

The Week Observed, March 25, 2022

What City Observatory did this week

Who's most vulnerable to high gas prices? Rising gas prices are a pain, but they hurt most if you live in a sprawling metro where you have to drive long distances to work, shopping, schools and social activities.  Some US metros are far less vulnerable to the negative effects of rising gas prices because they have dense neighborhoods, compact urban development, good transit, and bikeable, walkable streets.  Among the 50 largest metro areas, the best performers enable their residents to drive less than half as much as the most car-dependent metros.  Those who live in metro areas where you have to drive, on average, 50 miles or more per day (places like Oklahoma City, Nashville and Jacksonville) will be hit twice as hard by higher fuel prices as the typical household living in a place like San Francisco, Boston or Portland, where people drive, on average, fewer than 25 miles per day.  When gas prices go up, it's easy being green:  these compact, less car-dependent metros and their residents will experience far less economic dislocation than metros where long daily car trips are built into urban form.

Must read

Homelessness is a housing problem.  Homelessness is a gnarly and growing problem.  But what is at its root?  A new analysis from Gregg Colburn and Clayton Page Aldern, summarized at the Sightline Institute, looks at some of the statistical correlates of homelessness by comparing data on "point-in-time" counts of those unsheltered with key socioeconomic variables for different cities.  What characteristics and trends are most closely related to high levels of observed homelessness?  You might think that variations in poverty across cities would be a major factor, but statistically, that doesn't seem to be the case.  The most powerful observed correlation is between housing prices and homelessness:  cities that have very high rents have higher rates of homelessness.

Poverty may be higher in some cities than others, but if rents are low and vacancies are high, then relatively more people, even those with limited incomes, find some shelter.  In contrast, in tight housing markets with high rents and few vacancies, there may simply not be enough housing units to house everyone, and those with the lowest incomes find themselves outbid by those with more money.  The clear implication here is that bringing down rents and expanding housing supply is central to reducing homelessness.

Seattle's disappointing new waterfront highway.  Seattle is close to putting the finishing touches on the new waterfront roadway that's being built where the former Alaskan Way viaduct towered over the city for decades.  While getting rid of the viaduct was a boon, the finished product looks to be a major disappointment.  While the project was sold with parklike, green renderings of the waterfront, the reality is that mostly what has been done is to drop the old elevated highway down to street level.  In places, the new highway will be as many as nine lanes wide, and will separate the city's downtown from its waterfront with a steady stream of speeding cars.  The Urbanist has this critique:

The Washington State Department of Transportation (WSDOT) tore down its hulking waterfront highway viaduct only to place just as many lanes at ground level — and actually more south of Columbia Street, where the road turns into a giant queuing zone for the Seattle Ferry Terminal.

As is so often the case, the project was sold with a glossy bird’s eye view of the project.  The ground level reality is something altogether different.

It's a reminder that the devastation to urban space and urban living isn't just caused by structures; it's a product of the volume of cars, and of prioritizing the movement of those who are traveling through the city over those who actually want to be there.

New knowledge

How to save money on transportation:  Don’t own a car.  The Urban Institute’s Yonah Freemark has a thoughtful analysis of how higher gas prices affect households with different income levels based on their car ownership.  Freemark has examined data from the Bureau of Labor Statistics Consumer Expenditure Survey showing what fraction of a household’s income is spent on transportation, based on car ownership by various income groups.

As the following table makes clear, transportation expenditures are most burdensome for low income households who own a car:  those with incomes of less than $25,000 annually who own a car spend close to 25 percent of their household income on transportation.  Interestingly, low income households who don’t own cars spent the smallest fraction of their household income on transportation—a little more than 5 percent.  For higher income households, those who don’t own cars spend noticeably less (between a half and two-thirds as much) on transportation as those households that own cars.
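The burden measure behind these comparisons is simple: transportation spending divided by household income.  A minimal sketch (the dollar figures below are hypothetical illustrations chosen to match the shares described above; they are not BLS data):

```python
# Transportation burden: spending as a share of household income,
# the measure Freemark tabulates from the Consumer Expenditure Survey.
# Dollar figures are hypothetical, chosen to match the shares
# described in the text; they are not BLS data.
def burden_share(transport_spending: float, income: float) -> float:
    """Transportation spending as a percent of household income."""
    return transport_spending / income * 100

# A low-income, car-owning household near the ~25 percent burden:
print(burden_share(5_500, 22_000))   # 25.0

# A similar household without a car, near the ~5 percent burden:
print(burden_share(1_100, 22_000))
```

The gap between those two shares is the budget room a household gains by being able to live car-free.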

In his accompanying commentary addressing the policy implications of this work, Freemark appropriately ties together housing and transportation costs as key drivers of family budgets.  While much of the policy response to higher gas prices aims at buffering consumers from higher gas prices, a logical alternative is creating more opportunities for households, especially lower income households, to live in communities where they don’t need to own a car to enjoy a decent standard of living.  One key to helping deal with expensive gas is to make sure that we have affordable housing, especially in dense, transit-served urban locations, where households can escape both the high cost of gasoline and the high cost of car ownership.

Freemark’s analysis can be read as a complement to our commentary on variations in miles driven among metropolitan areas.  If communities or households make decisions to choose car-free or car-lite options, they’ll spend less on transportation, and be far less vulnerable to economic shocks when oil prices rise.

Yonah Freemark, What Rising Gas and Rent Prices Mean for Families with Low Incomes, The Urban Institute, March 17, 2022

The Week Observed, March 18, 2022

Must read

The problem with the "reckless driver" narrative.  Strong Towns' Chuck Marohn eloquently points out the deflection and denial inherent in the emerging "reckless driver" explanation for increasing car crashes and injuries.  Blaming a few reckless drivers for the deep-seated systemic biases in our road system is really a convenient way to avoid asking hard questions about the carnage we accept as routine.  Many drivers routinely engage in risky behaviors, from speeding to texting, to other distractions.  What's changed during the pandemic is that there are more opportunities for these risks to turn catastrophic, because there's less traffic.  Ironically, traffic congestion holds down speeds and forces drivers to pay attention, diminishing possible crashes.  The problem isn't the recklessness, it's all the driving.

Focusing on reckless driving is obsessing over the smallest fraction of the underlying cause, but it fits a narrative that engineers, transportation planners, and traffic safety officials are more comfortable with, one that puts the blame on others, primarily the Reckless Driver.

Where college students want to live after graduation.  Axios has a new poll asking an important, post-pandemic (we hope) question:  Where will young workers want to live?  If remote work is as popular and prevalent as some think, young adults seemingly have more choice than ever about where to live and pursue their careers.  The Axios survey shows that the new hotspots are . . . the same as the old hotspots:  big vibrant cities, chiefly on the coasts.

The great tech-hub exodus that didn’t actually happen.  In theory, the advent of work-at-home ought to have undermined the need to be located in an expensive, superstar tech-hub.  Finally, it was supposed to be the opportunity to flourish for second-tier cities that had been bypassed by the continuing concentration of firms and talent in a relative handful of established tech centers.  But according to this article from WIRED, the theory isn’t panning out:  for the most part, tech jobs remain concentrated in the long-established tech hubs.  Drawing on data from the Brookings Institution, they report that:

The tech sector grew by 47 percent in the 2010s, and in the latter half of that decade, nearly half of tech job creation occurred in eight “superstar” metro areas: San Jose, New York, San Francisco, Washington DC, Seattle, Boston, Los Angeles, and Austin. By the end of the decade, those eight cities comprised 38.2 percent of tech jobs.

A key factor underlying the persistent success of these tech centers is knowledge spillovers.  WIRED quotes UC Berkeley’s Enrico Moretti, whose research points out the tech centers serve as force multipliers for idea creation and dissemination:  scientists are more productive when they’re embedded in an environment with others, than when they are widely scattered.

New knowledge

Why highway widening is inherently inequitable.  Who benefits when highways are widened? Urban traffic congestion is predominantly a “peak hour” problem, affecting those who travel by car during travel peaks, particularly those coinciding with morning and afternoon commute trips.  A new study from the University of Montreal looks at the socioeconomic characteristics of those who travel by car on congested roadways to discover who really benefits from highway widening.   The authors use a travel survey covering more than 300,000 trips in Montreal, disaggregated by household income, trip time and trip mode to compare the equity implications of highway expansion projects designed to reduce peak hour travel times.  They find an unequivocal result:
. . . investments in roadways made to reduce traffic congestion lead to inequitable benefits. . .  . because fewer low-income workers and low-income travelers travel by car and at peak times due to their job scheduling and activities. Also, travelled distances of low-income workers are generally shorter so that benefits of flow improvements are more modest. As such, congestion mitigation disproportionally advantages higher-income groups in terms of travel speed and time.
There are four key reasons why highway expansions disproportionately benefit higher income households relative to lower income households.
First, lower income households are far less likely to travel by car.  In Montreal, only 50.4% of trips by low income households are made using a car, compared to 69% for the entire sample and 75.3% for the wealthiest group.
Second, lower income households are far less likely to travel at the peak hour, during congested travel times.  In Montreal, only about 40 percent of low income households commute during the AM and PM peak hours, compared to about 58 percent of high income households.
Third, lower income households take shorter trips than higher income households, which means that even when they do travel by car at the peak hour, the time savings for them are smaller than for higher income households.  In Montreal, low-income car drivers travel shorter distances by car during the morning peak period. Highest income car drivers travel on average distances that are nearly 60% greater (median of 5.3 vs. 8.4 km).  Even low income households in suburban areas take shorter trips.
Fourth, low income households are less affected by traffic congestion, and therefore have less to gain from its reduction.  The authors estimate that low income households experience a travel time index (the ratio of peak hour travel time to free flow travel time) of 1.3, meaning a peak hour trip takes 30 percent longer than a free flow trip.  Higher income travelers, by comparison, experience a travel time index of 1.5.  Higher income households disproportionately benefit from congestion reduction.
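The arithmetic behind the third and fourth points can be sketched directly.  A minimal illustration, using the study’s reported median trip distances and travel time indices, but assuming a 60 km/h free-flow speed (our assumption, not a figure from the paper):

```python
# Illustration of per-trip time savings by income group.
# Trip distances (km) and travel time indices are the Montreal study's
# reported values; the 60 km/h free-flow speed is our own assumption.

FREE_FLOW_SPEED_KMH = 60  # assumed, not from the study

groups = {
    "low income":  (5.3, 1.3),   # (median AM-peak trip km, travel time index)
    "high income": (8.4, 1.5),
}

for name, (dist_km, tti) in groups.items():
    free_flow_min = dist_km / FREE_FLOW_SPEED_KMH * 60
    congested_min = free_flow_min * tti
    # Best case: a widening restores free-flow speeds for the whole trip.
    max_saving = congested_min - free_flow_min
    print(f"{name}: at most {max_saving:.1f} minutes saved per trip")
```

Under these assumptions, the highest income drivers stand to save roughly two and a half times as many minutes per trip as low income drivers, even before accounting for their greater propensity to drive at the peak at all.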
As an aside, we note that this study doesn’t consider the fact that highway expansion projects generally fail as a means of reducing travel times, due to the well-established fundamental law of road congestion, which means that effectively no one benefits from highway expansion projects.  But even then, as the authors note, highway expansions tend to lead to further decentralization of economic activity and longer trips, which itself works to the relative disadvantage of those without cars.
There’s an unfortunate excess of glib and empty rhetoric about the importance of equity considerations in transportation projects.  This study shows that highway expansion projects are, by their very nature, inequitable.  No amount of PR whitewash should be allowed to obscure this fundamental fact.
Ugo Lachapelle &  Geneviève Boisjoly, The Equity Implications of Highway Development and Expansion: Four Indicators, Université du Québec à Montréal, March 11, 2022, https://doi.org/10.32866/001c.33180

The Week Observed, March 11, 2022

What City Observatory did this week

Freeway widening for whomst:  Woke-washing the survey data.  Highway builders are eager to cloak their road expansion projects in the rhetoric of equity, and have become adept at manipulating images and statistics.  In their efforts to sell the $5 billion I-5 freeway widening project in Portland, ODOT and WSDOT have rolled out the usual array of stock photos (this one cribbed from women’s health campaigns) to demonstrate their “commitment” to equity.

The DOT Equity Commitment: Smiling stock photograph actors. (IBR)

More ominously, the two agencies are also using an unscientific and biased web-based survey to falsely claim that the primary beneficiaries of their highway-widening project are low income people and people of color.  We dig into their survey data and find it grossly undercounts low income populations, people of color, and young adults, and is heavily biased toward whiter, wealthier groups and those who commute regularly over the bridge.  The data are so skewed and unrepresentative that they can’t be used to make valid statistical statements about who travels where in the region, or how the public feels about the proposed project.  Census data make it clear that the primary beneficiaries of this freeway widening project are peak hour, drive alone commuters who have a median income of $106,000, 86 percent of whom are non-Hispanic whites.  In addition, the survey failed to disclose that the project would require tolls of at least $2.60, and didn’t ask any respondents how they felt about tolling.

Must read

A parking inventory for the Bay Area.  Don Shoup’s magisterial work, The High Cost of Free Parking, itemizes the many ways that we have to pay for parking through higher rents, longer distances to all our destinations, and traffic congestion.  Part of the opaqueness of parking economics stems from a lack of data.  We have detailed counts of the population, of businesses, of road traffic, and of housing units, but almost no one comprehensively counts parking spaces.  A new study from the San Francisco Bay Area urban research group SPUR steps into that void.  It shows the Bay Area has roughly 15 million parking spaces, most of them located on public rights of way and “free” to use.  Parking is abundantly supplied:

There are 1.9 spaces per person, 2.7 spaces per employed individual, and 2.4 spaces per auto and light-duty truck. . . . Parking area constitutes 7.9% of the total incorporated area.

That’s in stark contrast to housing.  The Bay Area famously has extremely expensive housing, in part because land use policies have allowed only 3.6 million housing units for people, less than a quarter of the number of parking spaces available for cars.  And while most parking spaces are free and pay little or nothing in taxes, houses are expensive, heavily taxed, and heavily regulated.  The parking inventory clearly signals a regime in which vehicles are vastly more privileged than people.
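A quick back-of-the-envelope check makes the people-versus-cars contrast concrete.  The space and housing-unit counts are the figures cited above; the population is implied by SPUR’s “1.9 spaces per person” ratio rather than taken directly from the report:

```python
# Back-of-the-envelope check on the SPUR parking inventory figures.
# SPACES and HOUSING_UNITS are the counts cited in the text; the
# population is derived from SPUR's "1.9 spaces per person" ratio.

SPACES = 15_000_000        # Bay Area parking spaces (SPUR estimate)
HOUSING_UNITS = 3_600_000  # Bay Area housing units

implied_population = SPACES / 1.9
spaces_per_unit = SPACES / HOUSING_UNITS

print(f"implied population: {implied_population / 1e6:.1f} million")
print(f"parking spaces per housing unit: {spaces_per_unit:.1f}")
```

Roughly 4.2 parking spaces for every housing unit, which is the “less than a quarter” comparison stated the other way around.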

The “User Pays” Road Finance Myth.  There’s a widespread fiction that the road system is fully paid for by user fees.  A new report from The Frontier Group reminds us, once again, that’s just not true.  User fees, like fuel taxes and vehicle registration fees, cover only about half of the direct costs of the road system, and pay almost nothing toward the substantial external and social costs of driving.

The road finance system is regularly bailed out by taxes on unrelated activities (like groceries in Virginia), and Congress has bailed out the Highway Trust Fund with over $156 billion from general funds, in addition to billions more in the IIJA.  Drivers not only don’t pay the costs of the roads they drive on, they impose huge costs on others from crashes ($474 billion), air pollution ($41 billion) and greenhouse gas emissions ($70 billion).  By all means, let’s have a user pays transportation system, because we certainly don’t have one now.  Frontier Group has specific recommendations for building on existing revenue systems (the gas tax does tax carbon pollution) and adapting others, like congestion pricing.

Need for a fundamental shift in transportation policies.  California’s Strategic Growth Council has taken a deep dive into the state’s infrastructure policies, and is calling for a dramatic change in funding priorities to tackle climate change.  Key to this is shifting transportation investment away from freeway expansion.  They write:

“. . . a significant share of funds at the state, regional, and local levels continue to be spent on adding highway lanes and other projects that increase vehicle travel. This funding not only adds to the maintenance burden of an aging highway system but also means less available funding for other investments that might move more people (such as running more buses or prioritizing their movement) without expanding roadways or inducing additional vehicle travel and provide Californians with more options to meet daily travel needs. Additionally, in most situations, particularly in urban areas, adding highway lanes will not achieve the goals they were intended to solve (such as reducing congestion) as new highway capacity often induces additional vehicle travel due to latent demand that then undermines any congestion relief benefit over time. Critically, these projects also add burdens to already impacted communities along freeway corridors with additional traffic and harmful emissions, and by further dividing and often displacing homes and families in neighborhoods that were segmented by freeways decades prior.”

The legislatively mandated report sets the stage for better policies.  If California is really serious about its climate change objectives, now is the time to stop building more highway capacity.

New knowledge

Cars make cities less compact.  The chief economic advantage of cities is that they bring people closer together, enabling social interaction and promoting productivity.  While at the margin, for an individual, a car makes things in a city easier to reach, that’s not true when more and more people use cars.  Over time, the advent of car ownership has caused population and economic activity to sprawl outward, undercutting the key advantage of proximity.

In this study, a global analysis of car ownership levels and population density shows that cities with higher levels of car ownership have systematically lower levels of density.

That’s long been understood, but there’s a technical criticism that it may be the low density of places that “causes” more car ownership, rather than the other way around.  This paper tackles that issue with an instrumental variables methodology, using the establishment of car manufacturing facilities as an instrument for car ownership, and finds that causality runs from car ownership to lower density (the advent of more car manufacturing explains the decline in density).  While that analysis is convincing, it is very likely that, over time, as car ownership increases and density declines, there is also a feedback loop from lower density to more car ownership:  as land uses become more dispersed, more households are forced to acquire cars to maintain access, which in turn triggers further declines in density.  Now, decades after the widespread adoption of cars in advanced economies, it may be impossible (and pointless) to try to fully separate cause and effect.  The key finding of this study, however, is that in lower income nations, more widespread car ownership is likely to lead to further declines in urban population density, which in turn will increase cities’ carbon footprints and make it all the more difficult to fight climate change.

Francis Ostermeijer, Hans Koster, Jos van Ommeren, Victor Nielsen, Cars make cities less compact, VOX EU, 8 March 2022, and

Ostermeijer, F, H R A Koster, J van Ommeren and V M Nielsen (2022), “Automobiles and urban density”, Journal of Economic Geography, forthcoming.

In the news

Clark County Today re-published our commentary “$18 for 18 wheelers” on proposed truck tolls for the proposed Interstate 5 freeway widening project in Portland.

The Week Observed, February 25, 2022

What City Observatory did this week

Freeway widening for whomst?  Woke-washing is all the rage among those pushing highway projects these days, and there’s no better example than Portland’s I-5 “bridge replacement” project (really a 5 mile long, 12 lane wide, $5 billion road expansion).  It’s being sold as a boon for low income workers pushed to distant suburbs by rising home prices.  But the reality, as told by Census data, is very different:  the peak hour, drive-alone commuters who are the principal beneficiaries of the project are overwhelmingly wealthy and white.  86 percent of them are non-Hispanic whites, and their median income is $106,000; both figures are far greater than for the rest of the Portland metro area.


Must read

The housing theory of everything.  This is a long essay, but well worth the read (as the authors say, it is “a theory of everything”).  Their core argument is that our housing policy failures spill over into everything from the wealth gap and inequality, to urban sprawl, political polarization and climate change.

The tight supply of housing in the nation’s most productive metropolitan areas has both equity and economic growth implications. The US economy is measurably smaller than it would be if more workers lived in the most productive places, and the high cost of housing has made it harder for lower income workers to migrate to the places with the greatest opportunity—and has limited their economic gains when they do move there.

The scope and subject matter of this essay (though perhaps not its style) is an echo of Kim-Mai Cutler’s magisterial essay “How burrowing owls lead to vomiting anarchists.”  Both works, in different ways, compellingly tie together urban problems that superficially seem disconnected.  It’s something you’ll want to read.

As the pandemic ebbs, driving rebounds.  Bryn Huxley-Reicher of the Frontier Group has published a great analysis of the rebound in driving in the past few months and frames a more important question:  what next?

Little surprise that the waning of the pandemic has produced a rebound in driving nearly as sharp as the decline we experienced starting in March 2020. Per capita driving has bounced back almost to the same level it was at prior to the pandemic. This is one important bit of evidence to suggest that at least some of the changes we observed during the pandemic are transitory—and it remains to be seen how much of other trends, particularly work at home, will revert to their pre-pandemic form.

The big question going forward, for climate and for the economy, is whether VMT per capita will increase in the years ahead. Increased driving makes attaining our climate goals harder, and also increases the financial burden on households. Huxley-Reicher helpfully looks back at previous forecasts of future VMT growth made by the US departments of energy and transportation. There’s a consistent “hockey stick” quality to these forecasts. Even though driving has been flattening out in recent years, the forecasts always predict a significant acceleration of driving, which never seems to happen. The black line showing “actual” VMT per capita is almost invariably lower than the low end of the federal forecasts. It’s indicative of the underlying bias in much of this modeling, and also fails to explicitly recognize that policy can, and given our climate crisis, should be making it possible for Americans to drive less—something we got a small taste of during the past two years.

Jerusalem Demsas on gentrification and housing.  (Really, a “must watch.”) There’s a fair amount of folk wisdom when it comes to housing markets, notably a tendency to deny some of the fundamental precepts of market economics. For example, there’s a widespread and oft-repeated claim that the construction of new market rate housing in a neighborhood drives up prices for surrounding houses.  It’s common to single out the nearly ubiquitous “five over one” apartment buildings that are going up in cities around the country as harbingers of gentrification and displacement.  This video challenges that view.

Understandably, it is the kind of inference people draw when they see new housing going up in “hot” neighborhoods. But this casual inference gets cause and effect backwards:  it’s rising demand, signalled by rising prices, that prompts the new construction, not the other way around. Moreover, choking off the supply of new housing only further constrains supply and tends to push prices higher. Demsas offers some simple illustrations of how this works that, despite their simplicity, are grounded in some of the latest and most thorough academic research on the subject.

New Knowledge

Neighborhood effects:  What we know.   For decades, scholars have been studying how where you live influences your lifetime opportunities.  You’ve heard the aphorism that “zip codes are destiny,” and there are plainly patterns of poverty and prosperity, opportunity and struggle across the landscape of the nation, and within cities, among different neighborhoods.  What’s been difficult is separating out the effects that are due to sorting (successful people moving to prosperous places, and impoverished ones moving to, or trapped in places of concentrated poverty), from the causal effects of place.

Famously, a kind of social science experiment, triggered by the relocation of tenants from Chicago housing projects in the wake of a lawsuit challenging racial discrimination, unleashed a flood of data and scholarship on the subject.  Two of the leading authors in the field, Lawrence Katz and Erik Chyn, have written a summary of this evolving field for the Journal of Economic Perspectives.  It’s a great way to get familiar with what’s been learned about neighborhood effects in the past few decades.

There’s more here than can be summarized neatly, but one of the key findings has to do with differing results for children and adults.  It turns out that the kind of neighborhood you grow up in has profound and lifelong effects.  Kids who move to more opportunity rich locations, especially those who do so at an early age, tend to significantly outperform otherwise similar peers who remain in disadvantaged neighborhoods, on a range of social and economic indicators.  But the older the child at the time of the move to a stronger neighborhood, the smaller the effect.  And for adults, the literature shows that moving to an opportunity-rich neighborhood seldom has a statistically significant impact on key economic outcomes.  The perspectives and networks we establish in our formative, earlier years seem to have a lasting effect.  As the authors conclude:

The Moving to Opportunity experiment generated beneficial impacts on long-run economic outcomes of moves to higher-opportunity areas only for younger children who received a larger “dosage” of childhood exposure to improved neighborhood environments relative to their older counterparts. Disruption costs of moves across different types of neighborhoods could potentially outweigh the small exposure gains for older children.

This body of research is central to our thinking about poverty–especially understanding the devastating effects of neighborhoods of concentrated poverty–and related problems, like discrimination.  There’s much more here, and whether you’re looking for an introduction to the scholarship on the subject or a refresher and update, this is an indispensable article.

Erik Chyn and Lawrence Katz, “Neighborhoods Matter: Assessing the Evidence for Place Effects,” Journal of Economic Perspectives, Volume 35, Number 4, Fall 2021, pages 197–222

In the news

Planetizen summarized some of the crazy rent inflation numbers being reported for markets around the country, and reminded its readers about our critique of some indices which rely on noisy and volatile on-line listings data to estimate inflation.

The Week Observed, January 7, 2022

What City Observatory did this week

1. Metro’s failing climate strategy. Portland Metro’s Climate Smart Strategy, adopted in 2014, has been an abject failure. Portland area transportation greenhouse gasses are up 22 percent since the plan was adopted: instead of falling by 1 million tons per year, emissions have increased by 1 million tons annually, to more than 7 million tons, putting us even further from our climate goals.

Metro’s subsequent 2018 RTP has watered down the region’s climate effort far below what is needed to comply with Oregon’s statutory greenhouse gas reduction goal, based on the assumption that 90 percent of emission reductions would be accomplished with cleaner vehicles. All of Metro’s key assumptions about transit, vehicle turnover, technology adoption, and driving have been proven wrong. The plan has set a goal for reducing vehicle miles traveled that is actually weaker than the reductions the region achieved in the decade prior to the adoption of the “Climate Smart Strategy.” The agency has not acknowledged the failure of its climate efforts, and is at the same time moving forward to allow the Oregon Department of Transportation to build a series of freeway widening projects that will add more than 140,000 tons of greenhouse gasses per year.

2. Why the I-5 Bridge Replacement is a Climate Disaster. The plan to spend $5 billion widening the I-5 bridge over the Columbia River would produce 100,000 additional metric tons of greenhouse gases per year, according to the induced travel calculator. For comparison, Metro’s 2020 transportation package would have cut greenhouse gases by just 5,200 tons per year, one-twentieth of the additional greenhouse gases created by the freeway widening.

3. Oregon, Washington advance I-5 bridge based on outdated traffic projections.  The Oregon and Washington Departments of Transportation are advancing their $5 billion freeway widening plan based on outdated 15-year-old traffic projections. No new projections have been prepared since the 2007 estimates used in the project’s Draft Environmental Impact Statement.

The two state DOTs are essentially “flying blind,” assuming that outdated traffic projections provide a reasonable basis for sizing and designing a new bridge and for rejecting other alternatives. The two agencies have spent two years and tens of millions of dollars, but have not done the most basic preliminary work to accurately predict future traffic levels. The Oregon DOT has specifically violated Governor Kate Brown’s pledge that new traffic analyses would be done prior to determining the “best solution” for the I-5 bridge project. The two agencies have no plans to publish new traffic studies until mid-to-late 2022, months after determining a final design and asking local sponsors to approve it.

Must read

How we build cities matters for climate policy.  Rushaine Goulbourn and Jennie Schuetz of Brookings Metro offer a comparative analysis of housing development patterns in three US metro areas (Atlanta, Chicago, and Washington) that sheds important light on the connection between urban density and combating climate change.  People living in denser, more centralized metro areas tend to drive less, and walk and use transit more, than their counterparts in more sprawling metro areas.  While historically most housing growth has occurred at the periphery, that’s beginning to change, and our ability to capitalize on the demand for urban living could pay important environmental dividends.  The encouraging sign is that central cities have been doing somewhat better.

However, the urban core makes up a larger share of new housing post-Great Recession in all three metro areas. Since 2010, permits in the urban core were nearly half of total permits in Chicago and more than one-quarter of total permits in Atlanta and Washington, D.C. These trends are consistent with increasing demand for high amenity central cities across much of the U.S.

Misapplying value capture to parking policy.  Writing at Substack, Darrell Owens criticizes the tortured logic that some non-profit advocates have used to try to hold parking reform hostage in a vain attempt to generate additional funds for affordable housing.  It’s long been recognized that parking requirements drive up the cost of development, particularly for the most modestly priced apartments. Policy has focused on lessening or eliminating parking requirements, but the notion that this might somehow result in added profits for real estate developers or land owners has led some affordable housing advocates to insist on extracting a payment in exchange for rolling back parking requirements. In practice that hasn’t worked; by contrast, recent experience in San Diego shows that simply lessening parking requirements produced far more construction, including affordable construction, than the more tortured negotiation approach.  As Owens writes:

. . . after parking minimums near transit were abolished in 2018, density bonus units increased by five-fold. This means developers were more likely to provide on-site low-income units after reduction of parking requirements, not less (as predicted by planning value capture theory). And non-profit low-income housing developers massively increased their production during this period.   . . . Real life evidence does not suggest that parking requirements are being traded for affordable units. . . . given what we have learned about the need to reduce auto emissions (EVs alone won’t do it, according to the California Air Resources Board), requiring parking as pretextual planning and praying that private developers will want to bargain it away is courting climate disaster.

New Knowledge

Explaining the induced demand calculator.  UC Davis researcher Jamey Volker presented a webinar explaining the scientific basis of the Induced Travel Calculator.  The calculator serves as the model for the Rocky Mountain Institute’s SHIFT calculator, and is also being used in California as part of environmental reviews for highway expansion projects.  Volker’s presentation succinctly describes the key mechanisms behind induced travel, summarizes the growing body of literature that confirms the quantitative estimates of its impact, and discusses how the calculator can be applied in environmental reviews.

UC Davis researcher Jamey Volker

The key insight behind the theory of induced demand is that improvements to a roadway, such as additional lanes, or other measures that improve the flow of traffic, have the effect of lowering the “time cost” of travel.  Faced with a lower time cost of travel, people’s behavior quickly changes:  they take more trips, longer trips, trips to more distant destinations, and choose car travel more often.  In addition, over time, household and business location decisions may shift to a more dispersed pattern.  The net result of all of these behavioral shifts is to create additional travel, which quickly returns congestion levels to pre-improvement conditions.

A key problem is that the travel models used by state highway agencies don’t accurately reflect what’s known about induced demand.  Volker explains:

Many models, and particularly the traditional four-step models, do not include all the feedback loops necessary to capture the behavioral changes caused by capacity expansion.  So, for example, not many models feed changes in estimated travel times into the trip distribution or trip generation stages of the model, and they ignore that improved travel times from a capacity expansion could increase the number of trips that households and freight operators choose to make, or cause them to choose more distant trip destinations.

Over the past decade, a series of econometric studies, chiefly in the United States, but also in Europe and Japan, have confirmed the specific quantitative relationship between capacity expansions and increased traffic.  As Volker summarizes the research, studies have generally converged on a “unit elasticity” of traffic with respect to capacity expansions:  a one percent increase in capacity is associated with a one percent increase in vehicle miles traveled.
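The unit-elasticity rule above reduces to very simple arithmetic.  A minimal sketch (our illustration, not the UC Davis calculator itself; the corridor figures are hypothetical):

```python
# Sketch of the unit-elasticity arithmetic behind induced travel
# estimates. Simplified illustration only; the corridor figures
# below are hypothetical, not drawn from any actual project.

def induced_vmt(base_vmt, base_lane_miles, added_lane_miles, elasticity=1.0):
    """Additional annual VMT implied by a capacity expansion.

    With a unit elasticity, a given percentage increase in lane-miles
    produces the same percentage increase in vehicle miles traveled.
    """
    pct_capacity_growth = added_lane_miles / base_lane_miles
    return base_vmt * elasticity * pct_capacity_growth

# Hypothetical corridor: 100 lane-miles carrying 500 million VMT per year,
# widened by 10 lane-miles, a 10 percent expansion.
extra = induced_vmt(500_000_000, 100, 10)
print(f"{extra / 1e6:.0f} million additional VMT per year")
```

A 10 percent capacity expansion on this hypothetical corridor induces 50 million additional vehicle miles per year, which is why the time savings from widening evaporate.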

That’s important because it underscores the futility of adding capacity to try to reduce congestion:  more capacity in urban areas simply generates more traffic, undercutting any time savings and increasing traffic and greenhouse gas emissions.

Finally, Volker points out that the calculator provides a scientifically-grounded means of efficiently estimating the amount of additional vehicle miles traveled that would be generated by a highway expansion, and by extension, the amount of added pollution and greenhouse gases.

. . . there’s a burgeoning movement here to look more at induced travel, whether it’s under CEQA analysis or the NEPA analysis in another state, or a joint NEPA analysis in California. Induced travel is so important for these analyses and it’s also important for cost-benefit analyses; how much VMT your project causes has a direct relationship with how much air pollution or greenhouse gas emissions will be emitted, and also what the costs and benefits of the project will be. Is it going to actually reduce congestion and if so, for how long? That could significantly affect the benefit calculus of cost benefit analysis.

Caltrans has already started using the UC Davis calculator in its project assessments, and has provided funding for UC Davis to further refine the calculator.  Other state DOTs that profess to care about climate ought to be doing the same.

Dr. Jamey Volker, “Caltrans Planning Horizons: Induced Vehicle Travel in the Environmental Review Process.” Webinar. September 29, 2021

In the news

BikePortland quoted City Observatory’s analysis of the biased way in which state highway departments use traffic models to downplay the negative environmental effects of highway expansions.

The Week Observed, January 14, 2022

What City Observatory did this week

What does equity mean when we have a caste-based transportation system? Transportation and planning debates around the country increasingly ponder how we rectify long-standing inequities in transportation access that have disadvantaged the poor and people of color.  In Oregon, the Department of Transportation has an elaborate “equitable mobility” effort as part of its analysis of tolling. And Portland’s Metro similarly is reviewing transportation trends to see how they differentially affect historically disempowered groups.

This discussion is fine, and long overdue, but for the most part ignores the equity elephant in the room:  America’s two-caste transportation system.  We have one transportation system for those wealthy and able enough to own and drive a car, and another, entirely inferior transportation system for those too young, too old, too infirm or too poor to be able to either own or drive a private vehicle.  There are other manifest inequities in the transportation system, but most of them stem from, or are amplified by this two-caste system.

Those in the lower caste face dramatically longer travel times, less access to jobs, parks, schools, and amenities, and dramatically greater risk in traveling than those in the privileged caste.  In a policy memorandum to Portland’s Metro regional government, we’ve highlighted the role of the caste system, and pointed out that it’s virtually impossible to meaningfully improve equity without addressing this divide.

Must read

The look of gentrification.  One of the chief criticisms of gentrification is aesthetic:  supposedly, new buildings or new businesses both symbolize and cause gentrification.  A fancy coffee shop, a new apartment block.  But Darrell Owens challenges the notion that these symbols tell us much about the causes of gentrification, and even less about its cures.

As Owens argues, the cause of gentrification is the increased demand for housing in what were formerly disinvested neighborhoods, coupled with the tendency of new arrivals to insist on “status quo-ism”–trying to keep things from changing.  It’s tempting to paint particular people or buildings as victims, but this misses the underlying causes:

Caricatures like skinny white bike bros or some scooter make convenient distractions away from the longstanding unaffordability that the original gentrifiers often created. Many of the first wave gentrifiers had consumed existing housing back when it was cheap, then made it impossible to add more housing to mitigate additional residents like themselves, thus increasing displacement onto incumbents. Bernal, the Haight, South Berkeley, North Oakland—all these neighborhoods experienced this. “I’m the good one,” the first-wave gentrifier insists. “The neighborhood gentrified only after I got here.”  No, the neighborhood gentrified because you got here.

To Owens, the gentrification occurs because of the limits imposed on the construction of new housing:  It isn’t the new buildings that cause gentrification, it’s the ban on new buildings that has the effect of driving up prices and displacing long-time residents.

Those who fought attempts to grow the housing capacity of our old neighborhood got what they wanted: all the same old houses, parks and stores. But at the expense of the people who had lived there in the first place by trading them for new arrivals. Population growth does not require displacement when you prioritize making space rather than the aesthetic of buildings.

A rural reality check on remote work.  The increasingly wide adoption of remote work in the wake of the Covid pandemic has given rise to renewed prognostications that rural economies will be buoyed by an influx of talented workers from cities.  Writing at Cardinal News, Dwayne Yancey offers a reality check and some cautions.  A first important bit of context is acknowledging that the population decline in rural America is deep-seated and long-lived:  rural counties are older, deaths outnumber births, and much of this is baked in the demographic cake.  While migration is benefiting some rural areas, it tends to be those places with high levels of amenities, and formerly rural counties that have become de facto exurbs by virtue of proximity to strong metro areas.  Also working against a general rural rebound is the continuing decline in the number of Americans who move each year–a trend which continued during the pandemic.  Finally, there’s the issue of “The Big Sort”–Bill Bishop’s term for the increasing tendency of migrants to choose to move to communities where people share their interests.  Migrants from predominantly blue cities will tend to gravitate to those communities that echo their values, a finding highlighted in a recent Redfin survey.  As Yancey concludes:

This Redfin report raises the question of whether left-leaning office workers are going to want to pick up their computers and move to conservative areas. If not, then that greatly reduces the opportunities for those conservative rural areas to benefit from any migration of remote workers. Maybe only the conservatives will relocate and maybe that will be enough, maybe more than enough, to make a difference demographically.

 

New Knowledge

Racial discrimination in renting.  It’s long been the case that Black and Brown Americans have faced discrimination in rental housing markets. A new study measures discrimination in large metro areas.

The study tests for discrimination by sending landlords inquiries about potential rental apartments using fictional applicant names that are highly correlated with White, African-American or Hispanic identities.  For example, the survey sent out inquiries identified as being from Shanice Thomas, Pedro Sanchez, and Erica Cox among others. The study randomly sent more than 25,000 inquiries to property managers, and tallied the rate of responses by racial/ethnic identification in 50 metro areas across the country.

Overall, the study found that potential renters with identifiably African-American or Hispanic names were less likely to receive a response from a property manager than those with identifiably White names.  African-American names were about 5.6 percent less likely to receive a response; Hispanic names were about 2.7 percent less likely to receive a response.
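The underlying comparison is straightforward; here is a minimal sketch using invented counts (the real study sent more than 25,000 inquiries across 50 metros), with each gap expressed relative to the White-name response rate:

```python
# Invented counts for illustration only; chosen so the relative gaps
# roughly match the study's reported 5.6 and 2.7 percent figures.
inquiries = {
    "White":            (1000, 500),   # (inquiries sent, responses received)
    "African-American": (1000, 472),
    "Hispanic":         (1000, 486),
}

rates = {name: resp / sent for name, (sent, resp) in inquiries.items()}
for name in ("African-American", "Hispanic"):
    # Relative gap: how much less likely than a White-named applicant?
    gap = (rates["White"] - rates[name]) / rates["White"]
    print(f"{name}: {gap:.1%} less likely to receive a response")
```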

The study stratified its findings by city, so it’s possible to compare the degree of discrimination by region and metropolitan area.  Overall, the study found that discrimination was more widespread in the Northeast and Midwest, and tended to be lower in the West.  It also found a correlation between racial/ethnic segregation and discrimination:  metros with higher levels of segregation tended to have higher levels of measured rental discrimination against Black and Brown applicants.

The study ranks metropolitan areas by discrimination (as measured by differential response rates).  The following chart shows the response rate differential for African-Americans (left column) and Hispanics (right column), ranked from highest (most discrimination) to lowest.  The horizontal lines on the chart are the 90 percent confidence interval of the estimated differential for each metro area; lines intersecting the zero vertical line are not significantly different from zero.

Chicago, Los Angeles and Louisville have the highest levels of disparity for Black respondents; New Orleans, Columbus and Jacksonville have the lowest.  Louisville, Houston, and Providence have the most discrimination against Hispanics;  Phoenix, Sacramento and Tampa have the least.  Interestingly, the authors report that there is very little correlation between discrimination against Black as opposed to Hispanic applicants at the Metro level.  This suggests that local market factors may be at work.

Peter Christensen, Ignacio Sarmiento-Barbieri & Christopher Timmins, “Racial Discrimination and Housing Outcomes in the United States Rental Market,” NBER Working Paper 29516, DOI 10.3386/w29516.

In the news

Bike Portland directed its readers to City Observatory’s analysis of Metro’s failing “Climate Smart Strategy.”

The Week Observed, January 21, 2022

What City Observatory did this week

Metro’s “Don’t look up” climate strategy.  In the new film, Leonardo DiCaprio and Jennifer Lawrence play scientists who find that the nation’s leaders simply refuse to take seriously their warnings of an impending global catastrophe. Their efforts even produce a backlash, as skeptics simply refuse to look at the sky, even as a planet-killing comet becomes visibly larger day after day.

In Portland, life imitates art.  Metro, Portland’s regional government, says it has a plan to reduce transportation greenhouse gases.  But in the eight years since adopting the plan, the agency hasn’t bothered to look at data on the region’s production of greenhouse gases from transportation—which have increased 22 percent, or more than one million tons annually.  In effect, Metro’s Climate Plan is “Don’t Look Up.”

Must read

How would large scale upzoning affect land prices?  At Strong Towns, Daniel Herriges argues that the belief that upzoning must always raise land prices is a classic example of the fallacy of composition.  His key point is that the price and market dynamics of upzoning a single lot (or a limited area) in a market with high demand will produce very different effects than a much broader upzoning.  In a world where multi-family development sites are rare, prices would be higher than where such sites were common.  To the extent that small-scale spot rezonings increase the values of those properties, it’s because such properties are rare.  And the real question is not whether the value of upzoned properties increases, so much as what happens to the values of already more densely zoned property; arguably, increasing the number of properties where more density is allowed decreases the value (or lessens the increase in value) of sites that previously had that designation.

Driving inflation.  Headline CPI inflation reached roughly seven percent last month, and the biggest increases in prices in the past year have been closely linked to the increased cost of driving, as pointed out by the Eno Foundation’s Jeff Davis.  Over the past twelve months, gasoline prices have jumped 49.5 percent (from their depressed, pre-vaccine lows), and used car prices are also sharply higher, up 37.3 percent.

There are a couple of key takeaways here:  First, our car-dependent transportation system is something that makes most US households vulnerable to inflation.  Second, those who live car-free or car-light lifestyles are actually insulated from many of the negative effects of this particular bout of inflation.  And before you get too upset with the current inflationary peak, it’s worth noting that fuel prices have long been volatile (and will continue to be), and that the current surge in used vehicle prices is most likely a very short-term reflection of recent bottlenecks in new automobile production, something that’s likely to abate in the next few months.

New Knowledge

Chain reaction:  How building new market rate housing increases supply for low and moderate income households.

When new housing gets built, it’s most common that its occupants move from somewhere else, usually in the same city.  When they do, the homes they left become available to be rented out (or sold) to other households.  These knock-on vacancies aren’t obvious in most common housing data and are hard to track, but they are a key way that, over time, housing becomes more available to households throughout the income spectrum.

In Finland, however, which has a national housing register that tracks the occupants of every housing unit, it’s possible to see what happens when new housing gets built, and how this chain of moves plays out.  A new study from the VATT Institute for Economic Research does just that, using anonymized data to track household moves in the Helsinki Metropolitan Area for a period of 10 years.  Helsinki has a metropolitan population of about 1.2 million.

The authors identify new private market and social housing units built in Helsinki, and track the chains of vacancies created when households move into these new units.  While higher income households tend to move into newly built, market rate units, the housing that these households vacate tends to then be occupied by successively lower income households.  The key finding, quantitatively, is that the construction of 100 new market rate units in central Helsinki leads to 60 housing vacancies in the bottom half of the neighborhood income distribution, and of these, about 29 are in the bottom quintile of the neighborhood income distribution.  As the authors conclude:

. . . people moving into the new centrally located buildings . . .  have much higher incomes and are more likely to be highly-educated than both the Helsinki Metropolitan Area (HMA) population on average and the people who move to other locations in the HMA during our time window. New housing built in expensive areas of the city does indeed primarily house the better-off. However, the moving chains triggered by these new units reach middle- and low-income neighborhoods quickly, within a year or two. Our register data also allows us to show that low-income individuals are part of the moving chains. This is direct revealed-preference evidence that low-income individuals in the city area also benefit from new expensive housing, even when the new units are allocated to individuals higher up in the income distribution.

The findings of the Finnish study closely mirror research by Evan Mast for US cities.  Mast found that the construction of new market rate apartments in the US led to a chain of moves which resulted in increased housing opportunities for low and moderate income households, a process we likened to adding more chairs in a game of musical chairs.

In addition to tracking chains for market rate housing, the authors also track moving chains created by the construction of publicly subsidized social housing.  Construction of 100 new units of social housing leads to 75 housing vacancies in the bottom half of the neighborhood income distribution, of which 43 are in the bottom quintile.  This means that while social housing does provide more housing supply for lower income households, market rate production does nearly as well.  These charts compare moving chains for market rate (right) and social housing (left) by income group.  While more than half of market rate units are initially occupied by households in the top income quintile (>P80, pink), their share steadily declines in successive rounds, and below-middle (<P50, blue) and lower-income (<P20, gray) households take most units in later rounds.  The pattern of vacancies by income group in rounds 4-6 is very similar for social and market rate housing, suggesting that both kinds of increment to supply ultimately result in housing for low and moderate income households.
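Restating that arithmetic—the per-100-unit vacancy counts come straight from the study; the share calculation is our own, not the authors’:

```python
# Vacancies created per 100 new units, as reported by the study.
chains = {
    "market rate": {"bottom_half": 60, "bottom_quintile": 29},
    "social":      {"bottom_half": 75, "bottom_quintile": 43},
}

for kind, v in chains.items():
    # What share of the bottom-half vacancies reach the lowest quintile?
    share = v["bottom_quintile"] / v["bottom_half"]
    print(f"{kind}: {v['bottom_half']} bottom-half vacancies per 100 units, "
          f"{share:.0%} of those in the bottom quintile")
```

On these figures, market rate construction generates about four-fifths as many bottom-half vacancies as social housing does—supporting the paragraph’s point that it “does nearly as well.”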

This finding echoes a result obtained by Chapple and Zuk in their study of the Bay Area which found that the construction of two new units of market rate housing had the same effect in reducing displacement as constructing one unit of subsidized affordable housing.

Cristina Bratu, Oskari Harjunen & Tuukka Saarimaa, “City-wide effects of new housing supply: Evidence from moving chains,” VATT Institute for Economic Research, Working Paper 146, 2021.

The Week Observed, January 28, 2022

What City Observatory did this week

Why Portland shouldn’t be moving elementary and middle schools to widen freeways.  We’re pleased to publish a guest commentary from Adah Crandall, a high school sophomore and climate activist, who recently testified to the Portland School Board in opposition to moving two schools to accommodate the Rose Quarter I-5 freeway widening project.  Crandall and others have been protesting the project weekly at Oregon Department of Transportation headquarters in Portland, and she spoke out against the project’s impact on school kids:

As a former Tubman student, I know the pollution at Tubman is dangerous: no students should have to worry about if the air they’re breathing at recess will one day cause asthma or lung cancer. But the decision to move the school rather than fight the freeway expansion follows the same short-sighted line of thinking that started the climate crisis in the first place. Yes, you can move students away from the direct threat of pollution, but you cannot move them away from the life of climate disasters they’re inheriting as a result of your decision to support fueling this crisis without making ODOT even study the alternatives.

Crandall invoked the school district’s own anti-bullying policy, urging them not to be bullied by the Oregon Department of Transportation, and instead, to stick up not only for the interest of their students, but for the district’s professed interest in fighting climate change.

Must read

Housing supply and rents in Tokyo:  20 percent more apartments = 10 percent lower rents.  Tokyo is one of the world’s largest and densest cities, and surprisingly, has relatively affordable housing.  And recent data suggest that the city has managed to expand its housing supply and lower rents.  The following charts were prepared by Urban Futurist.info  and posted on Twitter, and are based on Japanese statistics.  Between 2008 and 2018, nominal rents per tatami mat (a Japanese metric for expressing home size, notionally similar to rooms) fell from about 6,400 yen to 5,800, a decline of almost 10 percent.  At the same time, the total number of privately rented apartments in the Tokyo Prefecture increased by almost 20 percent, from 2.2 million to 2.7 million.
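A back-of-the-envelope check of those figures, using the rounded numbers cited above:

```python
# Rounded figures from the Tokyo charts cited above.
rent_2008, rent_2018 = 6400, 5800        # nominal yen per tatami mat
units_2008, units_2018 = 2.2e6, 2.7e6    # privately rented apartments, Tokyo Prefecture

rent_change = (rent_2018 - rent_2008) / rent_2008
unit_change = (units_2018 - units_2008) / units_2008
print(f"Rents:  {rent_change:+.1%}")   # a decline of almost 10 percent
print(f"Supply: {unit_change:+.1%}")   # a bit over 20 percent on these rounded counts
```

On these rounded unit counts the supply increase works out slightly above 20 percent; the precise figure depends on the underlying (unrounded) data.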

Unlike US cities, Tokyo has land use laws which readily allow increased density in urban areas, making it easy to increase housing supply where there is demand.  The result seems clear:  allowing more housing to be built causes rents to decline.

Road Warriors:  Climate concerns collide with freeway widening plans in Portland.  Bloomberg’s Laura Bliss offers an in-depth exploration of the controversy brewing in Portland over plans to widen the I-5 freeway at the Rose Quarter.  (Our lead commentary is authored by Adah Crandall, who’s profiled in the CityLab article.)  It’s not just a local battle:  around the country, state highway departments are dusting off long-dormant highway expansion plans, thanks in no small measure to the availability of billions of additional federal dollars as part of the new infrastructure law.  Those projects are colliding head-on with the climate pledges that states and cities have made.  Transportation is now the largest source of greenhouse gas emissions in the US, and the phenomenon of induced demand means that additional roadway capacity not only fails to alleviate congestion, but adds to vehicle travel and increases greenhouse gas emissions.  Young climate advocates in Portland have clearly grasped the connection between wildfires and 117-degree days and ODOT’s plans to widen freeways, and they’re calling political leaders to account for the generational inequity of these projects.

What causes homelessness:  Drug abuse or housing shortages?  Michael Shellenberger’s popular new book “San Fransicko:  Why progressives ruin cities” tries to make the case that drug abuse and decadent, permissive policies, not housing shortages, are behind the rise of homelessness in many western cities.  It’s a provocative hypothesis, and there’s little question that drug addiction is correlated with homelessness, but Ned Resnikoff of the University of California San Francisco’s Benioff Homelessness and Housing Initiative deeply disputes Shellenberger’s thesis.  Resnikoff offers a thoughtful critique of the book, and digs deep into several key studies underlining the correlation between housing costs and homelessness, including research from the Harvard Joint Center for Housing Studies.

There’s little question that the causes of, and solutions for, homelessness are complicated.  And, as Resnikoff emphasizes, the relationship between drugs and homelessness is bi-directional:  those troubled by drug addiction are more vulnerable to homelessness; those who lose their housing are more susceptible to drugs.  Trying to claim that housing availability and housing costs aren’t the root problem doesn’t bring us closer to a solution.

The Week Observed, February 4, 2022

What City Observatory did this week

Climate and our Groundhog Day Doom Loop.  It’s Groundhog Day—again—and we’re stuck in exactly the same place when it comes to climate policy.  Scientists are regularly offering up increasingly dire warnings and ever more irrefutable evidence of climate change.  Extreme weather events (fires, floods, droughts, hurricanes) are becoming more common.

And our climate policy still mostly consists of telling ourselves that we’ll reduce our greenhouse gas emissions a couple of decades from now.  We’ve been tracking Oregon’s climate progress every Groundhog Day for the past seven years, and just as Bill Murray experienced, nothing has changed.  If anything, it’s getting worse:  greenhouse gas emissions have increased, and we’re planning to spend additional billions widening freeways, which will surely make the problem worse.  Maybe next year will be different.

Must read

New perspectives on road safety from US DOT.  Transportation Secretary Pete Buttigieg announced a major new emphasis on safety for the department: the National Roadway Safety Strategy.  There’s a lot to like here:  the plan acknowledges that the current toll of death and injuries is unacceptable, and that it’s going to take more than exhortations to better behavior to solve the problem.  The strategy promises to look at road design, and to consider the safety of non-occupants in vehicle safety standards.  USDOT is going to take another look at the Manual on Uniform Traffic Control Devices, which has traditionally favored car movement over pedestrian and bike safety.  Harvard’s David Zipper interviewed Secretary Buttigieg for Slate, and asked a lot of the tough questions we would have liked answered, notably why more attention hasn’t been devoted to addressing the growing size (and lethality) of sport utility vehicles.  This interview will give you some clear insights about what to expect from the US DOT strategy.

Portland climate activists challenge their state DOT.  Vice’s Aaron Gordon has a report from the front-lines of the Youth vs. ODOT climate battle in Oregon.  He profiles high school student Adah Crandall, who’s been leading weekly protests in front of the highway department’s Portland offices for most of the past year, and who’s been skipping classes to testify to state and local decision-makers about the climate crisis, pleading with them to stop spending public funds expanding fossil fuel infrastructure.

More problems with inclusionary zoning in Portland (Maine).  A number of cities have adopted inclusionary zoning requirements that mandate that those who build new apartment buildings set aside a portion of the new units to be rented at below-market rates to low and moderate income households.  While well-intentioned, the cost of complying with these requirements can often prompt a reduction in new construction, with the result that fewer units get built and housing shortages worsen.  That’s exactly what appears to have happened in Portland, Maine, which instituted a requirement that most new apartment projects set aside xx percent of their units as affordable housing.  MaineBiz reports:

In 2020, prior to the new inclusionary zoning provision, 756 residential units were put on the planning books. In the roughly one year since passing the provision in November 2020, only 139 units had been put on the books — a decrease of 81.6%, according to a study by the Boulos Co.
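The percentage in that quote checks out; as a quick verification of the arithmetic:

```python
# Units put on Portland, Maine's planning books before and after IZ,
# per the Boulos Co. study quoted above.
before, after = 756, 139
decline = (before - after) / before
print(f"Decline: {decline:.1%}")  # matches the reported 81.6%
```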

We’ve previously chronicled how inclusionary zoning appears to have reduced housing production in Portland, Oregon; this new report from Portland, Maine says that its requirements have had a similar effect, reducing new apartment starts by 25 percent from pre-IZ levels.

Moving more cars trumps safety, Texas edition.  For the past five years, San Antonio has been working to convert Broadway Street from a wide, auto-dominated arterial into a calmer, safer, more pedestrian-oriented street.  Before the Interstate system was built, Broadway Street was the route of a state highway and was controlled by the Texas Department of Transportation; with most traffic now moved to the interstate, the city has been working to redevelop the area, and had been cooperating on a plan for a road diet for Broadway, to support public investments that will make the neighborhood more livable and walkable.

This plan for San Antonio’s Broadway Street was nixed to favor car movement.

All that came to a crashing halt last week when TxDOT pulled the plug on its prior agreement to return Broadway to city jurisdiction.  Press reports suggest that Texas Governor Greg Abbott intervened, putting a higher priority on moving cars than on making a place more suitable for people.

New Knowledge

How higher gas prices reduced sprawl.  The advent of cars about a century ago enabled much more decentralized development patterns.  A new study concludes that the price of gasoline has a direct and significant effect on how much we sprawl, and how much farm and forest land is lost to urbanization.

The heyday for sprawl was the second half of the twentieth century, but since then, even though US population and income have continued to increase, the rate of decentralization has fallen sharply.  Led by Daniel Bigelow, an economist at Montana State University, the authors used an inventory of US land to track development over time.  Nationally, land conversion (from resource uses to urban development) peaked at about 2 million acres per year in the mid 1990s, but has fallen to about a quarter that amount in recent years.

The authors look at the economic correlates of land conversion and find that the rate of land conversion is correlated with long term changes in gasoline prices.  Declining gas prices produce more land conversion; increasing gas prices produce less land conversion.  The authors write:

Computed at the average county-level commuting time of 19 min, the average annual gasoline price decrease of $0.05 during the last two decades of the 20th century boosted annual land development by 6.06%, while the increase in gasoline prices in the second half of the study period ($0.03 annually) decreased land development by 2.84% per year.

Graphically, they summarize the connection between gas prices and urban land development across areas with differing commute times as follows:

The key finding of the study is that changing gas prices had their largest effects in locations with the longest commutes.  In the 1982-2000 period, when real gasoline prices were decreasing, the positive effect on land development was greatest in locations with longer commutes.  In the 2001-2015 period, when real gas prices were increasing, the declines in land development were greatest in the areas with longer commutes.  As the authors elaborate:

[In the first half of the study period] average change in gasoline price increased development by 6.11%–7.12% for counties with average commuting times longer than 15 min, while counties with shorter commuting times were unaffected by gasoline price changes (figure 4). Similarly, in the second half of the study period, when gasoline prices were increasing, the estimated decrease in land development was largest for counties characterized by longer commuting times. Counties with average commuting times over 12 min saw a decrease in development ranging from −2.63% to −3.31%, with shorter-commute counties not seeing any impact.
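One way to read the quoted figures is as an implied sensitivity per one-cent annual price change. This restatement is ours, not the authors’—it simply normalizes the numbers quoted above:

```python
# ($/year gasoline price change, % effect on annual land development),
# as quoted from the study.
periods = {
    "1982-2000 (prices falling)": (0.05, 6.06),
    "2001-2015 (prices rising)":  (0.03, 2.84),
}

for label, (dollars_per_year, pct_effect) in periods.items():
    # Normalize to the effect per one cent per year of price change.
    per_cent = pct_effect / (dollars_per_year * 100)
    print(f"{label}: ~{per_cent:.2f}% change in development per 1 cent/year")
```

The implied sensitivities in the two periods (roughly 1.2 and 0.95 percent per cent of price change) are of similar magnitude, consistent with a broadly symmetric response to rising and falling prices.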

The impact of these changes over time is significant.  The authors estimate that the slowdown in land conversion caused by higher gas prices kept more than 4 million acres from being converted to urban use since 2000.  These findings have significant implications for understanding the effects of vehicle electrification:  electric cars are likely to have far lower operating costs than fossil-fueled vehicles.  This study suggests that declining fuel costs will trigger another wave of sprawl:  keep that in mind when someone claims EVs are “non-polluting.”

Daniel P. Bigelow, David J. Lewis and Christopher Mihiar, “A major shift in U.S. land development avoids significant losses in forest and agricultural land,” Environmental Research Letters, 2022.

In the news

Willamette Week tells the story of how Louisville squandered a billion dollars on a massive freeway project.  Its plan to double the width of the I-65 bridges over the Ohio River exactly parallels plans for an I-5 Columbia River bridge in Portland.  The problem, as we’ve pointed out at City Observatory, is that tolling reduced traffic enough that no expansion was needed.

The Week Observed, February 11, 2022

What City Observatory did this week

The “replacement” bridge con.  It’s telling that perhaps the largest single consulting expense in the Oregon and Washington transportation departments’ effort to revive the failed multi-billion-dollar Columbia River Crossing project is $5 million for “communications” consultants.  The project has emphasized a misleading rebranding as a mere “bridge replacement” project, when in fact less than 30 percent of the cost of the nearly $5 billion project is actually for replacing the existing highway bridge, according to independent accountants.  And since the project proposes to effectively double the existing capacity of the I-5 bridge over the Columbia River by building two side-by-side bridges, half of the expense of the bridge portion of the project is really an expansion, not a replacement.

Most of the cost of the so-called “Interstate Bridge Replacement” project is for widening the freeway and rebuilding interchanges for miles north and south of the bridge crossing.  The actual “replacement” cost of the current bridge is somewhere between $500 million and $1 billion.  Calling a $5 billion, five-mile-long freeway a “replacement bridge” is like calling the purchase of a new $55,000 truck a “tire replacement.”

Must read

How traffic engineers ruin your town.  Recovering engineer Chuck Marohn of Strong Towns explains how the apparently innocuous and technocratic approach of traffic engineers systematically undermines the livability of urban spaces.  He relates how, in his hometown, traffic engineers have dictated that one local main street must carry 30,000 cars a day without ever forcing them to slow down.  In order of priority, engineers rank speed, volume, safety and cost.  Local quality of life doesn’t figure in, with the net result being, as Marohn puts it:

The city must be degraded, the quality of life of its residents diminished, and local business opportunities stifled, in order to improve the convenience of commuters who have chosen to live outside of the city. The design of this street must prioritize their convenience over all else.

The priority afforded to car movement undercuts the livability of the city, drives away more residents and businesses to distant (and less trafficked) suburbs, and increases car-dependency.  What we need to do is set clear priorities that favor the interests of those who live in a city over those who are simply passing through:

What if we all agreed that the quality of the city’s neighborhoods, the success of its local businesses, and the safety of its residents, were the paramount concern?  . . . the new design of Washington Street would narrow lanes, widen sidewalks, bring in vegetation and human-scaled lighting, prioritize people walking, give deference to cross-traffic instead of throughput . .  while it might make commutes a little longer for those who have chosen to live outside the city, that delay would be offset in time by an increasing number of people who choose to live in the city’s neighborhoods, the improved business community that could expand offerings in response to a responsive local market, a shift in the number of trips that could now be made by biking and walking, and the local wealth saved from an overall reduction in vehicle miles traveled.

Why car dependent transportation is inherently discriminatory.  In an article originally published at Inequality.org and republished by Streetsblog, Basav Sen argues that the US transportation system fuels inequality and that prioritizing private vehicle use leaves the poor and people of color behind.  The problem is fundamentally two-fold:  first, low income households and people of color are dramatically less likely to own or have access to a car; 40 percent of those in the lowest income quintile don’t own cars.  Second, even for those who do own cars, the cost of their operation is effectively a regressive tax, requiring them to spend a much larger fraction of their income on transportation than higher income households.

A transportation system where people have to rely on their own vehicles doesn’t merely exclude those who don’t own vehicles – it imposes a severe financial burden on poorer households that do own vehicles.

The USPS vehicle procurement process and subverting NEPA.  By now, everyone’s read about the Postal Service’s plan to buy 160,000 inefficient, heavy, expensive, fossil-fueled mail trucks.  Vice’s Aaron Gordon has done some phenomenal reporting, showing for example that USPS specified the vehicles had to be at least one pound over the weight limit for “light vehicles,” a classification that would have required them to be (somewhat) cleaner.

The USPS approach to vehicle procurement also reveals a profound flaw in the environmental review process:  as a federal agency, the Postal Service has to do an environmental impact statement on its procurement, including estimating how much pollution its vehicles will create and considering alternatives (you know, like electric vehicles).  But the EIS is written by the agency, which in this case had already made its decision, and which crafted a decidedly narrow EIS to back it up.  As Gordon explains, USPS ruled out considering EVs because some delivery routes (fewer than 10 percent) are longer than 70 miles, which it considers to be the range of electric vehicles, and it failed to consider any alternatives that would use a mix of vehicles tailored to the actual lengths of delivery routes.  Other agencies with more technical knowledge of emissions, like the Environmental Protection Agency, had virtually no input on the decision or the EIS.  As long as the polluter gets to write its own EIS, it will get just the results it’s paying for.  If the EIS isn’t conducted by a truly independent agency (as opposed to a profit-seeking contractor), it at least ought to be undertaken using scientific guidelines set by someone with subject matter expertise, and not a vested interest in the outcome.

New Knowledge

Who gains from rent control?  A new study from two University of Potsdam economists looks at the impacts of rent control on high income and low income households. Recent experimentation with rent control in Germany (where municipal limits were enacted, and then struck down by courts), coupled with detailed micro-data on the rental marketplace, provides a basis for examining the impact of rent control on different households.

The paper offers two key findings:  first, the benefits of rent control flow disproportionately to higher income households; and second, rent control tends to increase the level of income segregation in the city.

While rent control policies are usually promoted as a way to help low income households, rent controls typically apply to all of the rental housing in a subject jurisdiction, regardless of the rent level of the apartment or the income level of the household. Many of the benefits of rent control are captured by higher income households.  In effect, rent control can dissuade higher income households from moving out of their rent-controlled apartments to new market rate housing.  This, in turn, means fewer apartments are vacated and become available to other households.

As we’ve noted, the moving chains generated by new housing development are a key way in which existing housing filters to moderate and lower income households.  If high income households remain in their rent controlled apartments, this source of new supply doesn’t materialize.

The second key finding looks at whether income segregation increases or decreases under rent control.  One argument frequently made for rent control is that it enables low or moderate income households to remain in neighborhoods that would otherwise gentrify, increasing income integration.  But this paper suggests that the opposite happens, as high income households remain or increase in central neighborhoods.

We find that rent control leads to an increase in segregation: compared to the baseline, the city-wide dissimilarity index rises from 0.2323 to 0.2606. When looking at the ZIP code specific change in concentration, we find that the increase in concentration is particularly driven by an increase in the concentration of high income households in the regulated central city ZIP codes . . .  This finding stands in contrast to popular arguments which view rent control as a measure to preserve a relatively even income mix in central cities.

Rainald Borck and Niklas Gohl, “Gentrification and Affordable Housing Policies,” CESifo Working Paper No. 9454, November 29, 2021. https://www.cesifo.org/DocDL/cesifo1_wp9454.pdf
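The dissimilarity index cited in the excerpt has a standard form: half the sum, across neighborhoods, of the absolute difference between each neighborhood's share of the two groups. A minimal sketch of that calculation (the counts below are made-up illustrations, not the paper's data):

```python
def dissimilarity(group_a, group_b):
    """Index of dissimilarity: 0 = perfectly even mix, 1 = complete segregation.

    group_a, group_b: counts of two income groups, one entry per ZIP code.
    D = 0.5 * sum over ZIPs of |share of group A - share of group B|.
    """
    total_a, total_b = sum(group_a), sum(group_b)
    return 0.5 * sum(abs(a / total_a - b / total_b)
                     for a, b in zip(group_a, group_b))

# Hypothetical counts of high- and low-income households in four ZIP codes
high = [400, 300, 200, 100]
low = [100, 200, 300, 400]
print(round(dissimilarity(high, low), 3))  # → 0.4
```

Intuitively, the index is the share of one group that would have to move to a different ZIP code to match the other group's distribution, which is why the paper's rise from 0.23 to 0.26 signals more sorting by income.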

In the news

City Observatory’s Joe Cortright is quoted in Bike Portland‘s article, “Parsing the ‘electric cars won’t save us’ debate.”

The Week Observed, February 18, 2022

What City Observatory did this week

Oregon’s highway agency rigs its projections to maximize revenue and downplay its culpability for the climate challenge.  ODOT has two different standards for forecasting:  When it forecasts revenue, it says it will ignore adopted policies—especially ones that would reduce its revenue.  When it forecasts greenhouse gas emissions, it assumes policies that don’t exist—especially ones that will magically make greenhouse gas emissions decline. Revenue forecasts are “purely based on historical data” and don’t include adopted policies.  Greenhouse gas emission forecasts are based on “goals” and “wishes” and are explicitly not an extrapolation of past trends. The inflated revenue forecasts are used to justify (and help fund) highway widening; the greenhouse gas emission forecasts are used to absolve the agency from any responsibility to reduce driving related greenhouse gas emissions.

Must read

How freeway widening undercuts our climate policies.  Will the trillion dollar bipartisan infrastructure law be the foundation of stronger efforts to fight climate change, or will it be used to subsidize highway expansions that will increase greenhouse gases?  That’s the challenge, explains Brad Plumer, writing in The New York Times.  Transportation has emerged as the single largest source of US greenhouse gas emissions. Parts of the infrastructure bill, such as those promoting electric vehicles and charging stations, have the potential to reduce greenhouse gases.  But highway expansions encourage more driving and increase pollution:

The core problem, environmentalists say, is a phenomenon known as “induced traffic demand.” When states build new roads or add lanes to congested highways, instead of reducing traffic, more cars show up to fill the available space. Induced demand explains why, when Texas widened the Katy Freeway in Houston to more than 20 lanes in 2011, at a cost of $2.8 billion, congestion returned to previous levels within a few years.

Plumer notes that some states, such as Colorado, have adopted policies to consider the greenhouse gas effects of highway expansions.  But for most states, highway departments are still in denial that induced demand exists, or simply don’t care.

FHWA Policy Guidance for Infrastructure Funds.  Bloomberg’s Lillianna Byington reports on the Biden Administration’s efforts to lay out policy guidance for the Infrastructure Investment and Jobs Act (IIJA) that would prioritize climate and “fix-it-first” efforts.   The Federal Highway Administration is pushing states to fix existing roads with the bill’s expanded funding. She reports:

In “most cases” highway money through the infrastructure law should be used to repair and maintain existing infrastructure before spending on “expansions for additional general purpose capacity,” Federal Highway Administration Deputy Administrator Stephanie Pollack said in the guidance memo dated Dec. 16.

The efforts to emphasize climate and repair have gotten pushback from some highway agencies and Republican governors who want unfettered discretion to spend money on expansion projects.  While it’s likely that states will still have considerable sway over the spending of formula allocation funds, many of the new resources available under the IIJA are competitive grants that will be decided by FHWA, likely using these policies.

Promoting transit:  A great way to improve road safety.  Writing at Slate, David Zipper highlights a routinely overlooked strategy for making our transportation system safer:  promoting more travel by transit.  Most road safety efforts don’t ever question our mode of travel.  Zipper points out that travel by bus and train is dramatically less risky for passengers than travel by car.  In addition, buses and trains are almost certainly safer for vulnerable road users, like those persons traveling by bike or on foot, because professional drivers are, on average, considerably safer than car drivers.  A key part of our safety strategy ought to be making transit more convenient, more frequent and cheaper as a way of promoting safety.

New Knowledge

Fixer Upper:  A primer on improving US housing policy.  Brookings Institution economist Jenny Schuetz is one of the most incisive housing policy scholars in the nation, and she has a new must-read book:  Fixer Upper:  How to repair America’s broken housing systems.

Housing is often a complex and wonky subject, especially when addressed by economists, but Schuetz makes her case in a clear and non-technical way.  In a nutshell:  We have too little housing in the places that are in highest demand, largely because we’ve devolved much of the authority for approving housing to local or neighborhood groups whose interests conflict with broader social and environmental objectives.  We also have a national system of subsidies that encourages low density development at the urban fringe (adding to environmental damage and burdening us with greater transportation challenges).  The system of subsidies is overwhelmingly skewed to helping wealthy people buy ever larger homes, while we provide rental housing assistance to less than a fourth of the low income households who would be eligible. Around this kernel are a series of inter-related political, economic and social issues, all of which influence and are influenced by housing policy.

Much could be done to make US housing policy better.  Schuetz neatly condenses her prescription into a lucid series of chapter headings that underscore what needs to be done to improve US housing policy.  For example:

  • Build More Homes Where People Want to Live
  • Stop Building Homes in the Wrong Places
  • Give Poor People Money
  • Homeownership Should Be Only One Component of Household Wealth
  • High-Quality Community Infrastructure Is Expensive, But It Benefits Everyone

Each chapter explains the rationale behind this advice, and lays out the academic evidence in favor of the policy.  The book concludes with two chapters laying out the political challenges of improving housing policy, including overcoming the myopia imposed by the highly local nature of land use decision making, and figuring out how to construct new political coalitions that can enact innovative policies to address housing.

There’s much here, both for readers who are well-versed in housing policy and politics and those who aren’t.  If you’re just starting out to try to make sense of the complexity of housing policy, “Fixer Upper” serves as a comprehensive and practical orientation.  And if you consider yourself a housing expert, you’ll find a thoughtful framework for integrating all of the varied aspects of housing policy.

Jenny Schuetz, Fixer Upper:  How to repair America’s broken housing systems, Brookings Institution Press, February 22, 2022

In the news

In its article on infrastructure and climate change, The New York Times pointed its readers to City Observatory’s analysis of the failure of the widening of Houston’s Katy Freeway to alleviate traffic congestion.
Writing at Planetizen, in an essay entitled “Urban Villages for the Proletariat,” Todd Litman describes how dense urban growth promotes equity, and cites City Observatory’s commentary on the role of cities in promoting intergenerational economic mobility.

The Week Observed, December 17, 2021

What City Observatory did this week

The financial fallout from Louisville’s I-65 boondoggle.  As we showed earlier, Kentucky and Indiana wasted a billion dollars doubling the capacity of I-65 across the Ohio River, and in the process inadvertently showed how to eliminate traffic congestion.  The $1 to $2 tolls charged to I-65 users cut traffic in half, leaving essentially all of the expensive new capacity unused.  But tolls don’t cover anywhere near the cost of building the widened crossing.  The two states borrowed heavily against future federal grants and hoped-for toll revenues, as well as pledging their own resources.

They used creative finance to “backload” the repayment of the debt–keeping payments artificially low in earlier years–hoping that higher tolls and more traffic would bail them out. Already, it’s clear that strategy is failing:  traffic is so far below projections that Kentucky has had to refinance its debt, extending the period of repayment by almost another decade, and adding tens of millions of dollars in additional interest costs.  The latest projections show the I-65 crossing will never carry as much traffic as it did before it was widened, and that future generations will be paying off the project’s debt, even as they increasingly suffer the consequences of climate change.

Must read

1. House Bill would force states to address transportation emissions.  The fate of the “Build Back Better” legislation is very much in the balance.  The Washington Post examines one of the bill’s key provisions, which would create incentives for states to spend federal transportation dollars in ways that reduce greenhouse gas emissions.  State highway departments bristle at the thought, arguing that climate change isn’t their responsibility–notwithstanding the fact that transportation is the largest source of greenhouse gas emissions in the nation.  Their efforts to dodge responsibility prompted the Center for American Progress’s Kevin DeGood to comment “It is farcical to suggest that state DOTs do not deeply influence greenhouse gas emissions from the transportation sector.”

The Post’s story highlights the growing grassroots recognition that transportation is a huge, and largely unaddressed, aspect of the climate crisis.  In Portland, young climate activists have been protesting at the headquarters of the Oregon Department of Transportation, calling for an end to freeway widening to save the planet.  The Post quotes 15-year-old Adah Crandall:

“As a high-schooler, my peers and I shouldn’t have to be spending our time organizing against freeway expansions because we’re worried about the climate impact. That’s pretty ridiculous,” she said. “I feel a responsibility to do that because I’m afraid of the decisions that will be made if we don’t keep showing up.”

2. Are Seattle’s affordable housing fees resulting in fewer townhouses?  For the past several years, Seattle has been implementing its Mandatory Housing Affordability (MHA) program, which requires most multi-family developers to either dedicate a portion of their units to low or moderate income households, or pay a fee.  A new report from a group of smaller scale developers in Seattle claims that the fee has eliminated most new townhouse construction in the city.

Townhouse under construction (The Seattle Times)

Townhouses, which are often duplexes, are the kind of gentle density that creates somewhat more affordable housing in expensive cities like Seattle.  The fact that developers have to pay the average $32,000 per unit fee up front, prior to construction, is especially onerous.  In theory, the MHA program gives developers a “density bonus” allowing them to build more square feet of housing than under prior zoning rules, but for townhouses, the economic value from adding a fourth walk-up floor on a townhouse doesn’t necessarily offset the cost of the fee. The group points out that townhouse permit applications have fallen 67 percent in the past two years.  Fewer townhouse units being constructed means even more competition for the existing housing stock, which is likely to make the city’s affordability problems even worse.

3. Austria stops building freeways.  Greens formed a coalition government with conservative parties in Austria, and bargained for key cabinet positions, including the transportation portfolio.  The new minister for transportation announced that henceforth, the country will stop expanding road capacity as a means of meeting its greenhouse gas reduction obligations. “More roads mean more cars and more traffic,” Transport Minister Leonore Gewessler told the AFP news agency.

New Knowledge

Price elasticity of demand for travel.  How much do changes in prices, especially the price of fuel, matter to how much, and by what modes, people travel?  That question is one of the microeconomic keys to understanding urban travel demand, and not incidentally, to addressing climate change.  A new paper from University of Florida economist Javier Donna sheds new light on the answer, using a novel and detailed methodology aimed at overcoming the problems with earlier estimates.  His key finding:  in the long run (periods of more than a year) travel behavior is much more responsive to gasoline prices than previously estimated.  Donna reports that with respect to gas prices, travel is elastic:  a one percent change in gas prices generates more than a one percent change in automobile travel.  This suggests that pricing can be a much more powerful and effective tool than previously thought.

This is not the first study to estimate the price elasticity of demand for travel–there’s actually a substantial literature.  But many of these estimates are confounded, ironically, by the high volume of data about gas prices.  Gas prices fluctuate on a weekly and sometimes daily basis (usually driven by fluctuations in the global oil market).  Travel behavior responds to prices, but individual consumers/travelers face what economists call “switching costs”–for example, if you’re a regular car driver, riding the bus requires learning the bus schedules and routes.  In the face of minor and frequent ups and downs in gas prices, consumers exhibit considerable inertia, and it’s usually the case that only fairly large and seemingly permanent (or at least long-lasting) price changes prompt many people to change behavior.

What that means as a practical matter is that simple-minded statistical measures that simply look at the gross correlation between weekly fuel prices and vehicle travel volumes find only a modest (in economic terms “inelastic”) effect on travel patterns.  The short term noise tends to obscure finding any long term effects, and, as with most economic goods, elasticities are greater in the long-run than the short-run.

Donna’s paper constructs and calibrates a model that explicitly incorporates switching costs.  It uses data gathered in the Chicago area from 2001 through 2009, to look at how shifts in gas prices influence demand for travel (measured by the volume of cars on Chicago area roads and transit ridership).  For comparative purposes, Donna also constructs two other models that mimic the structure of earlier modeling efforts (myopic and static) to show how his more complex and realistic dynamic modeling produces different results. The key estimates are shown below.

Donna estimates that the elasticity of travel with respect to gas prices is about 0.5 in the short run and about 1.3 in the long run.  This means a 10 percent increase in gas prices is associated with about a 5 percent decline in travel in the short run, and with about a 13 percent decline in the long run. These measures are for permanent or sustained increases in prices, not daily or weekly fluctuations.  The “long-run dynamic” estimate–the one that matters most for policy–is considerably higher than the estimate from either the myopic or static models (which fall into the range of estimates produced by other studies).
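The arithmetic behind those figures follows from the constant-elasticity demand relationship: travel scales with the price ratio raised to minus the elasticity (the "10 percent in, 5 or 13 percent out" figures are the linear approximation of this). A quick sketch, using the paper's point estimates:

```python
def travel_change(price_change_pct, elasticity):
    """Percent change in vehicle travel for a sustained percent change in
    gas prices, under a constant-elasticity demand curve:
    V1 / V0 = (P1 / P0) ** (-elasticity).
    """
    ratio = (1 + price_change_pct / 100) ** (-elasticity)
    return (ratio - 1) * 100

# Donna's estimates: elasticity ~0.5 in the short run, ~1.3 in the long run
print(round(travel_change(10, 0.5), 1))  # → -4.7 (short run, ~5% less driving)
print(round(travel_change(10, 1.3), 1))  # → -11.7 (long run, ~12-13% less)
```

The exact constant-elasticity numbers come out slightly below the linearized 5 and 13 percent figures; the gap grows for larger price changes, which is why elasticities are best read as local approximations.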

Donna’s paper suggests that prices can play an even more important role in supporting good transportation policy than previously thought.  His finding that travel demand is actually elastic with respect to price (a one percent increase in gas prices decreases driving by more than one percent) means that pricing will change travel behavior, a fact, unfortunately, that many transportation planners are simply in denial about.  Perhaps no longer.

Javier D. Donna, “Measuring Long-Run Gasoline Price Elasticities in Urban Travel Demand,” RAND Journal of Economics, forthcoming, https://doi.org/10.1111/1756-2171.12397. Also available at SSRN: https://ssrn.com/abstract=3285200.

In the news

Willamette Week featured our analysis of Oregon Department of Transportation financial plans showing the agency has no expectation of complying with the state’s greenhouse gas reduction goals.
StreetsblogUSA re-published our commentary on Oregon DOT’s duplicity around climate.

The Week Observed, December 10, 2021

What City Observatory did this week

1. ODOT’s real climate strategy:  Pollution as usual.  Oregon’s highway builders are keeping two sets of books:  one claiming that the agency cares about climate issues, the other showing that its financial plans depend on never reducing greenhouse gas emissions.  The Oregon Department of Transportation has a glossy, highly promoted and hopelessly vague “climate strategy,” replete with statements of concern, palliative suggestions and a green fig-leaf logo.  But the agency’s true plans are reflected in its budget and financial documents.  The agency’s revenue forecast shows that it is planning for—and counting on—burning just as much gasoline, and creating just as much greenhouse gas, at the end of this decade as it does now.

These financial projections—which ODOT presents to bond markets as evidence of its best estimate of future conditions—show that Oregon transportation emissions won’t decrease at all through 2029, and will fall vastly short of the greenhouse gas emission reductions pledged by Governor Brown and mandated by state law.  These financial projections show that this agency is cynically keeping two sets of books when it comes to climate:  one, for public consumption, that pretends to care, and a second set that reflects its real intent to continue polluting as usual.

2. Drive-thru restaurants are making cities less livable and killing the planet.  Drive-thrus for restaurants, coffee shops, banks and other retail establishments are some of the most city-soul-crushing infrastructure imaginable.  They take up space, create a desolate and pedestrian hostile environment, and cement car-dominance.  And more than that, idling while waiting to pull up to the window and receive your order produces pollution.  We estimate that your twelve-ounce latte, dispensed at a drive-thru window, generates as much as a pound of carbon.
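The pound-of-carbon figure is easy to sanity-check with standard idling numbers. The constants below are rough, commonly cited values (idle fuel burn varies by vehicle; the CO2-per-gallon figure is the widely used EPA estimate), not City Observatory's own calculation:

```python
def idling_co2_lbs(minutes_idling, gallons_per_hour_idle=0.3,
                   lbs_co2_per_gallon=19.6):
    """Rough CO2 emitted by an idling passenger car.

    gallons_per_hour_idle: typical idle fuel burn, roughly 0.2-0.5 gal/hr
    depending on the vehicle (0.3 is an assumed midpoint).
    lbs_co2_per_gallon: combustion CO2 from a gallon of gasoline
    (~8,887 grams, or about 19.6 lbs, per EPA).
    """
    gallons = minutes_idling / 60 * gallons_per_hour_idle
    return gallons * lbs_co2_per_gallon

# A five-minute creep through the drive-thru line:
print(round(idling_co2_lbs(5), 2))  # → 0.49 lbs of CO2
```

At ten minutes of queueing, the estimate roughly doubles to about a pound, which is consistent with the order of magnitude of the latte figure above.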

Must read

1. Our self-imposed scarcity of nice places. Strong Towns author Daniel Herriges nails it in this essay:  our urban problems in the US stem from a profound shortage of nice places, a problem that is almost entirely self-inflicted.  Through zoning, lot setbacks, and remaking urban spaces to be safe for cars, we’ve made it illegal to build the kinds of places Americans most prize (dense, interesting, walkable, mixed use neighborhoods).  The fact that such neighborhoods are so rare makes them expensive:  They’re prized and in short supply, which drives up their prices.  Herriges:

In fact, our shortage of nice places is almost totally self-imposed. And it’s precisely because 98% of the North American built environment is so blah that the 2% of places that are really well-designed environments quickly get bid up by the rich and become inaccessible to the rest of us. The solution to this isn’t to stop creating such places, but to create vastly more of them.

And that correlation between great urban places and high prices has gotten so fixed in the mind of many that there’s a tendency to equate neighborhood improvement with gentrification.  The irony, of course, is that many of today’s great urban neighborhoods were originally constructed as middle class working man’s housing; it’s only because it’s not possible to build more such housing today that such places are unusual and therefore expensive.  The solution to our affordability problems is to legalize dense, mixed use urban neighborhoods so that everyone has access to them.

2. More cycling would reduce traffic congestion, lower greenhouse gas emissions, improve health and create jobs.  A study of London looked at the likely impacts of raising that city’s cycling mode share from its current 2 percent to 14 percent.  It found that this would generate a number of benefits, including more local jobs in cycling sales and service.

 It would also reduce car driver trips by around 4 billion vehicle km by 2030 and associated greenhouse gas emissions by nearly 680,000 tonnes. This increase in cycling will produce wider economic benefits of the order of £4.8 billion by 2030. As well as bikes replacing car passenger trips, cargo and e-cargo bikes have the potential to replace up to 7.5% of van mileage by delivery and service companies, and associated carbon, pollution and congestion. A cycle mode share of 14% by 2030 would also create over 25,000 additional green jobs. . .  and these are jobs that can be created quickly.

Rather than building more freeway capacity, if cities were to invest in more cycle infrastructure, more shared cycle systems, and incentives for e-bikes, they’d make vastly greater progress in reducing greenhouse gases, and making cities healthier, more prosperous and more livable.

New Knowledge

Parking and transportation behavior:  A natural experiment.  We’ve long known that the shape of the built environment (density, transit access, parking availability) influences travel decisions.  But simply observing that people in, say, denser neighborhoods walk, bike or take transit more often than others isn’t necessarily compelling evidence that density is a cause; it could just be that people who like to walk, bike and ride the bus choose to live in denser neighborhoods.

The research gold standard for ferreting out cause-and-effect connections between such phenomena is the randomized trial.  In general, it’s hard to do randomization, but a housing lottery for affordable apartments in San Francisco provided just such a random pool for researchers to study.  Thousands of people entered the lottery for these newly built units, and were assigned randomly to apartments with varying housing and neighborhood characteristics (density, transit access, available parking spaces, etc.).  Researchers surveyed these lottery winners about their travel habits to estimate how different aspects of the built environment of each apartment correlated with travel patterns.

Their core finding amplifies the work of Donald Shoup:  The price and availability of on-site parking had the largest observable effect on car ownership.   The car ownership rate for households living in buildings with no on-site parking is less than half what it is for apartment buildings with one or more parking spaces per unit–the standard required by zoning in many communities.

They conclude that parking is a powerful, and under-appreciated policy lever for improving transportation and livability in urban settings:

Our findings suggest that the potential for fewer private automobile trips is large and does not depend on car-free households relocating to car-free buildings, or on people who like to ride transit moving to transit-rich neighborhoods. We show that household decisions on car ownership and travel depend on transit accessibility and walkability, but even more so on parking supply. . . . Where streets are relatively walkable and transit service is frequent, parking emerges as the key factor shaping household travel behavior — and parking is a factor that is highly amenable to low-cost policy reforms that can rapidly provide benefits.

The authors also tackled another claim:  that those without easy access to parking (and therefore the ability to commute by car) would be disadvantaged in pursuing employment opportunities.  The researchers found no correlation between parking availability and employment outcomes for survey participants.

Adam Millard-Ball, Jeremy West, Nazanin Rezaei, and Garima Desai, “What Do Residential Lotteries Show Us About Transportation Choices? Actually, Quite a Lot,” Transfers Magazine, Issue 8, December 2021

In the news

At the Oregonian, Ted Sickinger writes that “Oregon will fail its climate goals if ODOT acts on big freeway projects.”  The article quotes City Observatory research documenting Oregon DOT financial projections showing that the agency has no intent or expectation that its efforts will produce any reduction in transportation greenhouse gases through 2029, in violation of Governor Kate Brown’s environmental executive order.  Sickinger also notes that Oregon DOT has simply ignored Governor Brown’s direction to evaluate road pricing as a way to avoid expensive, unnecessary freeway projects and to reduce greenhouse gas emissions.

Lloyd Alter, writing at Treehugger.com, quotes Joe Cortright’s analysis of the positive health, safety and environmental impacts of higher gas prices in his story “Gas is too cheap.”

The Week Observed, December 3, 2021

What City Observatory did this week

How Portland powered Oregon’s economic success. After decades of lagging the nation, Oregon’s income now exceeds the national average. While some seem to think it’s a mystery, it’s not:  It’s all about a flourishing Portland economy, especially in the central city of the region.  This success has been powered by an influx of talent, especially well-educated young adults drawn to close-in urban neighborhoods.

Income growth in Multnomah County accounts for essentially all of the net improvement in Oregon incomes relative to the nation over the past decade or more.  Rising incomes, especially in the city, have markedly reduced racial and ethnic income gaps. The secrets of economic success:  talent, quality of life, urban amenities, and knowledge industry clusters.


Must read

The inequity and economic irrationality of urban freeway construction.  In a precise and trenchant analysis for the Toronto Globe and Mail, Todd Litman eviscerates the economics and rationale for expensive urban freeways.  The province of Ontario is proposing to spend billions to widen major highways in the Toronto region, in the futile attempt to reduce congestion, which as Litman shows, can never be accomplished by adding more capacity.

Litman’s analysis cuts through to two key economic facts:  Wider urban roads cost vastly more than any of their users would willingly pay for their use.  As he points out:

Also, new highways are far more expensive than most people realize, typically costing tens of millions of dollars for each kilometre of lane. Considering land, construction and additional operating expenses, the cost-recovery price for additional highway capacity – the toll required to repay its incremental costs – is typically 50 cents to $2.00 per vehicle-kilometre, far more than what motorists pay in fuel taxes.
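Litman's 50-cent-to-$2 range is easy to reproduce with back-of-the-envelope amortization: spread a lane-kilometre's capital cost over its lifetime using a standard capital recovery factor, then divide by the traffic it carries. All inputs below are illustrative assumptions, not figures from his column:

```python
def cost_recovery_toll(capital_cost_per_lane_km, annual_rate, years,
                       vehicles_per_lane_per_day):
    """Toll per vehicle-km needed to repay a lane's capital cost.

    Uses the standard annuity (capital recovery) factor:
    A = P * r / (1 - (1 + r) ** -n)
    """
    crf = annual_rate / (1 - (1 + annual_rate) ** -years)
    annual_cost = capital_cost_per_lane_km * crf
    return annual_cost / (vehicles_per_lane_per_day * 365)

# Illustrative inputs: $20M per lane-km, financed at 4% over 30 years,
# carrying 10,000 vehicles per lane per day
print(round(cost_recovery_toll(20_000_000, 0.04, 30, 10_000), 2))  # → 0.32
```

Even these generous assumptions (construction cost only, heavy use every day) land near the low end of Litman's range; adding land, operations, and maintenance, or assuming lighter traffic, pushes the break-even toll toward $2 per vehicle-kilometre.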

As we’ve pointed out at City Observatory, any time highway users are asked to pay even a small fraction of the costs of these roads, demand evaporates.  In a very real sense, the only reason people use new freeway capacity is because somebody else pays for it.  In the aggregate, freeway widening is a value-destroying proposition:  the costs of additional capacity exceed, by perhaps an order of magnitude, the value the users place on capacity.  This is the real “equity” problem in transportation:  taxing everyone to provide a largely illusory benefit to a few.  Again, Litman:

As a result, projects such as Highway 413 represent a huge public subsidy to a relatively small number of future users. Although the Ontario government provides no economic analysis, these projects are likely to cost billions of dollars to build and benefit only a tiny portion of Ontario residents. Anybody who will not drive regularly on these new facilities should protest this inequity.

Freeways are still destroying urban neighborhoods.  The role that freeway construction played in destroying city neighborhoods, and especially the homes of people of color, is oft-told, but it’s often treated as something that occurred in the distant past.  Writing in the LA Times, Liam Dillon shows that the highway onslaught is ongoing, and that new and expanded freeways have wiped out tens of thousands of homes.

. . .  that widenings, extensions and other freeway construction continue to take a significant toll on communities even now.  More than 200,000 people have lost their homes nationwide to federal road projects over the last three decades, according to a Times analysis of federal transportation data. The actual total is higher because many states fail to report how many homes are taken annually.

It’s also worth remembering that the direct destruction of homes and businesses due to freeway construction is just the first effect.  The added volume of traffic makes these neighborhoods more polluted and less livable.  Academic research has shown that population loss continues for decades after freeway construction in urban neighborhoods.

New Knowledge

Subsidies for homeownership encourage sprawl and damage cities.  For decades, the US has offered generous subsidies for homeownership, and granted preferential treatment to new housing. These benefits have not flowed similarly to renters or to existing housing.  As a practical matter, the subsidies disproportionately benefit new suburban greenfield development (as has highway construction).  The result is that public policy is tilted heavily toward sprawl.  A key component of this is the favorable tax treatment provided to homeownership.

The study takes advantage of a kind of natural experiment afforded by the repeal of a German homeownership subsidy in 2005.  Up until that date, homeowners could qualify for a special tax credit for the purchase of a home which provided, over eight years, a subsidy equal to about one-sixth the price of a home.  Because—as in the US—owner-occupied housing, especially newer units, is located in suburban rather than city locations, the availability of the credit pushed households to choose more suburban locations.  The authors show that when the credit was repealed, there was an immediate and noticeable re-centralization of population.

Germany’s cities have re-centralized conspicuously ever since the subsidy was repealed. Controlling for distance from the city center and for various fixed effects, population in every central ring (i.e. a ring among the third of rings closest to the center) grew by 5% between 2005 and 2017; while population in every peripheral ring (i.e. a ring among the two thirds of rings closer to the urban fringe) contracted by 2% over the same period.

The authors estimate that repeal of the homeownership subsidy resulted in about 200,000 fewer housing units in suburbs and about the same number of new housing units in cities, over a period of years.  Subsidies for homeownership, given the nature of the housing market, are anti-city.  This evidence puts the lie to claims about suburbs (and homeownership) representing the revealed preference of consumers.  Instead, it shows that public policies, especially the tax code, push people to choose suburban locations, and undermine cities.

Daminger, Alexander; Dascher, Kristof (2021) : Homeowner Subsidy Repeal and Housing Recentralization, Beiträge zur Jahrestagung des Vereins für Socialpolitik 2021: Climate Economics, ZBW – Leibniz Information Centre for Economics, Kiel, Hamburg

In the news

City Observatory's Joe Cortright is quoted in the Oregonian's story, "Oregon will fail its climate goals if ODOT acts on big freeway projects, environmentalists say."

 

The Week Observed, November 19, 2021

What City Observatory did this week

Why we shouldn't be whining about higher gas prices. Gas prices are going up, and it's annoying to have to pay more, but let's take a closer look at how much we're actually paying.  Even with the recent uptick, gas prices are still lower than they were a decade ago: in real, inflation-adjusted terms, gas is still more than $1 per gallon cheaper than it was from 2011 through 2014.  Cheap gas is burning the planet, and undercuts all of our efforts to lower greenhouse gas emissions.

Cheap gas prices depress transit ridership, encourage more driving, lead consumers to buy less efficient vehicles and more SUVs, and undercut efforts to encourage transport electrification.  US gas prices are still only about half of what they are in most European countries.
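The inflation adjustment behind that comparison is simple arithmetic: scale a past nominal price by the ratio of the current price index to the index at the time. Here's a minimal sketch; the CPI and gas price figures below are illustrative round numbers, not official BLS data.

```python
# Sketch: expressing a past nominal gas price in today's dollars using
# the CPI ratio.  The index and price values are illustrative, not
# official BLS figures.

def real_price(nominal_price, cpi_then, cpi_now):
    """Convert a past nominal price into current dollars."""
    return nominal_price * (cpi_now / cpi_then)

# e.g. a hypothetical $3.60/gallon price in 2012, with the CPI rising
# from roughly 230 then to roughly 270 now:
print(round(real_price(3.60, 230.0, 270.0), 2))  # 4.23
```

So a price that looks similar in nominal terms can be a dollar or so cheaper once inflation is accounted for.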

Must read

1. Housing policy advice for states.  Around the country, states are rethinking the policy framework they put in place that governs local land use decisions, with an eye to easing housing affordability.  Housing policy is complex, and some simple prescriptions turn out to be bad medicine.  Brookings Institution economist Jenny Schuetz has some succinct advice for state policy makers.  Here are four things to keep in mind:

  • Base policy on a clear analysis of housing market conditions
  • Look to increase supply in high demand locations
  • Provide financial support for low income households
  • Use housing policy to reduce climate risk

States have generally delegated so much discretion over land use to local governments that those governments have used it to block development in high-demand locations, worsening housing affordability, sprawl, and transportation problems, and contributing to climate change.  As Schuetz observes:  "building too few homes in places with high demand has serious economic, social, and environmental consequences for metro areas, states, and the country."  In addition to this short non-technical article, Schuetz has published a longer companion paper that reviews policies in four states and distills lessons for policymakers.

2. This is what "transit oriented development" looks like.  Many US cities are dabbling in increasing urban densities in neighborhoods well-served by transit, but for the most part the scale of such development is modest, and it frequently provokes NIMBY reactions.  Vancouver, British Columbia, which is building a major extension of the region's automated SkyTrain system, is planning a much more ambitious redevelopment of its Broadway neighborhood.  The plan envisions 30- and 40-story buildings, both residential and commercial.  The development would create a kind of second city skyline south of False Creek.

This is more than just about tall buildings:  it’s about accommodating tens of thousands of new residents and workers in a dense, walkable urban environment.

The emerging direction of the densification strategy calls for increasing Central Broadway's population by up to 50,000 to about 128,000 residents — an increase of 64% compared to 78,000 residents today. This would be achieved by growing the number of homes in the area from over 60,000 today to up to 90,000 units, with much of this intended to be more affordable forms of housing.  Added office, retail, restaurant, institutional, and creative industrial spaces would grow the number of jobs from 84,400 today to up to about 126,000 jobs. These residential and employment targets through redevelopment are for the next 30 years through 2050.

If you’re going to make a public investment in transit, it makes sense to allow and encourage this scale of development to maximize the effectiveness of the investment.  Vancouver’s example should show US cities how to think at scale.

3. Yet another induced demand explainer.  This shouldn't even be necessary at this point; the concept of induced demand is now so well-established in the scientific literature that it's called the "fundamental law of road congestion." In dense urban environments, adding more road capacity simply encourages more driving, and quickly results in more vehicle miles traveled and no improvement in traffic congestion.  But our highway departments still aren't learning, and many political leaders have yet to grasp the seemingly counter-intuitive idea that more capacity makes traffic problems worse.  Our friends at Transportation for America have a clear and simple explanation of induced demand, and link it to the recently released SHIFT induced demand calculator. They also point out that we can see induced demand simply by looking in the rearview mirror: over the past two decades, we've added more than 5,000 lane miles of interstate highway.

All the lanes we’ve built have led to a predictable increase in driving. From 1980-2017, per capita vehicle miles traveled (VMT) increased by 46 percent. In 1993, on average, each person accounted for 21 miles of driving per day in those 100 urbanized areas. By 2017, that number had jumped to 25 miles per day. Every year, Americans are having to drive farther just to accomplish the same things we did back in 1993 every day.

Building more roads doesn’t solve congestion, it just generates more driving, more pollution and more greenhouse gases.

New Knowledge

How does Airbnb affect housing markets?  Several studies have supported the notion that online short-term rental services, like Airbnb, tend to push up local rents and may displace existing residents.  But that's just a first-order effect.  Higher rents and greater demand in some neighborhoods then prompt property owners to build more housing.  A new study published in the Harvard Business Review says that the economic impacts of this induced investment can be substantial.

Concerns about the effects of short-term rentals on affordability and displacement have led many local governments to restrict or prohibit such rentals.  The variation in these policies over time and across communities provides a kind of natural experiment to see how these restrictions affect development.  The study focuses on variations in building permit activity in Los Angeles County (which has dozens of municipalities with widely varying policies).  The authors compared the changes in building permit activity in jurisdictions that restricted short-term rentals (STRs) with the changes in activity in nearby or adjacent municipalities without such regulations.  They found that building permit activity was higher in jurisdictions that didn't restrict short-term rentals.

On the sides of these borders without STR regulations, there were 9% more non-ADU permit applications and 17% more ADU permit applications than on the sides with restrictions. Clearly, demand for STRs has been driving the creation of extra housing capacity in LA, and it's been especially driving growth for housing that is suitable for home-sharing (i.e., ADUs).

The study suggests that the demand created by Airbnb prompts investors to build more housing in neighborhoods that allow short-term rentals.  Increased development expands the market for local retail businesses and adds to the property tax base and local revenues, effects that can benefit local neighborhoods.  The research suggests that we need to take a more nuanced view of short-term rentals.  Also: if we're concerned about affordability and displacement, we might want to look at the whole range of supply constraints, including zoning restrictions, which make it difficult to expand housing availability in the face of increasing demand.

Ron Bekkerman, Maxime C. Cohen, Edward Kung, John Maiden, and Davide Proserpio, "Research: Restricting Airbnb Rentals Reduces Development," Harvard Business Review, November 17, 2021

In the news

In an interview at New City, Infrastructure Fellow Mike Bloomberg highlights City Observatory’s analysis of the problems with the latest federal legislation, which we’ve labeled the “Bad Infrastructure Bill.”
Next City‘s Sandy Smith provides a nice summary of our analysis of the power of pricing to eliminate traffic congestion in Louisville, Kentucky.
StreetsblogUSA also re-published our analysis of the Louisville highway widening debacle.

 

The Week Observed, November 12, 2021

What City Observatory did this week

Has this city discovered how to solve traffic congestion?  Why aren’t they telling everyone else how this works?  A miracle in Louisville.

Actual, un-retouched photo of rush hour traffic on I-65 in Louisville.

Louisville charges a modest $1 to $2 toll for people driving across the Ohio River on I-65.  After Kentucky and Indiana doubled the size of the I-65 bridges from six lanes to 12, the tolls slashed traffic by half, from about 130,000 cars per day to fewer than 65,000.

Kentucky and Indiana wasted a billion dollars on highway capacity that people don't use or value. If asked to pay even a fraction of the cost of providing a road, half of all road users say, "No thanks," and go somewhere else or skip the trip entirely.

The fact that highway engineers aren’t celebrating and copying tolling as a proven means to reduce congestion shows they actually don’t give a damn about congestion, but simply want more money to build things.

Must read

1. In Memoriam:  Tony Downs.  Economist Tony Downs died this past week.  He famously applied economic analysis to a wide range of subjects, including elections and voting, and most especially transportation.  Downs is credited with coining the term "induced demand," a phenomenon that has since been repeatedly confirmed in a series of detailed studies.  It's now referred to as "the fundamental law of road congestion," and Downs is its intellectual father. As the New York Times wrote of his work:
Mr. Downs moved away from politics in his books “Stuck in Traffic” (1992) and “Still Stuck in Traffic” (2003), in which he postulated “Downs’s Law,” applying it to roads without tolls: “On urban commuter expressways, peak-hour traffic congestion rises to meet maximum capacity.” He attributed the congestion to what he called “induced demand.”  He argued that the best way to reduce traffic is to impose a fee, toll or other form of congestion pricing during rush-hour, an idea that has gained currency in recent years in congested cities like New York.
2. What didn't cause gentrification in San Francisco:  building market-rate housing. In a widely reported story, the San Francisco Board of Supervisors voted to deny land use approval for a 495-unit apartment building, including 125 affordable units, that would have replaced a parking lot in the city's downtown.  The ostensible reason for their opposition was fear that the project would accelerate gentrification.  Randy Shaw, a local affordable housing advocate and author of "Generation Priced Out," says the supervisors have it exactly backwards:  gentrification was caused by failing to build more market-rate housing:
Despite the Board's belief, new market rate housing has not driven San Francisco's gentrification process. . .  gentrification has overwhelmingly occurred in neighborhoods that had little to no new multi-unit development.  San Francisco began gentrifying in the late 1970's. It was entirely unrelated to new market rate housing development. In fact, the massive gentrification of San Francisco through the 1980's and again during the 90's dot com boom was accompanied by little new housing; it was instead fueled by increased housing demand pursuing a fixed housing supply.  The city's failure to build more housing to meet rising demand fueled the gentrification of once affordable neighborhoods.
It’s the acme of NIMBYism to block a market rate housing project that manages to meet a very tough 25 percent affordable housing hurdle, and claim that somehow by preserving a parking lot, you’re preventing gentrification.  This vote means more market pressure on rents because there will be several hundred fewer market rate apartments, and also fewer affordable options.  That, not building housing, will worsen affordability for everyone, and amplify displacement and, yes, gentrification.

 

The Week Observed, November 5, 2021

What City Observatory did this week

The Opposite of Planning:  Why Portland's Metro government needs to turn down the highway department request for more money to plan future freeway widenings.  On paper, and to admirers, Portland has a pretty potent regional government.  Metro is directly elected, and empowered to make important regional transportation decisions.  It's being asked by the Oregon Department of Transportation to bless spending a tranche of $36 million (out of an anticipated bill of $200 million) just to plan a revival of the failed I-5 Columbia River Crossing project (which will ultimately cost upwards of $5 billion).  We outline the reasons Metro should say "No!"; all of them are core to Metro's professed commitment to planning.

Just a portion of the plan to Super-Size I-5 at a cost of $5 billion.

The project is based on 15-year-old traffic data used to create obsolete models, and it fails to incorporate what we know about induced demand: how larger freeways simply generate more traffic and increased greenhouse gas emissions.  Despite the fact that the state and region have said they're going to use road pricing, the project's planning has failed completely to consider how pricing will affect road use, and how it may reduce or eliminate the need for added capacity.  All this spending on roads flies in the face of supposed commitments (Hello, Glasgow!) to address climate change.  Finally, the Oregon and Washington DOTs don't actually have a serious financial plan for the $5 billion cost, and to add insult to injury, they're asking Metro to divert money from maintaining existing roads to subsidizing the planning for this expansion.  It's the opposite of planning.

Must read

1. The problem with minimum bike parking requirements.  The hegemony of the automobile in urban transportation leads some to assume that, logically, promoting other modes requires somehow repeating for bikes or walking or transit the preferential policies that have favored cars. One case in point:  bike parking requirements.  Some communities now look to stimulate bike use by mandating bike parking as a condition of land use approvals. While superficially that may seem fair, New American Planning argues that the trouble is that the connection between biking and particular land uses varies substantially from place to place, and that we don't have a valid baseline for judging how much bike parking is really needed or useful.

2. A detailed picture of segregation in Philadelphia.  Last month, we ranked the nation's 50 large metropolitan areas based on 2020 data on their racial segregation.  Philadelphia is one of the ten most segregated metro areas by that measure. The Philadelphia Inquirer has an impressive and detailed piece of data journalism exploring the geography of segregation in the City of Brotherly Love.  The article has copious maps showing concentrations of different racial and ethnic groups.
It also has a lucid and well-illustrated explanation of the "dissimilarity index" commonly used to measure segregation and to compare levels of segregation over time and across cities.  The Inquirer piece is a model exploration of segregation at the city level.
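The dissimilarity index itself is straightforward to compute: it is half the sum, across neighborhoods, of the absolute difference between each neighborhood's share of one group and its share of the other. A minimal sketch, using made-up toy tract counts rather than real Census data:

```python
# Sketch of the Black-white dissimilarity index:
#   D = 1/2 * sum_i | b_i/B - w_i/W |
# where b_i, w_i are tract counts and B, W are citywide totals.
# The tract counts below are made-up toy numbers for illustration.

def dissimilarity(black_counts, white_counts):
    B, W = sum(black_counts), sum(white_counts)
    return 0.5 * sum(abs(b / B - w / W)
                     for b, w in zip(black_counts, white_counts))

# Complete segregation (each group entirely in its own tract) gives 1.0:
print(dissimilarity([100, 0], [0, 100]))      # 1.0
# Perfectly even distribution gives 0.0:
print(dissimilarity([50, 50], [150, 150]))    # 0.0
```

The index is conventionally read as the share of one group that would have to move to a different tract to equalize the two groups' distributions.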
3. Our self-imposed scarcity of nice places.  Writing at Strong Towns, Daniel Herriges takes a close look at how the commonplace workaday neighborhoods of the 19th Century have become some of the most desirable and expensive places to live in any American city.  He notes that when they were built, streets like Milwaukee Avenue in Minneapolis were solidly working class.  Today, with their density, walkability and assortment of mixed use buildings, they now command top dollar in the real estate market.  It isn’t just the historic quality or quaintness of these neighborhoods that gives them their high value:  it is that they are in short supply.  For decades, all we’ve built is sprawling, low density auto-dependent places, with the result that the few remaining walkable places have a scarcity value.  In part, as we’ve argued at City Observatory, that’s because we’ve made it illegal, via zoning codes and parking requirements, to build this kind of neighborhood.
Herriges also makes an important point about recent "New Urbanist" efforts to replicate these design principles in purpose-built new communities like Seaside, Florida.  Critics have observed, correctly, that such communities have become expensive and elite.  But as Herriges points out, that's not inherent to their design; it's, again, because such places are in short supply.  Developers of new urbanist places actually find that the public infrastructure is less expensive, largely because development is more compact.  The fact that dense, walkable places–whether old or new–command high prices isn't a signal that they're somehow elitist, but rather a clear market sign that we need to increase their supply much more aggressively to meet unmet demand.

New Knowledge

Cities grew and became more diverse over the past decade.  Brookings Institution demographer Bill Frey looks at trends in population growth and in the racial and ethnic composition of large US cities based on the latest 2020 Census data.  In general, over the past decade, large US cities added more residents, and also became more diverse.  Of the 50 largest cities, 46 gained population in the decade between 2010 and 2020, and population growth in these cities was collectively faster in the past ten years (8.5 percent) than in the ten years from 2000 to 2010 (5.6 percent).  In addition to accelerating population growth, large cities led the nation in increased diversity.

Over the past two decades, the share of the population that is Non-Hispanic white in large cities has declined from about 42 percent to 36 percent.  In the aggregate the 50 largest cities are now a little more than one-third Non-Hispanic White, a little less than one-third Hispanic, about 20 percent Black and about 10 percent Asian. While the trend of increasing diversity is playing out nationally, these large cities are considerably more diverse than the nation as a whole.

Not surprisingly, increasing racial and ethnic diversity shows up most in the younger residents of cities.  Among younger age cohorts, the fraction of the population that is Black, Hispanic or Asian is higher than it is for older age cohorts.

It is important to emphasize that these are data for the population inside the city limits of the largest cities.  As we’ve frequently noted at City Observatory, municipal boundaries capture widely varied portions of their metro areas; some large cities encompass much of their region, including lower density suburban style development; while in other places, the largest city is just the higher density, older urban core.

William H. Frey, 2020 Census: Big cities grew and became more diverse, especially among their youth,
Brookings Institution Report, October 28, 2021

In the News

Streetsblog republished our essay calling out the one place where socialism flourishes in America:  in parking.

Strong Towns republished our commentary “Ten reasons not to trust your state DOT claims about highway widening.”

 

The Week Observed, October 22, 2021

What City Observatory did this week

America’s least and most segregated metro areas:  Evidence from Census 2020.  Racial segregation remains a chronic problem in US metropolitan areas.  Data from Census 2020 provides a hyper-detailed, decadal check-in on the state of segregation.  The good news is that Black-white segregation continues, slowly, to decline in virtually all US metro areas.  The bad news is that it is still at relatively high levels.

We use data from the Brown University Diversity and Disparity project to rank the 53 most populous US metro areas by their Black-White dissimilarity index.  The most segregated large metros tend to be in the Northeast and Midwest (Milwaukee, Detroit, New York and Chicago).  Among the nation's least segregated metro areas:  Tucson, Salt Lake City and Portland.  Some cities have made more progress than others:  Portland has gone from more segregated than the typical metro area in 1970 to one of the three least segregated today.

Must read

1. An induced demand calculator for every state and metropolitan area.  Over the past decade, a series of research studies have firmly established the science behind "the fundamental law of road congestion."  Widening un-priced roads in urban areas does nothing to relieve congestion, because the added capacity generates additional peak hour trips in almost exact proportion to the increase in capacity.  Following on pioneering work by Jamey Volker, Amy Lee and Susan Handy at UC Davis, the Rocky Mountain Institute, working with a series of other organizations, has adapted the California induced demand calculator to metro areas and counties across the nation.  Here's a sample output for a 12 lane-mile freeway widening in Portland:

The calculator allows users to estimate the increase in vehicle miles traveled and the added greenhouse gas emissions from highway expansion projects in any state.  (Full disclosure:  City Observatory participated in reviewing this calculator tool.)
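Under the hood, such calculators rest on a simple elasticity relationship: if the elasticity of VMT with respect to lane-miles is about 1.0, as the induced-demand literature suggests, each added lane-mile eventually generates roughly as much traffic as an existing lane-mile carries. A rough sketch of that arithmetic, with a made-up illustrative per-lane-mile VMT figure (not output from the SHIFT tool):

```python
# Sketch of the core induced-demand arithmetic: new VMT scales with
# added capacity.  The elasticity and per-lane-mile traffic volume
# are illustrative assumptions, not values from the SHIFT calculator.

ELASTICITY = 1.0  # "fundamental law": VMT rises roughly 1:1 with lane-miles

def annual_induced_vmt(added_lane_miles, vmt_per_lane_mile_per_year):
    """Estimate the annual vehicle miles of travel induced by a widening."""
    return ELASTICITY * added_lane_miles * vmt_per_lane_mile_per_year

# e.g. a 12 lane-mile widening, where existing lane-miles each carry
# about 10 million vehicle miles per year:
print(f"{annual_induced_vmt(12, 10_000_000):,.0f}")  # 120,000,000
```

Multiplying that induced VMT by a grams-of-CO2-per-mile factor is how such tools translate widenings into added emissions.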

2. DOT traffic forecasts are a deceptive, unscientific black box.  State DOTs routinely use traffic forecasts to build a case for highway expansions, and to portray their negative environmental consequences as minor.  But traffic models are complex, opaque, and regularly manipulated.  This is readily apparent in Maryland, where the state's DOT maintained just a few months ago that one highway alternative would produce gridlock; now that its preferred alternative hasn't been selected, it has a new set of model runs saying just the opposite.  Freeway-widening opponents in Maryland have called on the US Department of Transportation to withdraw the project's Supplemental Environmental Impact Statement, which relies on these baldly manipulated forecasts.  The estimable Ben Ross of the Maryland Transit Opportunities Coalition explains:
The state's computer model claims that widening the Beltway in Bethesda will get rid of traffic jams on US 50 to Annapolis. And it says that feeding three more lanes of cars into the awful Beltway-270 merge at Wisconsin Avenue will reduce traffic there. This model lacks all credibility and is not a legitimate basis for public decision-making. When federal and county planners initially suggested widening the American Legion Bridge and I-270 without adding lanes at the Wisconsin Avenue merge, the Maryland Dept. of Transportation rejected the idea and insisted it would create a new bottleneck at the merge point. But then the political winds reversed, and the new bottleneck somehow vanished. How can anyone who drives on the Beltway believe this model?
This kind of opaque, black-box modeling that invariably produces results that confirm state DOT biases is the opposite of the open, science-based process called for by the National Environmental Policy Act.  Reliance on these back-room models undercuts the ability of the public to understand, question, and when necessary challenge these projects.  The USDOT needs to crack down on this abuse and manipulation.
3. Are historic districts the new exclusionary zoning?  The mechanisms by which communities and neighborhoods achieve and enforce racial segregation have steadily morphed over time.  A century ago, segregation was explicit policy or widely acknowledged fact, with different racial groups limited to specific neighborhoods by zoning, covenants and steering by real estate agents.  As courts and civil rights laws struck down more explicit measures, racial discrimination has been achieved by more indirect methods, like single family zoning.  As the YIMBY movement pushes to eliminate apartment bans and legalize missing middle housing more broadly, there remains the risk that racial and economic segregation will be perpetuated by other, still more subtle means.  In Portland, Sightline's Michael Andersen takes a close look at how federal historic designations for neighborhoods could end up perpetuating segregation, unless the city constructs its policies carefully.  Andersen warns that in Portland:
. . . a series of interlocking national, state and local rules will give a group of landowners in an area the ability to essentially override zoning without sign-off from a single elected official.
This article helpfully outlines some relatively minor tweaks that could avoid this outcome.  As cities around the nation wrestle with expanding housing supplies and easing restrictive local rules, they need to be wary that exclusionary practices will just adopt new forms.

New Knowledge

Crime rates went down 20 percent during the pandemic.  The sensational news of the past year or so has been a spike in homicide rates during the pandemic.  The FBI reports that the total number of murders in the US grew an unprecedented 30 percent between 2019 and 2020, to 22,000.  That’s led many observers to conclude we’re in a new era of rising crime.

But it appears that the rise in homicides—though tragic and alarming—is not indicative of a rise in crime, or even violent crime, in general.  Data from the National Crime Victimization Survey, which has been collected on a consistent basis for more than three decades, show that virtually every category of violent crime declined in 2020 compared to 2019.

  • The violent victimization rate declined from 21.0 per 1,000 persons age 12 or older in 2019 to 16.4 per 1,000 in 2020.
  • The number of violent crimes, excluding simple assault, fell from 2.0 million in 2019 to 1.6 million in 2020.
  • The number of burglary and trespassing victimizations declined from 2019 (2.2 million) to 2020 (1.7 million).
  • About 40% of violent victimizations and 33% of property victimizations were reported to police in 2020.

The victimization survey data are a particularly important measure of crime, because not all crimes are reported to police, and variations in reporting over time and across jurisdictions make crime report data somewhat noisy.  It's hard to know how to interpret this decline.  With the pandemic, 2020 was an unusual year: stay-at-home orders and caution prompted many of us to refrain from leaving our homes and participating in a wide variety of social activities.  It may well be there was less crime because fewer people were out and about.  There are good reasons to be concerned about the increase in homicides, but clearly we need a much more nuanced look at crime to understand what's going on than headline alarms about a general surge in crime.
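The headline figure is just the percent change in the victimization rate quoted above, from 21.0 to 16.4 per 1,000 persons, which works out to roughly a 22 percent decline:

```python
# Quick check of the headline arithmetic: the drop in the violent
# victimization rate from 21.0 to 16.4 per 1,000 persons (the figures
# quoted above) is about a 22 percent decline.

def pct_change(before, after):
    """Percent change from an initial value to a later value."""
    return (after - before) / before * 100

print(round(pct_change(21.0, 16.4), 1))  # -21.9
```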

Rachel E. Morgan, Ph.D. and Alexandra Thompson, Criminal Victimization, 2020, Bureau of Justice Statistics, NCJ Number 301775, October 2021

 

The Week Observed, October 15, 2021

What City Observatory did this week

Ten reasons you can’t trust DOT claims that widening highways reduces pollution.  Highway departments are fond of ginning up traffic projections and air quality analyses claiming that wider highways will reduce pollution.  It’s an elaborate con.  We take a close look at Portland’s proposed $1.2 billion I-5 Rose Quarter freeway widening project and itemize the ten reasons why no one should believe claims made about its environmental effects made by the Oregon Department of Transportation.

I-5 is now 83 feet wide; ODOT plans to widen it to 160 feet, enough for a ten lane freeway.

Must read

How (not) to lessen the nation’s housing affordability challenges.  Politico examines the debate in Congress over proposed measures to give assistance to first-time home buyers to help them better compete for increasingly expensive homes. Several measures aim to lessen the racial wealth gap by providing down payment assistance to first-time, first-generation home buyers.  This approach acknowledges that the cumulative effect of historic discrimination in the housing market disproportionately affects buyers of color today:  If your parents couldn’t manage to buy a home or if they ended up in a neighborhood with depressed home values, they couldn’t do much to help their children with a down payment.  While there’s a great need to redress these historic inequities, one of the challenges with a sizable aid program for first time buyers is that it simply adds to the demand for housing, while doing little to increase supply.  More capital available for housing in this market is likely to drive up home prices even faster; we may improve affordability for a few, but may worsen it for the many.

Peter Norton on the perils of the self-driving car.  Historian Peter Norton has chronicled the industry-led re-writing of the literal and figurative "rules of the road" by automobile manufacturers a century ago.  This re-write elevated autos and reduced those on foot to second-class citizens. The process was fed by a combination of technological optimism and a sense of manifest destiny which, Norton says, the purveyors of self-driving cars are looking to repeat to advance and enshrine autonomous vehicles on the nation's streets.  In an interview with Bloomberg's David Zipper, Norton warns that the next new thing is always a distraction from really addressing transportation needs.

High-tech “solutions,” always just over the horizon, are supposed to offer the anticipated deliverance. The lack, however, lies not in technology but in the aspiration itself. Meanwhile the supposed solutions, in promising an eventual end to all our afflictions, divert us from transport sufficiency: an unspectacular state in which everyone can meet their practical needs.

A "must-watch":  Why building more housing is critical to addressing climate change.  CalYIMBY has a new video making the case that building more housing, especially in the right locations, is essential to tackling the climate crisis.  As the video's authors explain, sprawling single-family development and apartment bans lead to long commutes and more driving, making our climate crisis worse:
For decades, planners and climate experts have known that sprawl-style, single-unit housing development leads to more pollution from cars. The reason is simple: Single-unit houses require more land than multi-family homes, and end up forcing their residents into ever-longer car commutes. The more people have to drive, the more climate pollution they cause.

New Knowledge

David Card wins the Economics Nobel.  Berkeley economist David Card was one of three labor economists who received the Nobel award for economics this week.  Along with Card, Joshua Angrist and Guido Imbens were recognized for their pioneering empirical analyses of labor markets.

Card and his late co-author, Alan Krueger, were joint authors of a series of studies of the effect of minimum wages on employment, famously collected in the 1995 book “Myth and Measurement.”  It’s an appropriate title.  The notion that a minimum wage necessarily had to lower employment levels was taken as an article of faith by many, if not most, economists.

Card and Krueger put that theory to the test with a series of careful analyses of quasi-experimental situations, for example, looking at the variation in employment rates in adjoining states, one of which raised its minimum wage while the other didn’t.  Their work was the initial challenge to the received wisdom about the negative effects of minimum wages on employment; it’s subsequently been buttressed by a wide range of studies that have applied and extended Card and Krueger’s empirical techniques.  As Bloomberg’s Noah Smith writes:

Card and Krueger (who sadly died before he could receive the prize) examined a 1992 minimum wage hike in New Jersey, and found that it didn’t result in a loss of jobs. They compared New Jersey to neighboring Pennsylvania, and found no job loss. They compared high-wage restaurants in New Jersey to low-wage restaurants — still, no job loss. Maybe even a little bit of job gain, actually.

Nowadays, that’s hardly an unusual finding — indeed, it’s the norm, and economists have changed their outlook on the issue as a result. But back then, it was almost heresy. The basic theory of competitive supply and demand says that when you raise the minimum wage, people get thrown out of work!
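The logic of the Card-Krueger comparison can be sketched as a simple difference-in-differences calculation: take the change in employment in the state that raised its wage, subtract the change in the adjoining state that didn't, and shocks common to both states net out. The numbers below are purely illustrative (not the study's actual estimates); they exist only to show the arithmetic.

```python
# A minimal difference-in-differences sketch in the spirit of the
# Card-Krueger design. All employment figures here are hypothetical.

employment = {
    # (state, period): average full-time-equivalent employment per store
    ("NJ", "before"): 20.4,   # NJ raised its minimum wage in 1992
    ("NJ", "after"):  21.0,
    ("PA", "before"): 23.3,   # PA, next door, did not
    ("PA", "after"):  21.2,
}

# Change within each state over the same period
nj_change = employment[("NJ", "after")] - employment[("NJ", "before")]
pa_change = employment[("PA", "after")] - employment[("PA", "before")]

# The diff-in-diff estimate nets out shocks common to both states;
# a positive value means employment rose in NJ relative to PA
# despite the minimum wage hike.
did_estimate = nj_change - pa_change
print(f"NJ change: {nj_change:+.1f}, PA change: {pa_change:+.1f}")
print(f"Difference-in-differences estimate: {did_estimate:+.1f}")
```

The key design choice is that the untreated neighbor serves as the counterfactual: whatever regional recession or boom hit both states is subtracted away, isolating the effect of the policy change.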

The empirical observation has triggered a shift in the theoretical analysis of labor markets:  rather than being perfectly competitive, they are quasi-oligopolistic, with firms having market power over wages, which the minimum wage counterbalances.  That’s not to say that no minimum wage could ever cost jobs, but within some range—now easily exceeding $15 almost everywhere—the minimum wage isn’t a job killer.

There is an urbanist side note here as well:  The University of California, Berkeley awards its Nobel Laureates free lifetime parking spots on campus.  Card, however, bikes to work.  Perhaps this is an opportunity for another economist, UCLA’s emeritus professor Don Shoup, to come to the rescue and suggest a suitable solution.  And Shoup is another scholar the Nobel committee ought to have its eye on, for his painstaking application of economic principles to the mundane field of parking.

In the news

Blogger Sam Sklar responded to our commentary “Where we embrace socialism in the US.” While he agrees with the thrust of the commentary, he’s skeptical most Americans will recognize that our attitudes about “free parking” are fundamentally socialist.

The Week Observed, September 24, 2021

What City Observatory did this week

Freeway-widening grifters: Woke-washing, fraud and incompetence.  The Oregon Department of Transportation has been trying to sell its $1.25 billion freeway widening project as a way of restoring the historically Black Albina neighborhood that was decimated by three highways the agency built in the 1950s, 1960s and 1970s.  It’s absurd on its face to suggest that making a freeway wider—and increasing traffic and pollution—will help the neighborhood, but ODOT has doubled down on that phony claim with a “woke-washed” media campaign that depicts a rebuilt neighborhood with new housing, a career center, and other development.  A newly released brochure doesn’t show the plan to widen the freeway to as many as ten lanes, or even any cars, but does depict imaginary buildings which are not actually part of the project.  The fictional career center even carries a plaque reading:  “constructed by Black artisans in 2022.”

The trouble is that the project’s budget contains no funding for any of these highlighted amenities.  It’s a cynical, misleading attempt to sell a freeway.  Plus, a four-page, full-color mailer that ODOT sent to thousands of Portland households is studded with dozens of typographical errors.

Must read

1. You don’t have to be afraid of public transit. The coverage of public transportation in the United States is constantly intertwined with people’s fears. From crime to homelessness to illness, public transit is typically presented as dangerous and undesirable. Is this unappealing image a reality? Vice’s Aaron Gordon thinks not. Gordon argues, “Before we can fix American public transportation, we need to fix the way we talk about it.” This article debunks the incorrect, poorly constructed arguments against public transportation in the media. Gordon breaks down the lack of credible sources and the misguided framing of transit in the media and politics. Gordon writes,

“Mass transit is a window into the American psyche . . .  It is a subject on which facts always seem secondary to narratives, a service covered through the psychology and experiences of those who don’t and won’t use it, or who aspire to never have to use it again, rather than those who do and must, or heaven forfend even prefer it to driving.”

When it comes to public transportation, narratives hold the power. The most dominant one discourages riders and promotes a misguided sense of “danger” on mass transit. Gordon is fighting for a change in the narrative, advocating for one that removes the Boogeyman from transit nationwide.

2. The high cost of transportation in the United States. The average American household devotes roughly 13 percent of its expenditures to transportation, spending $9,737 annually, with the bulk of that going to car ownership and maintenance. In this blog post, the Institute for Transportation and Development Policy explores how these expenditures are distributed across income levels. As the income quintile rises, the share of expenditures going to transportation falls, which means the lowest income bracket pays the largest portion of its budget for transportation. So while, on average, Americans spend about 13 percent of household costs on transportation, the lowest quintile spends 29 percent. For lack of affordable transportation options, low income households are forced to spend their money on personal vehicles and the extensive costs that come with them. Europe’s transportation system, by contrast, is different: plentiful transit, denser land use, and tax policies create a more efficient, equitable cost structure, the share of expenditures going to transportation rises with income, and the revenue it generates is channeled into public transportation. If the United States wants to reduce the financial strain on its poorest citizens, we need to focus on creating public transit that supports all people.
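The expenditure-share arithmetic above is easy to check. In this sketch, the $9,737 figure and the 13 and 29 percent shares come from the post as cited; the $30,000 lowest-quintile expenditure budget is a hypothetical number introduced purely for illustration.

```python
# Back-of-the-envelope check of the spending figures cited above.
# The dollar figure and percentage shares come from the text; the
# implied totals are derived here, not taken from the ITDP post.

transport_spending = 9_737        # average annual household transportation cost ($)
avg_share = 0.13                  # transportation share of expenditures, all households
lowest_quintile_share = 0.29      # share for the lowest income quintile

# Implied total household expenditures for the average household
implied_total = transport_spending / avg_share
print(f"Implied average household expenditures: ${implied_total:,.0f}")

# If a lowest-quintile household has a (hypothetical) $30,000
# expenditure budget, a 29% share translates to:
low_income_budget = 30_000
low_income_transport = low_income_budget * lowest_quintile_share
print(f"Transportation cost at 29% of a $30,000 budget: ${low_income_transport:,.0f}")
```

The derived total (roughly $75,000 in average household expenditures) is only what the two cited numbers jointly imply; actual Consumer Expenditure Survey totals may differ.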

3.  Incentives are important tools for policymakers and entrepreneurial firms to uplift local businesses and support local economic development. Recently, Smart Incentives published a new Kauffman Foundation-funded report and a technical appendix detailing the existing state of business incentives, highlighting successes and failures and charting a path toward better policy. Here, Darrene Hackler presents three recommendations for maximizing the impact of incentives. The first improvement starts with policy design: strategic planning and comprehensive research are needed to create effective incentives. The most effective programs are part of a larger economic development strategy, one that identifies and tackles the needs of the local ecosystem. The second level of improvement is program management and implementation: proper administration and guidance throughout the incentive program lead to better results. The final level is monitoring and assessment: policymakers should establish data and research standards so they can learn from a program and compare its results to similar efforts.

New Knowledge

Revising the history of red-lining. Redlining, the practice of not making mortgage loans in low income and predominantly minority neighborhoods, is part of a widely reported history of segregation in the US. There’s a kind of “just-so” story that points to maps drawn in the late 1930s by HOLC, the federal government’s Home Owners’ Loan Corporation, which rated neighborhoods in cities across the nation as “high risk” (red) or “low risk” (green). It’s often assumed that these HOLC maps were a prime reason for segregation and lending discrimination. But a new paper makes it clear that the reality is more complex.

HOLC was just the first of two major housing programs established as part of the New Deal, and was a temporary measure to refinance existing home mortgages after private lending collapsed. It was augmented by a permanent agency, the Federal Housing Administration (FHA), which guaranteed new home mortgages. While HOLC did all of its lending in the mid-1930s, FHA has continued to operate. Key features of FHA lending, particularly the restriction of new home mortgages chiefly to suburban areas, led to charges that the agency was discriminating against people of color and low income households. There’s clear substance to these charges, but FHA systematically destroyed its records and maps of lending activity, leaving present day scholars with scant data on the actual geographic patterns of its lending. The HOLC maps, on the other hand, famously survived, and have been digitized for use by researchers.

It’s common to conflate HOLC and FHA in telling the tale of systemic housing discrimination, but Fishback and his co-authors cast doubt on the claims that HOLC played a key role. After examining local records of mortgage activity in three cities, they found that HOLC refinanced loans to black homeowners in rough proportion to their representation in the local population. In comparing HOLC maps and lending activity by HOLC and FHA, the authors find that even “redlined” areas had substantial HOLC lending activity. In contrast, FHA did largely exclude low rated neighborhoods from lending.

In addition, HOLC’s maps were created after the agency had issued nearly all of its mortgages. And finally, the maps themselves, though public today, were kept confidential by the agency at the time.

It’s clear that housing discrimination was rampant in US cities in the 1930s, but it didn’t begin with, nor was it limited just to these two agencies. Homeowners, realtors, city officials, banks and others widely practiced segregation and lending discrimination, and the maps that HOLC drew were more an illustration of extant discrimination than its cause.

Price Fishback, Jonathan Rose, Kenneth Snowden & Thomas Storrs, New Evidence on Redlining by Federal Housing Programs in the 1930s, National Bureau of Economic Research Working Paper No. 29244, September 2021.

In the news

City Observatory’s Joe Cortright was a featured guest on KGW-TV’s “Straight Talk” public affairs program, debating the merits of the proposed $5 billion effort to revive the Columbia River Crossing.

The Week Observed, September 3, 2021

What City Observatory did this week

Portland’s Clean Energy Fund needs accountability.  Portland voters approved a ballot measure creating a $60 million annual fund to invest in community-based clean energy projects, particularly ones that promote equity.  It’s a well-intended program, but in practice the review process that’s been developed does too little to establish measurable results, either in reducing greenhouse gas emissions, or improving the lot of historically disadvantaged communities.  We look at the critique of the program offered by former Portland City Commissioner Steve Novick and state greenhouse gas commission chair Angus Duncan.

Purple mountain tragedy. Oregon’s iconic Mt. Hood sees its snowpack disappearing.

Must read

1. Paul Krugman connects the dots: Urbanism, housing shortages and a polarized nation.  California, in microcosm, represents the key challenges of urbanism in a knowledge economy.  The state has flourished because it’s created strong industry clusters and attracted (and educated) large numbers of knowledgeable workers.  In the past, the growth of such industries has produced wide benefits, but in California, access to those benefits is limited by the state’s highly constrained housing supply–the product of NIMBY land use policies.  As a result, access to opportunity is cut off for many.  As Krugman writes:

“. . . what we see in California is that while highly educated workers are moving in to serve the tech boom, less-educated workers are moving out.”

Working-class families should be able to share in the region’s economic success. Yet, with housing costs rising, these families are being driven out.  In turn, this pattern of migration is amplifying the blue/red polarization of American communities. Krugman argues that increasing housing density in California could broaden access to opportunity. Building more housing helps create diverse, healthy cities. Housing is at the core of the Golden State’s problems.

2. What if we applied scooter rules to cars?  Streetsblog’s Kea Wilson has a devastatingly on-point critique of the profound double standard at work in US transportation policy.  Tiny electric scooters–a relatively new transportation innovation on urban streets–are subjected to a range of technological and regulatory standards far in excess of those applied to the much deadlier and more numerous two-ton metal boxes that dominate US streets.  Scooters have built-in speed limiters, and also have “geo-fencing” that keeps them out of some areas (and regulates their speed still further in others).  Many even have built-in technology to prevent drunk operation, and cities and states have strict weight and size limits.  How much safer would the US transportation system be if we applied even a fraction of this level of regulation and scrutiny to the vehicles that kill or maim tens of thousands of people each year?

3. Are students an environmental menace? Airports, highways, and college classrooms are all in the same boat when we consider environmentally damaging infrastructure. One of these is not like the others, yet, according to the California Environmental Quality Act, they all fit in the same category. A lower court judge recently ruled that “increases in student enrollment above the current enrollment level at UC–Berkeley could result in an adverse change or alteration of the physical environment.” Siding with a local group, Save Berkeley’s Neighborhoods, the court halted the university’s ability to increase enrollment. Now, the university must reassess the ecological cost of its students.  Slate’s Henry Grabar writes on this decision and talks to members of the Berkeley community on both sides of the debate. The author illustrates how this controversial decision is tied to the NIMBYism of Berkeley residents. He writes, “the ‘environment’ protected by a university enrollment freeze isn’t the atmosphere, or ecologically sensitive habitats, or the wildlands of urban sprawl. It’s the front lawns and quiet nights of Berkeley homeowners.” Are college students really an environmental burden on the same scale as highway construction? What benefit does limiting the number of diverse students at a prestigious university do for our climate goals?  The final irony, of course, is that the CEQA process doesn’t ask what would be the environmental impact of not admitting more students to Berkeley; because the university is denser and better served by transit than the alternatives, it’s likely that the total environmental impact of educating these students would be less than having them go to college elsewhere.

New Knowledge

Cities, clusters and knowledge:  Why the pandemic likely is reducing innovation and productivity.  Berkeley’s Enrico Moretti has written several papers documenting the economic gains we get from urbanization (and the economic costs we experience from restricting the growth of the most successful, productive cities).  Wired drew attention to a new paper from Moretti looking specifically at the role of cities, and the close collaboration and interaction that they facilitate, in stimulating innovation.

Moretti’s paper, “The Effect of High-Tech Clusters on the Productivity of Top Inventors” looks specifically at patenting activity as an indicator of innovation.  Inventors of all sorts are more inventive if they’re in a milieu with lots of other inventors.  Moretti looks at a variety of measures of how patenting changes with the size of the local industry cluster:   inventors in larger clusters tend to be more productive, on average, than those in smaller clusters, and importantly, those who move to larger clusters become more productive.

As the Wired article summarizes:

. . . people in the tech and biomedical sectors who move from a small industry cluster to a larger one like those in Silicon Valley, Seattle, or Austin become 12 to 14 per cent more productive, Moretti’s research shows. They become more productive almost immediately after they move, and they sustain their productivity for several years afterwards.

The converse is also true:  when a cluster shrinks, the inventors in the cluster become, on average, less productive.  Moretti looks at the decline in patent productivity in Rochester, New York, following the decline of the photo-optic cluster anchored by Kodak.

All this has implications for remote work in the wake of Covid:  If researchers distant from strong clusters are less inventive, that’s likely to make them and the firms that employ them less successful.  In contrast, cities with strong local clusters of businesses are likely to generate more novel ideas, and be more successful over time.  As we’ve argued at City Observatory, the competitive aspect of lower remote work productivity isn’t apparent when everyone has to work at a distance, but as more and more of us return to the office, or to the lab, those who work remotely (and the firms that employ them) will increasingly be at a disadvantage.

Enrico Moretti, The Effect of High-Tech Clusters on the Productivity of Top Inventors, University of California, Berkeley, March 2021.

 

The Week Observed, August 27, 2021

What City Observatory did this week

Is the campus 100 percent clean energy?  (Only if you don’t count the cars and parking lots).  Stanford University announced that it’s close to realizing a goal of moving all of its campus electricity to solar production, and that predictably generated a lot of positive press, some of which made the more sweeping claim that all of Stanford’s energy was now sustainable.

The university’s statement was more circumspect, and the campus sustainability plan notes that it still has a lot of work to do to deal with so-called “scope 3” emissions (all the greenhouse gases that come from people traveling to and from campus on university business).

Must read

1. Why we need to reduce driving to address climate change.  In an essay for the Rocky Mountain Institute, Brian Yudkin, Duncan Kay, Jane Marsh and Jackson Tomchek argue that new policies to change land use and encourage more active transportation are central to reducing driving and greenhouse gas emissions from cars.  The essay neatly recites the facts about the central role driving plays in climate change, and sketches out policy options, especially promoting urban infill development and transit-oriented housing, as well as pricing road use and stopping investments in new highway capacity. They push back directly on the predictable questions about equity, noting that we routinely overlook the inherent inequity of our current car-dependent transportation system.

Transportation investments overwhelmingly favor the personal vehicle over public transit and other non-drivers despite the oversized environmental and social costs associated with driving. Further, because the federal government uses tax dollars to fund transportation, non-drivers are essentially subsidizing drivers.

2. Is Uber burning through its cash reserves?  Self-proclaimed ride-hailing “disruptors” Uber and Lyft have made a mark on urban transportation over the past several years.  The big question going forward is whether they have viable financial models. Analyst Hubert Horan has sifted through the two companies’ latest financial reports, and concludes that neither is profitable, and both are quickly burning through the cash generated by their initial public offerings.  Uber has tried, during the pandemic, to pivot to food delivery, but with no apparent positive effect on profitability.  Horan calculates that Uber had a negative 38 percent profit margin on sales in the first half of 2021.  He goes on to argue:

Uber’s operating crisis has seriously, perhaps fatally, undermined the narrative that its stakeholders had accepted for so many years. Customers have begun to doubt their longstanding view that they could rely on Uber to provide service at good prices whenever they wanted.  If current prices persist, they will begin to realize that Uber now costs more than the traditional taxi companies they drove out of business. . . Uber had never “disrupted” urban car service economics. It can no longer provide the subsidies to keep drivers and customers happy. As noted, Uber’s cash position has fallen by over $8 billion since the 2019 IPO. While it still has $6 billion on hand, a company that has lost $28 billion in the last 5 ½ years cannot expect to be able to raise significant new equity . . .

The flood of investor cash that has gone into ride hailing has propped up its adoption and growth, but it’s far from clear that, at least at current pricing levels, it’s sustainable.

New Knowledge

Evaporating evidence for the existence and importance of “food deserts.”  Few ideas have gotten more traction more quickly than the idea of “food deserts”:  the notion that some communities, especially the poor and people of color, suffer from poor nutrition because of a lack of nearby grocery stores.  The “food desert” metaphor has simple, powerful imagery, but increasingly the evidence for this thesis is evaporating under closer scrutiny.  Scholars have already challenged the purported link between nutrition and store proximity; a new report from the Brookings Institution takes a closer look at some of the problematic analyses of geographic proximity.
Geographic analyses often count the number of stores in nearby neighborhoods and use this metric as a proxy for food access. But as George and Tomer illustrate, this often misses the fact that nearby grocery stores may be technically in a different neighborhood, a problem compounded by the fact that census tract boundaries often follow major arterials, and stores are located right on the boundary between neighborhoods.
A related problem is that census tracts, commonly used to define neighborhoods, are geographically much smaller in cities than they are in suburbs or rural areas.  In a city, a tract might include only a few dozen blocks, while a suburban tract would incorporate many square miles, overstating the proximity of stores in suburbs relative to dense urban neighborhoods.
George and Tomer also cite USDA survey data showing that few American consumers (regardless of income or car ownership) shop at the nearest store; the average trip to a store is about 2-3 miles.  Looking only at the nearest store, or even the immediate neighborhood, doesn’t capture actual shopping choices.
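The boundary problem the report describes can be shown with a toy calculation: a store a tenth of a mile away but just across a tract line is invisible to a tract-based count, while a simple distance-based count picks it up. All coordinates, tract labels, and store names below are invented for illustration; this is not the authors' method, just a sketch of the measurement issue.

```python
# A toy illustration of the proximity-measurement problem: counting
# only stores inside a household's own census tract can miss a store
# sitting just across the tract boundary. Units are miles.

from math import hypot

stores = [
    {"name": "Store A", "tract": "100", "x": 0.9, "y": 0.0},
    {"name": "Store B", "tract": "200", "x": 1.1, "y": 0.0},  # just over the line
]
household = {"tract": "100", "x": 1.0, "y": 0.0}

# Tract-based count: only stores sharing the household's tract
tract_count = sum(s["tract"] == household["tract"] for s in stores)

# Distance-based count: stores within a half-mile, ignoring tract lines
radius = 0.5
near_count = sum(
    hypot(s["x"] - household["x"], s["y"] - household["y"]) <= radius
    for s in stores
)

print(f"Stores in same tract: {tract_count}")       # 1
print(f"Stores within {radius} miles: {near_count}")  # 2
```

Both stores are a tenth of a mile from the household, yet the tract-based metric reports half the access the distance-based metric does, which is exactly the kind of distortion that grows when urban tracts are small.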
While the ability to map purported “deserts” captures the eye and the imagination, George and Tomer remind us that income and financial security, not store proximity, are by far the bigger factors affecting nutrition and health for most Americans.  Nearby stores are of little use if you don’t have the income to afford healthy food.  As the authors succinctly put it:
. . .  food deserts are a red herring in terms of ending food insecurity in the United States. As the USDA stated bluntly in a 2014 study of Supplemental Nutrition Assistance Program (SNAP) participants: “Geographic access to food was generally not associated with the percentage of households that were food insecure.” Even with perfect, universal access to food retailers, millions of Americans would not be able to afford enough food, or enough of the kinds of food, to meet their household’s needs.
The food desert is a graphically compelling “just so” story that over-simplifies the food security challenge, and diverts attention from more fundamental causes.  As the Brookings authors conclude, we need a new approach to thinking about this problem.
Caroline George and Adie Tomer, Beyond ‘food deserts’: America needs a new approach to mapping food insecurity, Brookings Institution, August 17, 2021. https://www.brookings.edu/research/beyond-food-deserts-america-needs-a-new-approach-to-mapping-food-insecurity/

In the News

Slate’s Henry Grabar quotes City Observatory’s Joe Cortright in his article, “The perverse reason it’s easier to build new highways than new subways.”

The Week Observed, August 20, 2021

What City Observatory did this week

Cost of Living and Auto Insurance. We often compare the affordability of different cities with a clear focus on housing prices and rents. This week at City Observatory we are interested in the role that insurance plays in the cost of living across metropolitan areas. Location has a major influence on the amount of money consumers pay to insure their assets. A driver in Detroit can pay thousands of dollars more to insure their car than a driver in Chicago. Why? Here, we explore the notable variation in auto insurance across the largest metropolitan areas in the United States and consider the reasons for these differences.

We find that Detroit, New Orleans, and Miami have the highest annual auto insurance rates, and that there is wide variation across metro areas. Racial demographics and state insurance policy appear to be major factors in this variation. Millions of Americans drive their cars every single day, and a car-dependent society is reliant on car insurance as well. Rates can make a major dent in your wallet every year, and where you live may be the reason why.

Cost of Living and Homeowners Insurance. The impacts of climate change on our built environment have been increasing in recent years. Disaster payouts from reinsurers, the firms that insure the insurance companies, were the fifth highest in history last year. As the world becomes a more volatile place, homeowners insurance rates will be adjusted accordingly. In this piece, we examine how homeowners insurance rates vary across the largest metropolitan areas in the United States and how these rates contribute to the overall cost of living. What we found: Miami, New Orleans, and Oklahoma City had the highest rates, while five California metro areas had the lowest.

The risk of extreme weather in Southern cities appears to push homeowners insurance rates higher. As a result, insurance rates in generally affordable metro areas like Oklahoma City and Memphis raise the overall cost of living, potentially by enough to cancel out lower rents. Homeowners insurance rates vary notably across metro areas. In the future, the price of insuring your home could have a significant influence on the overall cost of living in disaster-prone metropolitan areas as we continue to experience the impacts of climate change. As the risk of extreme weather increases, the role of insurance will grow as well.

Must read

1. Vancouver considers road pricing.  No urban center in the United States has implemented an extensive pricing system that charges vehicles for road usage; Vancouver, BC may be the first. Back in 2018, the regional transportation authority, TransLink, commissioned a mobility pricing study. What it found in Metro Vancouver was “increased travel time reliability, reduced traffic and a potential reduction in transit fees, among others.” Last November, the city began a $1.5 million study of mobility pricing. While a politically challenging solution, mobility pricing appears to be a powerful policy tool for reducing greenhouse gases and encouraging other transportation options. Vancouver is a city with goals to reduce congestion and improve its carbon footprint, hoping to be the greenest city in Canada. If it really wants that title, charging vehicles for road usage may be an impactful solution.

2. A Colorado greenhouse gas budget for the highway system?  Colorado Public Radio reports that the Colorado Department of Transportation (CDOT) has published a rule that would tie future transportation system investments to progress in meeting the state’s adopted greenhouse gas reduction goals.  At first glance, this seems like an important step in the right direction:  if Colorado regions aren’t making progress toward greenhouse gas reduction goals, they should shift their spending toward measures that encourage transit, walking and cycling, and de-emphasize road capacity.  There’s a strong parallel here to the Clean Air Act, which limits highway construction in places that have failed to attain national air quality standards.   In broad brush terms, the regulations do that, but with an important asterisk:  they assume that the state will make heroic progress in the adoption of electric vehicles.  It will be interesting to see what happens if EV adoption falls short of the state’s very optimistic assumptions.  As always with policy measures, and climate policies in particular, the devil is in the details.

3. Interview with Courtney Cobbs:  Issues of equity and sustainability are deeply intertwined in transportation systems across the United States. Courtney Cobbs, co-editor of Streetsblog Chicago, writes about the intersection of these issues, seeking to improve Chicago’s transportation. Here, she answers questions about sustainable transportation in the Windy City. Cobbs explains the lack of focus on the city’s transit system and the need for better bus and bike infrastructure. Asked how she would redesign Chicago’s transportation, she states, “The vast majority of our buses would have priority on the streets. So they would have their own dedicated lane, they would have signal priority where the lights change to benefit the bus.” The advocate expresses her vision for the future and how to incorporate equitable solutions into Chicago’s transportation system. Cobbs gives an eloquent interview about the current state of Chicago’s transportation, the improvements it needs, and what it could become.

New Knowledge

Sprawl v. Mid-Rise v. High Rise: Which is best for climate?  A new paper claiming that mid-rise (3-8 story) buildings are the climate-friendly sweet spot for urban development has gotten a lot of attention in the past week.  The claim is that—in Goldilocks fashion—neither low density nor high density development is optimal for building emissions, but that mid-rise development is “just right.”
The paper, published in npj Urban Sustainability, looks at the lifecycle greenhouse gas emissions associated with constructing and operating buildings, particularly residential buildings.  It constructs a series of models of different urban forms, with data calibrated from actual cities.
The finding comes with a huge asterisk, however:  the study only looks at building-related energy use, not associated transportation energy.  Transportation is a larger source of greenhouse gas emissions than building operations, and moreover, land use patterns and density have a profound impact on transportation use, and therefore on greenhouse gas emissions and energy use.  Sprawling development patterns result in much higher levels of car ownership, more driving, less walking, and less efficient transit.  Leaving out transportation emissions is a serious flaw in this study.
The study points to Paris as a paragon of “mid-rise” development, in contrast to a high-rise city like New York.  Few cities achieve even Paris levels of density or transit availability, which means that the “high rise v. mid-rise” argument is a bit of a red herring.  Perhaps more importantly, as the work of Paris Mayor Anne Hidalgo to reduce car traffic, encourage cycling and promote 15-minute living shows, this higher level of density is a cornerstone of achieving significant reductions in transportation-related greenhouse gases.
Pomponi, F., Saint, R., Arehart, J.H. et al., “Decoupling density from tallness in analysing the life cycle greenhouse gas emissions of cities,” npj Urban Sustain 1, 33 (2021). https://doi.org/10.1038/s42949-021-00034-w

In the News

Writing for Smart Cities Dive, Wayne Ting, CEO of Lime, referenced the work of City Observatory’s Eli Molloy on micromobility in Miami.

“A recent City Observatory study found that Miami’s e-scooter fees charged scooter operators 50 times more than car operators on a per-mile basis to travel on city streets. This is not the way to incentivize sustainable transportation.”

 

The Week Observed, August 13, 2021

What City Observatory did this week

1. Tackling climate change will require electric cars, and a lot less driving.  We’re pleased to publish a guest commentary from California YIMBY’s Matthew Lewis looking at the challenge of addressing the role of transportation in climate change.  Electric vehicles are a step in the right direction, to be sure, but Lewis argues we’ll need to do a lot more to reduce driving if we want to make progress in reducing greenhouse gas emissions on the timetable needed to avert global catastrophe, and also to minimize all the other social, environmental and health consequences that flow from our auto-dependent development patterns and lifestyles.

2. BIB:  Bad Infrastructure Bill.  This week, with much fanfare about bipartisanship, the Senate passed its version of an infrastructure bill.  Transportation for America, NACTO, Streetsblog and others have offered detailed analyses of the bill, aka “BIB” or Bipartisan Infrastructure Bill.  We summarize its contents in the form of four lamentations:  it’s going to squander more than $200 billion, mostly on widening roads that will increase pollution; it has no accountability for actually achieving better results; and it is a massive dose of “asphalt socialism” that makes a mockery of the supposed “user pays” principle.  Plus, we could have done so much better, based on legislation that passed the House.  In the end, “BIB” is a fitting name for the bill, as it’s really the kind of legislation that would chiefly endear itself to the Michelin tire mascot:

 

Must read

1.  America’s sprawling urban growth.  Where has land development been surging across the country? Which cities have been growing fastest? This interactive Washington Post piece showcases the nation’s land development growth from 2001 to 2019. The biggest sprawl in the country has been in the South and Southwest. Maricopa County, which includes Phoenix, has seen the most growth since 2000, with more than 270 square miles of new development. Other Sunbelt cities, like Boise, Las Vegas, and Atlanta, have likewise recorded staggering growth. Housing affordability and availability seem to play a vital role in drawing development and sprawl to these places. Industries moving into the Sun Belt have also contributed. Demographic shifts play a meaningful role as well: as baby boomers enter retirement, the older population grows, sustaining growth in Florida and the Southwest, home to many retirees. This piece offers a compelling look at America’s evolution in the new millennium. Pick where you’re from to examine the growth your hometown has seen in the last two decades.

2.  Road design privileges cars.  In 2012, Jeff Speck published Walkable City, illustrating a range of ideas for reducing car-dependence and increasing walkability in urban areas. Over the last decade, some of these then-progressive and groundbreaking ideas have become increasingly popular. Here, Governing interviews Speck, revisiting topics from his book and discussing the impact of remote work on cities. His answers explore the growth of the urban planning field, its current state, and how it can continue to improve. Polls show that people want more livable cities. In Boston, 81 percent of randomly polled individuals said they wanted to keep the street parklets put in place due to COVID, and 79 percent were in favor of keeping the additional bike lanes. While public opinion is tilting toward walkability, institutions and policies lag: cars still control the roads. Speck explains this lack of progress:

Intellectually, and in terms of platforms you see among progressive politicians, there has been a lot of ground gained since 2012. But it’s still super hard to get these things done, as we’re seeing now with the reversion of some of these COVID-19 amenities back to the way they were before. Typically, a minority of people who speak loudly are pretty effective in overruling majority public opinion in favor of more walkable places.

Speck also discusses how road design can impact driving behavior and the livability of cities, the differences across metro areas, and the significant effects of COVID on the roads. We may be approaching a decade since Walkable City was published, yet its core concepts remain important today.

3.  Houston’s Freeway Fighters push back against TXDOT.  Drama surrounding TxDOT’s I-45 expansion project has been rising as the public comment period draws to a close. The agency’s plan to expand the freeway to mitigate congestion has been widely criticized by opponents arguing that expansion would worsen air quality and displace hundreds of families. Opponents have asked for a smaller, less disruptive plan that would lessen environmental damage and neighborhood dislocation. In response, TxDOT has threatened project funding:  “If the project is not approved, a TxDOT spokesperson said last week that the agency could take the money and use it elsewhere.”  Even though the public comment period ended this week, the fight against TxDOT’s threat and its highway plan will continue in the weeks to come.

New Knowledge

The diffusion of technology.  One of the most important characteristics of today’s knowledge economy is the clustering or agglomeration of activity in tech centers.  The clustering phenomenon is well-documented and is both the goal and bane of economic development efforts (the goal is building one’s own cluster; the bane is the difficulty of dislodging activity from dominant clusters to other places).  A novel study from five economists and financial analysts uses textual analysis of patents, job postings and corporate earnings calls to track the origin and diffusion of new technologies (like cloud computing), from the places they are invented to the rest of the economy.  As the authors explain:
By intersecting the text from these three sources, we are able to trace mentions of disruptive innovations from their original patents to the conversations of executives and investors at large firms, and finally to job postings that advertise positions involved with using or producing these technologies. Using this approach, we are able to determine which innovations or sets of innovations (‘technologies’) affect businesses, trace these back to the locations and firms where they emerged, and track their diffusion through jobs advertised in different regions, occupations, and industries over time.
The study begins by identifying the geographic origins of economically significant breakthrough technologies.  These commercially successful technologies are geographically concentrated in a few clusters (40 percent are in California metro areas, for example), and these commercially successful patents are more geographically concentrated than overall patent activity.
The work shows that technology (and tech jobs) tend to stay concentrated, but do in fact diffuse over time.  One of the more interesting dimensions of the study is its analysis of the connection between technological diffusion and job skill levels.  Over time, as technologies mature, more job listings for low-skilled jobs include references to the technology (that is, over time, technology moves from primarily or exclusively high-skilled work to a wider cross-section of the labor force).
Similarly, the authors are able to track the geographic diffusion of jobs as well.  For any given technology, low-skilled jobs diffuse from tech centers more rapidly than high-skilled jobs. It takes 20 years on average for low-skill jobs associated with a new technology to diffuse, but as much as four decades for higher skilled employment.
Nicholas Bloom, Tarek Hassan, Aakash Kalyani, Josh Lerner, Ahmed Tahoun, How Disruptive Technologies Diffuse, VOX EU, 10 August 2021, https://voxeu.org/article/how-disruptive-technologies-diffuse

In the News

Streetsblog USA republished Matthew Lewis’s City Observatory guest commentary on EVs and reducing greenhouse gases.

The Week Observed, August 6, 2021

What City Observatory did this week

America’s berry best cities.  It’s the height of the summer fruit season and berries are ripening across the country.  Nothing beats a fresh local berry in season. We’ve ranked the nation’s most populous metro areas based on their commercial production of all kinds of berries:  cranberries, raspberries, strawberries, blackberries and blueberries.  The berry-best metros by our measure include Boston, Portland and Tampa.  Here’s the complete list of the 25 metros with the most planted berries.

Berries are more than just a seasonal diversion:  they’re a marker of the kind of distinctive local products and experiences that enrich city living.  We quote both Paul Krugman and Jane Jacobs, who wax poetic about how spatial and seasonal variations in experiences influence our well-being and the wealth of cities.

Must read

1. Are battery buses ready for Primetime?  Nearly all of the nation’s transit buses run on diesel fuel.  Ultimately, the climate-friendly solution for transit will be to electrify, and a few transit systems have been rolling out battery-electric buses, with mixed results.  Are these just temporary teething problems of a new technology, or symptoms of long-term weaknesses?  Vice’s Aaron Gordon looks at the experience of Foothill Transit in Los Angeles, which has been operating battery buses for more than a decade.  The big question going forward is whether the problems that have cropped up can be fixed soon enough to make a difference in the climate crisis.

2. Paris as a paragon of housing affordability?  Sightline’s Alan Durning is currently writing a series of articles called “Winning Housing Abundance.” Paris takes the stage in his most recent piece. Why? Its remarkable fight out of a housing shortage. Before 2008, France was a lot like other Western countries: its big cities were in residential lockdown, and stark social divisions could be seen across neighborhoods. After 2008, something different happened in Paris: they built houses, and lots of them. In this article, Durning explores how France did such a tremendous job growing the housing stock in Greater Paris and the lessons we can learn from its policies. Strong national leadership, effective rental support and social welfare, and quasi-public developers helped the country succeed in supplying homes. The housing transformation was started by strong, inspired political leaders. Every neighborhood in Paris was pushed to do its part, and they delivered. As a result, substantial growth occurred across the entire metro area. Durning leaves readers with one lasting question: “If Paris could do it, why not we?”

3. Malls to apartments?  It’s happening in Salem, Oregon.  As we all know, between the pandemic and the growth of e-commerce, it’s been a tough time for retail. Many chain stores and malls have gone dark over the past few years, leaving unused property and parking lots.  At the same time, the housing market has gotten tighter.  The obvious opportunity would seem to be converting vacant stores into apartments.  That’s exactly what seems to be slated for downtown Salem, Oregon, where a former Nordstrom store is going to be re-developed into 160 apartments.  If it makes sense in a mid-sized city, perhaps there are many more such opportunities in the nation’s larger metros?

New Knowledge

Evaluating equity and effectiveness of climate change strategies.  Many cities around the country are pursuing climate change strategies, and increasingly, there are calls for such strategies to be implemented in an “equitable” fashion.  The trouble is that equity, like beauty, tends to be subjectively defined, resting in the eye of the beholder.  A new paper from the Harvard Kennedy School of Government offers some practical advice on how to evaluate investment alternatives for both their climate and equity impacts.

In 2020, Denver voters approved a sales tax increase dedicating about $30 to $40 million annually to fighting climate change, and directing that the money be spent to pursue equity in the community. (We’ll set aside for a moment a question the authors sidestep: whether a sales tax (generally regressive) is a sensible way to fund a climate initiative, especially compared to pricing carbon emissions, parking, or driving.)

As the authors point out, individual investments tend to vary in their effectiveness and equity components.  While there are some investments that admirably meet both objectives, more commonly there’s a tradeoff between equity and efficiency.  They depict this tradeoff as follows:

Perhaps the paper’s most salient bit of advice is that the evaluation should be done more on a portfolio than on an investment-by-investment basis.  Not every single project will equally advance equity and climate goals.  The objective should be to construct a portfolio that maximizes both, in all likelihood with a diverse array of options.

When building such a policy portfolio, it is important to note that not every program has to be “win-win” in terms of equity and GHG impact, but rather CASR should consider the aggregate climate and equity impact from an entire suite of policy options.
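The report’s portfolio logic can be sketched in a few lines of code. Everything here (the projects, their costs, their climate and equity scores, and the equal weighting of the two goals) is a hypothetical illustration of ours, not data from the paper; the point is that the budget-constrained mix that best serves both goals can include a project that scores poorly on one dimension.

```python
# A minimal sketch of portfolio-level evaluation: rather than require every
# project to be "win-win," pick the feasible mix of projects that maximizes
# the combined climate + equity score within a budget.
# All projects, costs, and scores below are hypothetical.
from itertools import combinations

# (name, cost in $M, tonnes CO2 avoided, equity score 0-10)
projects = [
    ("transit passes", 10, 200, 9),
    ("building retrofits", 15, 900, 4),
    ("EV chargers", 12, 500, 2),
    ("weatherization", 8, 300, 8),
]
BUDGET = 30  # $M, assumed

def portfolio_score(portfolio):
    # Equal weighting of (scaled) climate benefit and equity benefit;
    # the scaling and weights are assumptions for illustration.
    climate = sum(p[2] for p in portfolio) / 100
    equity = sum(p[3] for p in portfolio)
    return climate + equity

feasible = (c for r in range(1, len(projects) + 1)
            for c in combinations(projects, r)
            if sum(p[1] for p in c) <= BUDGET)
best = max(feasible, key=portfolio_score)
print([p[0] for p in best])
```

With these made-up inputs, the best portfolio includes the low-equity EV charger project alongside high-equity ones: no single project has to do everything, so long as the suite as a whole advances both goals.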

Integrating equity considerations into climate planning has produced many bold pronouncements, but few carefully thought-out methodologies.  This one is worth a close look.

Rani Murali & Emily Kent, Equitable and effective climate change mitigation:  Policy Analysis in the City and County of Denver, Colorado, Harvard Kennedy School of Government, April 2021.

In the News

Investment website Seeking Alpha quoted City Observatory’s Joe Cortright on the likely effects of increased gas prices on driving patterns and commercial shopping behavior.

“And when inflation pushes gas prices higher, as it did in 1980 and 1981, there is even more incentive to choose the closest option when grocery shopping. According to an analysis by economist Joe Cortright, higher gas prices result in fewer miles driven. Perhaps you’ve noticed gas prices climbing recently. Consider, too, that Walmart has more stores than any other grocery chain in the United States, making it even more likely that it is the discount retailer that is closest to you.”

The Week Observed, July 23, 2021

What City Observatory did this week

Selling Oregon into highway bondage.  Oregon is moving ahead with plans to issue hundreds of millions—and ultimately billions of dollars of debt to widen Portland-area freeways.  And it will send the bill to future generations, and perversely, commit the state to ever increasing levels of traffic in order to satisfy bond-holders.  Just the right strategy for dealing with a manifest climate crisis, right?

The single largest source of greenhouse gas emissions in the state is transportation, and those emissions are growing, up 1,000 pounds per person in the Portland metro area in the past five years.


And the Oregon Department of Transportation’s plan for responding:  building even more highways in the Portland metro area; and not only that, going deeply into debt to pay for those highways—burdening future generations with both the financial and environmental costs of more driving.  In theory, the bonds will be repaid by tolls, but the agency has no history of forecasting toll revenue, and routinely overspends its budgets on megaprojects. And tolling essentially commits the state to increasing traffic levels in order to pay back bondholders.  It’s the height of intergenerational climate warfare, and it’s packaged as business as usual, technocratic trivia.

Must read

1. Returning to the office:  Young workers value in-person work.  There’s a lot of glib prognostication that the advent of Zoom will lead to remote work displacing office work, and undercutting city economies.  But the assumption that everyone does well working remotely misses the fact that in-person workplace interactions are more powerful and important for some functions (team-building, socialization, feedback) and some workers (especially younger ones).  While seasoned veterans can take their accumulated knowledge and networks and work remotely, young workers are still learning, forming complex ties, and forging an identity.  As Erica Pandey writes at Axios:

Transitioning to remote work is far easier for veteran employees who have already developed social capital in the workplace and know how a company operates. Freshly minted members of the workforce stand to miss out on those valuable skills and opportunities if they can’t come back to the office. . . . A whopping 40% of college students and recent graduates prefer fully in-person work, according to a new poll by Generation Lab.

The survey says three quarters of young workers feel they miss out on office community by working remotely, and two in five would lose out on mentoring.  Two-thirds of young workers say they want in-person feedback from their managers, rather than receiving a written report or chatting over Zoom.  This kind of face-to-face interaction is what has propelled the growth of city economies, and been especially important to young people, for decades.  We shouldn’t expect it to disappear now.

2. Would Texas be better off spending $25 billion on something other than more highways?  Texas has recently begun to move ahead on plans to expand highway I-35 through downtown Austin. The stated goal is to provide more efficient and safer travel for the hundreds of thousands of drivers who use the highway every day. But as we’ve explored numerous times at City Observatory, increasing the number of lanes does not lead to more efficient travel. So, what if the state of Texas did not expand its highways? Megan Kimble at the Texas Observer takes it a step further. She ponders:

“What if, instead of building our aging roads back wider and higher—doubling down on the displacement that began in the 1950s and the climate consequences unfolding now—we removed those highways altogether? What if we restored the scarred, paved-over land they inhabit and gave it back to the communities it was taken from?”

Kimble writes a thoughtful and informative piece on the state of Texas highways and the possibility for transcendent change. She interviews planners and activists within the state, exploring what the potential for the highway land could be. This “What if” question is not just imagination. It is an idea for real, substantial change. Expanding the roadway will only bring more congestion. Taking it away could create more equitable and sustainable communities for all, particularly those harshly impacted by the highway’s construction.

3.  More reasons not to believe the Urban Mobility Report.  If you read City Observatory in the past few weeks, you likely saw our critique of the Texas Transportation Institute’s newest “Urban Mobility Report.” This flawed and biased report failed to provide a critical analysis of transportation policy. Instead, it presents a road building propaganda piece that neglects the necessary fundamentals of a qualified report. Writing for Planetizen, Todd Litman provides his own critique of the UMR. He explains that the paradigm in transportation planning has shifted. The new approach has become comprehensive, focusing on multi-modal transportation, social equity, environmental quality, and more. Does the UMR take this new paradigm into consideration? Not at all. Litman criticizes the UMR’s sole focus on congestion and automotive travel. He finds the UMR’s analysis “neither comprehensive nor objective.” He writes that the report “tends to overvalue urban roadway expansions and undervalue other congestion reduction strategies that provide more co-benefits, such as the congestion reduction strategy that also increases affordability, improves public health and safety, or reduces pollution emissions.” Litman provides a thorough and needed critique of the UMR and explores more appropriate solutions to these important transportation issues.

New Knowledge??

Our “New Knowledge” feature this week has question marks appended because, as we’ll explain, we’re not at all sure this particular article gets things right.  But because it purports to shed light on the very salient question of how upzoning city neighborhoods affects population change, we think it’s worth a closer look, and some skepticism.

Columbia University’s Jenna Davis has published a journal article, and a web commentary summarizing this work.  The paper looks at demographic change in New York City census tracts between 2000 and 2010, and reports finding a correlation between city-initiated zoning changes and an increased probability that the share of non-Hispanic white population increases.  Tracts that had city upzoning were about 25 percent more likely to experience an increase in white population share than tracts that weren’t upzoned.  It’s pretty clear that many will read this study as somehow proving that upzoning makes neighborhoods whiter.  Davis says her findings imply that “upzonings might accelerate, rather than temper, gentrification pressures.”

We think there are several problems with drawing this conclusion.  They include timing, threshold effects, other demographic factors, causation, and a poorly described dependent variable.

The dependent variable in this analysis is an increased share of non-Hispanic white population.  The paper doesn’t say how many Census Tracts experienced an increased share of white population in New York between 2000 and 2010.  Demographic change in most neighborhoods is slow and small; relatively few neighborhoods totally change their demographic composition. To put this in perspective, over the 40 years between 1970 and 2010, just 10 of 443 majority Black census tracts in the New York metro area transitioned to majority white composition (about 2 percent).  (Data from the Brown University Longitudinal Tract Database).

The fact that there’s a correlation between upzoning and the probability of a white share increase doesn’t say anything about the direction of causality:  It is possible, even probable, that the city chose neighborhoods to upzone where the white population share was increasing.  The study also doesn’t apparently have any minimum threshold for the increase in the white share of the population:  even a 0.1 percent increase in share might qualify.  Also, it’s worth noting that even if the share of the white population increases, that doesn’t necessarily imply that the number of non-white residents of a neighborhood decreases; new housing units could allow both more white and more non-white residents in a neighborhood, even if the shares of the two groups change.
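To make the threshold problem concrete, here’s a tiny illustration with made-up numbers. Under an any-increase definition of the outcome (which is what the paper appears to use), a trivial shift and a dramatic shift are coded identically:

```python
# The paper's outcome is binary: did the non-Hispanic white share rise at all?
# This sketch (hypothetical shares, not data from the study) shows why that
# coding can mask very different magnitudes of neighborhood change.

def white_share_increased(share_2000: float, share_2010: float) -> bool:
    """Binary outcome with no minimum threshold: any increase counts."""
    return share_2010 > share_2000

# Tract A: white share drifts from 40.0% to 40.1% -- a 0.1-point change.
tract_a = white_share_increased(40.0, 40.1)
# Tract B: white share jumps from 40.0% to 55.0% -- a 15-point change.
tract_b = white_share_increased(40.0, 55.0)

print(tract_a, tract_b)  # both coded True, despite very different changes
```

Both tracts land in the same “white share increased” category, so the reported 25 percent correlation tells us nothing about how large the associated demographic shifts actually were.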

That possibility is heightened by the timing of upzones.  According to the paper, the average upzone occurred just 18 months before the 2010 Census.  That’s not enough time to actually build additional housing, and this suggests that the demographic change, if any, occurred mostly before the upzoning.  Notice that the paper doesn’t look to see whether, or how many additional housing units were built; it’s implicitly arguing that the rezoning itself, even without construction, caused the observed demographic change.

It’s also worth noting that other variables included in the estimation had a much bigger impact on neighborhood change.  In particular, there was a strong correlation between an increased share of the white population and the share of the neighborhood that was either married or had children.  One key component of population change is births, and if a particular neighborhood had a disproportionate share of white residents in their child-bearing years relative to the neighborhood’s overall population, you would expect the demographic composition of the neighborhood to shift based on births of white children—a factor unrelated to zoning.

Jenna Davis, “How do upzonings impact neighborhood demographic change? Examining the link between land use policy and gentrification in New York City.”  Land Use Policy 103 (2021).

Jenna Davis, “The double-edged sword of upzoning?”  Brookings.edu.

In the News

Streetsblog quoted City Observatory’s Joe Cortright in its article about expensive plans to rebuild a failed pedestrian bridge over DC’s Anacostia Freeway.

“The way our infrastructure decisions are presented is that we’re always doing piecemeal, remedial patches to the existing flawed system. DC will spend it looks like $25 million to rebuild back something that is at best a kind of band-aid … Once you’ve spent $25 million ‘fixing’ this pedestrian bridge, you’ve essentially committed yourself to the indefinite future of the freeway it crosses.”

The Week Observed, July 9, 2021

What City Observatory did this week

1. Miami’s double standard for charging road users.  The City of Miami is hoping to make its streets a safer place for bikes and scooters by building protected lanes along three miles of the city’s downtown. The city plans to pay for this infrastructure by taxing each registered scooter daily. This fee got us thinking: how does the price we charge scooters compare to the price we charge cars? What we found: e-scooters are paying four to 50 times as much to use the public roads as cars. Automobiles take up the most space, create unsafe environments for other users, and damage the roads. Scooters help reduce congestion, fit on a sidewalk, and provide environmentally friendly ways to travel through a city. Yet, scooters pay significantly more to use the road. If we want greener, safer, and more efficient transportation systems, we ought to reconsider how we charge road users.
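For readers who want to see the logic of the per-mile comparison, here’s a minimal sketch. The fee levels, daily scooter mileage, and annual driving figures below are hypothetical placeholders, not Miami’s actual rates; the point is the method of converting flat fees into a per-mile price so the two modes can be compared on equal terms.

```python
# Convert flat fees into an average fee per mile of road use, so a daily
# per-scooter charge can be compared with a car's annual fees and gas taxes.
# All dollar amounts and mileages below are hypothetical illustrations.

def fee_per_mile(total_fees: float, miles_traveled: float) -> float:
    """Average fee paid per mile of road use over the same period."""
    return total_fees / miles_traveled

# A scooter paying an assumed $1.00 daily fee, ridden 5 miles per day:
scooter = fee_per_mile(total_fees=1.00, miles_traveled=5)

# A car paying an assumed $50/year registration plus ~$0.02/mile in gas
# taxes, driven 10,000 miles per year:
car = fee_per_mile(total_fees=50 + 0.02 * 10_000, miles_traveled=10_000)

print(f"scooter: ${scooter:.3f}/mile, car: ${car:.3f}/mile")
print(f"ratio: {scooter / car:.0f}x")
```

With these illustrative inputs, the scooter pays eight times as much per mile; the four-to-50-times range in our analysis comes from plugging Miami’s actual fee schedules and observed mileage into the same arithmetic.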

2. It works for bags, bottles and cans, why not try it for carbon?  A newly enacted law in Denver requires grocery stores to charge customers a dime for each grocery bag they take. Echoing similar bag fees in other places, this provides a gentle economic nudge toward more ecologically sustainable behavior. And the evidence is that it works: in London, single-use bags are down 90 percent. We ought to be applying the same straightforward logic to carbon pollution; ironically, the bag fee, on a weight basis, is 20 times higher than the charge most experts recommend for carbon pricing.
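The weight-based comparison is easy to reproduce. In this sketch, the bag weight and the benchmark carbon price are assumptions we’ve chosen for illustration; only the dime-per-bag fee comes from the Denver ordinance.

```python
# Sketch of the per-weight comparison behind the bag fee / carbon pricing
# analogy: express the per-bag fee as a price per tonne of material, then
# set it beside a per-tonne carbon price. Bag weight and carbon price are
# illustrative assumptions, not figures from the article.

BAG_FEE_USD = 0.10                 # Denver's dime-per-bag charge
BAG_WEIGHT_KG = 0.005              # assumed ~5 grams per single-use bag
CARBON_PRICE_USD_PER_TONNE = 100   # an assumed benchmark carbon price

# Implied price per metric tonne (1,000 kg) of bag material:
bag_price_per_tonne = BAG_FEE_USD / BAG_WEIGHT_KG * 1_000

print(f"bag fee: ${bag_price_per_tonne:,.0f} per tonne of plastic")
print(f"carbon price: ${CARBON_PRICE_USD_PER_TONNE} per tonne of CO2")
```

At an assumed five grams per bag, a dime works out to $20,000 per tonne of plastic, far above typical per-tonne carbon prices: we happily accept a stiff per-weight charge on bags while balking at a much gentler one on carbon.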

Must read

1. Giving up on the “user pays” principle for cars and driving.  The nation needs investment in roads, bridges, and transit infrastructure. Gasoline taxes, a “user pays” system, have long been the method by which the United States has funded these needs, but the tax rate has gone untouched for nearly 30 years. As a result of inflation and increasingly fuel-efficient cars, the Highway Trust Fund has struggled to get the same impact from the gas tax. The House’s new surface transportation bill makes no mention of a gas tax increase. Instead, the Highway Trust Fund is receiving cash from deficit financing: the bill is financed by a one-time $148 billion transfer from the U.S. Treasury to the Highway Trust Fund. What this amounts to is a massive subsidy to more driving, more pollution, and more greenhouse gas emissions.  If we really had a “user pays” system, as with a carbon tax or a congestion fee, we’d have a lot less traffic and congestion.

2. Which downtowns are most dependent on a return to the office?  Downtowns are much like investment portfolios: the more diverse, the more resilient they will be. The New York Times’ Emily Badger and Quoctrung Bui examine the state of downtown business districts across the nation. According to their analysis, some cities have downtown business districts with office space taking up 70 to 80 percent of all real estate. These office-dominated downtowns are susceptible to shocks, recessions, and especially global pandemics. Kourtney Garnett, the head of Downtown Dallas, Inc., offers a solution to strengthen these city centers: “Build more residences, diversify the economy, give people reasons to linger past 5 pm.”  There’s a surprisingly large variation across metro areas in the amount of downtown space devoted to offices, retail, hotels, residential and other uses, as this NYT chart illustrates.

3. Let’s build mixed income housing in high income, high amenity, high opportunity neighborhoods.  The United States’ housing supply shortage is a dire problem that continues to grow. The struggle to find affordable housing with desired amenities has pushed middle-income households further and further from city centers, increasing sprawl and commute times. Suburban growth also tends to fuel auto-dependence and increase households’ total vehicle miles of travel, key contributors to climate change. The housing shortage and the climate impact of increased driving present two huge problems. In this article, Zack Subin suggests a combined solution to both. He writes that “we must reduce vehicle miles traveled by investing in inclusive, complete, compact, transit-oriented communities.” Subin explores the potential for mixed-income housing to create low-carbon-footprint neighborhoods and increase the housing supply. If we create housing in the right neighborhoods, we could see significant positive impacts on the nation’s pollution emissions and housing supply. Climate advocates, come join the housing movement so we can meet our pollution and equity goals together.

New Knowledge

The food landscape of rural America.  There’s been a lot of concern raised about food access in urban areas, in particular the idea that food deserts contribute to poor nutrition (the evidence for this is weak).  There’s little dispute, however, that physical access to food stores is lowest in the nation’s rural counties, where there are few stores and people routinely have to travel great distances.  A new report from the USDA documents the changes in the rural food retailing landscape.

The report looks at trends over the 25 year period from 1990 through 2015.  Through most of that time the total number of food retailers in the US was increasing, but since the Great Recession, the most common types of food retailers—grocery stores, convenience stores and supercenters—have declined in number.  In contrast, the number of Dollar stores has increased.

The report focuses on trends in the nation’s non-metropolitan counties.  Overall, most of the food retailers in these counties are single-establishment firms (think small, often family-owned traditional grocery stores).  While more numerous than other types, these stores are much smaller (grossing about $800,000 per year, compared to $4-10 million per year for a chain supermarket), and their numbers are dwindling.  Weak rural economies, competition from national chains, and more recently Dollar stores are cutting into the financial viability of these businesses.

The net effect of these changes was to reduce food availability and choice in many non-metro areas:

Among urban nonmetro counties, the percentage of counties with fewer than 8 food retailers per 10,000 people increased from 21 percent to 33 percent. The number of food retailers per capita decreased by 16 percent for these counties. Among rural nonmetro counties, the percentage of counties with fewer than 8 food retailers per 10,000 people increased from 11 percent to 27 percent. The number of food retailers per capita decreased by 19 percent for these counties. The percentage of rural nonmetro counties with no food retailers increased from 1 percent to 3 percent.

Overall, the median number of grocery stores per capita in non-metro areas has decreased about 40 percent since 1990, which means that rural residents are, on average, even further from food than they were 25 years ago.

Stevens, Alexander, Clare Cho, Metin Çakır, Xiangwen Kong, and Michael Boland (June 2021). The Food Retail Landscape Across Rural America, EIB-223, U.S. Department of Agriculture, Economic Research Service.
https://www.ers.usda.gov/publications/pub-details/?pubid=101355

In the News

The Overhead Wire named our City Observatory analysis of the latest iteration of the deeply flawed Texas Transportation Institute Urban Mobility Report—“It’s back and it’s dumber than ever“—its most-read story on July 6.

The Week Observed, July 2, 2021

What City Observatory did this week

1. The Texas Transportation Institute is back, and it’s still wrong about traffic congestion.  Every year or so, a group of researchers at Texas A&M University produces a report purporting to calculate the cost of congestion in US metro areas. Their flawed and biased methodology has been discredited multiple times, but the researchers continue to make unsupportable claims about hours allegedly lost to congestion.  This year’s edition is a particularly glaring missed opportunity to make sense of the nature of our transportation problems, because congestion declined virtually everywhere, by unprecedented amounts.

If these researchers were really interested in figuring out how to reduce congestion, they would use this experience to craft policies that mimic (minus the downsides) the traffic-reducing effects of the pandemic.  The experience of the pandemic shows that reducing traffic demand, which can be accomplished by road pricing, would be more effective and less costly than building new roads, but you’ll find no mention of “pricing” in the latest TTI report.

2. What’s behind the big run-up in home prices? Lower interest rates.  In the past year, average home prices in the US have risen more than 14 percent according to the Case-Shiller index.  There’s been widespread frenzy in the housing market, with bidding wars and a paucity of homes for sale. The decline in home mortgage interest rates is one of the key factors behind the increase in home prices.  Mortgage rates have fallen roughly a full percentage point, from about 3.7 percent in early 2020 to 2.7 percent in early 2021.

Lower mortgage rates translate directly into the ability to bid a higher purchase price for a home.  The decline in rates enabled a household that could afford an $1,800 monthly mortgage payment to get a $340,000 mortgage, rather than just $300,000.  That decline in interest rates is an important, but likely one-off, stimulus to home prices.
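The arithmetic behind this point can be sketched with the standard fixed-rate amortization formula. The $1,800 payment and the 3.7-to-2.7 percent rate drop come from the discussion above; the 30-year term is our assumption (the $300,000 and $340,000 figures evidently reflect different loan terms, but the roughly 13 percent boost in borrowing power comes out the same):

```python
def affordable_principal(monthly_payment, annual_rate, years=30):
    """Loan principal supportable by a fixed monthly payment,
    using the standard amortization (annuity) formula."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of payments
    return monthly_payment * (1 - (1 + r) ** -n) / r

# Same $1,800/month budget, before and after the rate decline:
at_3_7 = affordable_principal(1800, 0.037)  # early-2020 rate
at_2_7 = affordable_principal(1800, 0.027)  # early-2021 rate
print(f"Borrowing power rose about {at_2_7 / at_3_7 - 1:.0%}")
```

The same formula shows why the stimulus is one-off: the boost happens only while rates are falling, not once they settle at a new level.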

Must read

1. Undercutting urbanism and transit with downzoning in Philadelphia.  Here, in a nutshell, is America’s urban problem.  As Plan Philly explains, the Philadelphia City Council just voted to impose a height limit of 38 feet on the city’s Girard Avenue, a street with a major trolley line that connects to two of the city’s big subways.

Dense housing and transit work best together.

By limiting the size of homes that can be built on this street to three-story rowhouses, the city simultaneously worsens housing affordability, reduces transit ridership, and likely contributes to neighborhood displacement.  We know that there’s a shortage of walkable, transit-served locations, and transit ridership is directly proportional to neighborhood density.  As Daniel Trubman pointed out, limiting density along Girard Avenue doesn’t reduce demand for housing. Instead, it simply displaces the demand to other parts of the neighborhood, likely driving up home prices and increasing displacement:

The strategy in every other Philadelphia neighborhood has been to upzone the major avenues. By downzoning everything, there will just be more impetus to buy and flip the individual houses.

The effort to protect desirable, transit-served neighborhoods from change with building restrictions, while politically popular, worsens housing affordability and undercuts efforts to promote transit ridership.

2. Remote work won’t save the “heartland.”  There are widespread prognostications that with the advent of Zoom, there’s just no reason why businesses or workers who’ve mastered remote work won’t move away from expensive cities.  If you can work from anywhere, why pay high rents?  Mark Muro and his colleagues at the Brookings Institution throw cold water on this thesis.  They mine US Postal Service change-of-address data to examine the regional patterns of migration in the wake of the Covid-driven work-at-home experience.  They find that while there has been some movement outward from the most expensive superstar cities (New York, San Francisco, Seattle), precious little of that migration has produced population gains in the so-called “heartland” (which they define as the states not on the Pacific or Atlantic coasts, excluding Texas and the Rocky Mountain states).

The reason:  Most moves are local, either within the same metro area or state, or within the same region.  Only a tiny fraction of moves from either coast are to the interior of the country, and Rocky Mountain states are doing better than the “heartland.” In addition, the Brookings analysts point out, remote work is declining rapidly as the pandemic ebbs.

3. A vision for sustainable, post-pandemic cities.  San Francisco-based SPUR is a world leader in thinking about urban policies.  This month, they have a short, timely and optimistic essay reminding us that done right, cities are the solution to many of our social and environmental challenges.  “Greetings from 2070” tells us the California of the future became more just, sustainable, and livable by building more apartments, tearing out urban freeways, and systematically ending the use of fossil fuels.  Denser, more affordable urban living with less driving, easier walking and cycling and better transit would enable us to build the kind of neighborhoods currently in short supply.

In the face of irrefutable daily evidence of climate chaos and in the wake of a devastating pandemic, projecting this kind of optimistic vision of a possible future is one key to moving us collectively in the right direction.  We certainly will find it difficult to make any progress if we simply rely on the indefinite continuation of the trends and policies that got us to this situation.

New Knowledge

Cities after Covid.  The widespread roll-out of Covid-19 vaccines is quickly restoring optimism that there will be life after this pandemic. But the question remains: what effects of the past year will be transitory and reversed, and what will become the proverbial “New Normal?”  Three prominent urban geographers tackle that important question in a new essay in Urban Studies.

According to Richard Florida, Andres Rodríguez-Pose and Michael Storper, the short answer is:  We don’t know.  The article provides a useful framework for thinking about Covid and cities, and sorts glib (and largely wrong) claims about the effects of the pandemic from the more interesting and challenging questions.  Helpfully, the authors dismiss the widely circulated claim, common early on, that associated the spread of Covid with cities and density.  While cities were hit harder early, this really reflected their global connectedness, rather than any intrinsic vulnerability to the disease created by density.  Indeed, during the pandemic’s more severe later waves, infections and deaths have been more prevalent in rural areas.

They have a very thoughtful discussion of the limits of telework, and come away skeptical that it will fundamentally undercut the forces concentrating economic activity in cities, especially superstars.  They note that while remote work is a reasonably good substitute for one-on-one meetings on projects already underway with colleagues that one already knows, the situation is very different for more complex and innovative tasks, and for building the networks and social capital that enable organizations to succeed.

There is little evidence that telework can successfully establish networks, socialise new participants and permit the evaluation of partners in creative work that depends on high levels of tacit knowledge and partner evaluation ‘between the lines’. . . . The use of telework will certainly accelerate with the forced experiment of the pandemic. But the substitution effect will reach its limits . . . as time goes on, as existing activities wear down their stock of acquired networks and as the need to move on to new projects with new collaborators increases.

For some workers, work will likely follow a hybrid model, a mix of office and remote work. Looking forward, a key question is how pandemic-related health concerns and greater personal and business experience with remote working will influence the location of population and economic activity.

Florida, R., Rodríguez-Pose, A., & Storper, M. (2021). Cities in a post-COVID world. Urban Studies. https://doi.org/10.1177/00420980211018072

In the News

Next City highlighted City Observatory’s analysis of the I-5 Rose Quarter freeway widening project in its article, “Critic, Contractor Call for Shrinking of Freeway Widening Project in Oregon”

The Week Observed, June 25, 2021

What City Observatory did this week

1. Cars kill city neighborhoods.  Across the nation, America’s cities have been remade to accommodate the automobile.  Freeways have been widened through city neighborhoods, demolishing homes and businesses; more than that, the sprawling, car-dependent transportation system now firmly rooted across the nation is simply toxic to urban neighborhoods.  A recent study from the Federal Reserve Bank of Philadelphia shows that across metropolitan areas, population growth and decline is directly related to proximity to urban freeways:  urban neighborhoods close to freeways decline; suburban neighborhoods near freeways thrive.

In short, freeways are toxic to urban neighborhoods but a tonic to suburban sprawl.  In close-in urban neighborhoods, freeway construction was associated with an 80 to 100% decline in population within one mile of a freeway.  In and near city centers, the closer your neighborhood was to a freeway, the larger its population decline.  The reverse is true in the suburbs, where population growth was concentrated in those areas closest to freeways.  The evidence across six decades of freeway building shows us that freeways kill cities.

2.  The Bum’s Rush. The Oregon Department of Transportation’s $800 million I-5 Rose Quarter project has recently taken a major turn. Just last fall, ODOT’s Director of Urban Mobility, Brendan Finn, said the project was only 15 percent designed and that there was “almost an amazing opportunity here to connect neighborhoods.” Now ODOT claims it is simply too late to do anything differently from the original plan, a complete reversal of the outlook it expressed last fall. The agency is disregarding both community support and its own consultants’ support for buildable covers, arguing that buildable caps would be too costly because they would require acquiring more land, rather than narrowing the excessively oversized roadway it intends to build. The shift from a work-in-progress plan to an unchangeable design reveals ODOT’s true priorities: it is not fighting for “restorative justice” for the Albina neighborhood, nor pursuing the economically best design. ODOT is only interested in building a wider freeway.

Must read

1.  The case against freeway widening:  Milwaukee edition.  Around the country, urbanists, social justice activists and climate advocates are challenging plans to squander billions of dollars widening urban freeways.  We’ve known for decades that wider freeways do not reduce traffic congestion; they simply increase traffic, air pollution and sprawl.  A battle rages in Milwaukee, where a coalition of local groups is fighting state plans to spend upwards of a billion dollars widening I-94.

Writing at the local blog, The Recombobulation Area, Dan Shafer describes the multi-faceted community alliance that’s pushing back.  Their efforts are a template for freeway fighters across the nation.

2.   If it’s really a climate emergency, maybe we should start charging for parking.  Vancouver city planners are showing that they’re willing to take some serious steps toward fighting climate change.  They are proposing a $1,000 annual fee for residents parking high-polluting vehicles on city streets.  The fee would be zero for non-polluting vehicles, like electric cars, and graduated based on vehicle emissions.  While this is definitely a second-best approach compared to a strong carbon tax or a congestion fee, such a measure sends a tangible economic signal to the region’s residents about the environmental consequences of high-polluting vehicles.

While the headline number of $1,000 sounds like a lot, it works out to about $2.75 a day (about $2.20 in US dollars), which is less than a single one-way bus ticket in most US cities.  The measure is far from perfect, however: the fee only applies to newly purchased vehicles (model year 2022 or later). Grandfathering older vehicles may seem politically wise, but it incentivizes people to keep their old, dirty cars longer.  If climate change is really a crisis, maybe we should charge people more for polluting than for taking transit.
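As a quick check on those per-day figures (the 0.80 CAD-to-USD exchange rate is our assumption, roughly the mid-2021 level):

```python
annual_fee_cad = 1_000                 # proposed Vancouver surcharge, CAD per year
daily_fee_cad = annual_fee_cad / 365   # spread over a full year of parking
usd_per_cad = 0.80                     # assumed exchange rate (roughly mid-2021)
daily_fee_usd = daily_fee_cad * usd_per_cad

# About CAD $2.74/day, or roughly USD $2.19/day
print(f"CAD {daily_fee_cad:.2f}/day ≈ USD {daily_fee_usd:.2f}/day")
```

Framed as a daily charge, the fee is comparable to a transit fare, which is exactly the comparison the policy invites.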

3.  Gentrification is not the real problem. Writing at Shelterforce, Brett McMillan argues that “we have a major problem with how we talk about gentrification in this country.” In this piece, he explains the flawed theory behind gentrification and the larger problems the term fails to capture. Neil Smith’s theory of gentrification was a hypothesis introduced in the 1970s to explain the demographic pattern of people moving from the suburbs back into the city. Numerous studies have found this theory to be insufficient. For example, scholars Lance Freeman and Tiacheng Cai found that the white “invasion” of areas with predominantly Black populations (50 percent or more of the population) has been a relatively infrequent phenomenon since 1980, despite a slight recent uptick.  McMillan is particularly critical of the failure of the gentrification literature to clearly define or document gentrification-driven displacement:

In 2020, a paper in the high-ranking academic journal Urban Studies criticized statistical analyses that showed limited displacement because their “progress in identifying [displacement’s] extent has been remarkably slow,” meaning, as I take it, that we ought to reverse the scientific method. Which is to say, rather than forming a hypothesis, rigorously testing it, and adjusting it in light of studies’ results in order to better understand problems motivating the analysis, such claims suggest we ought to make results conform to a pre-determined outcome.

McMillan supports his argument with links to a number of critical studies. He pushes to change the framework of the discussion to address broader, structural issues. Neighborhood-level inequalities, housing shortages, and the growth of concentrated poverty, it turns out, are more serious issues masked by a too-frequent focus on gentrification. McMillan states that wealthier people moving into, and driving up costs in, particular urban neighborhoods is merely a “symptom” of urban inequality, not its cause. Solutions that address systemic roots, like increasing housing supply and eliminating exclusionary zoning, are necessary for greater housing equality. Centering the conversation on the term “gentrification” fails to take into consideration the structural problems, and the solutions which could alleviate the adverse effects of inequality.

New Knowledge

Inclusionary Zoning: Not a Cure for Exclusionary Zoning. A new research review from Bryan Graveline examines inclusionary zoning’s effect on housing affordability and residential segregation. It finds that inclusionary zoning is a poor tool to make progress on either of these issues. 

Both housing unaffordability and residential segregation are caused largely by exclusionary zoning, like single-family zoning and minimum lot sizes. However, despite its name, inclusionary zoning does not undo exclusionary zoning. Rather, it asks developers to set aside a percentage of units in each new development to be affordable to low-income renters.

While this policy may sound agreeable on first blush, the report finds that inclusionary zoning does not meaningfully address either of the problems it tries to solve and can actually worsen the housing affordability crisis. Graveline recommends that jurisdictions hoping to confront issues of housing affordability and residential segregation forgo inclusionary zoning and instead focus on repealing exclusionary zoning.

The report evaluates inclusionary zoning policies across four criteria:

  1. Effect on the housing market. Inclusionary zoning increases the price of new market-rate housing and decreases its supply. And because would-be tenants of new buildings live in older buildings when new construction is constrained, inclusionary zoning affects all segments of the housing market.
  2. Production of below-market rate housing. Most inclusionary zoning programs create fewer than 100 affordable units per year. This is a drop in the bucket compared to the need for affordable housing in most cities.
  3. Effect on residential segregation. This topic is understudied in the current literature, so it’s difficult to draw definitive conclusions. However, inclusionary zoning is only as effective at fighting segregation as the number of units it produces. Because it produces so few units, it likely does not have a meaningful effect on segregation.
  4. Effect on exclusionary zoning. Inclusionary zoning does not undo exclusionary zoning. In fact, it can entrench current exclusionary policies. Many inclusionary programs try to coax developers into creating affordable units by offering to reduce costly exclusionary policies (such as density limits). Some jurisdictions thus enact strict exclusionary policies just to give themselves leverage over developers. 

Graveline ultimately finds that inclusionary zoning does not accomplish its intended goals and can lead to perverse side effects. He recommends that rather than pursuing inclusionary zoning, jurisdictions address a more relevant cause of both housing unaffordability and residential segregation: exclusionary zoning.

In the News

In his Planetizen article explaining why many times “slower is better” for transportation systems (and for livable places) Todd Litman cites City Observatory’s analysis showing that residents of metro areas with higher average travel speeds are less happy with their transport experience.

The Week Observed, June 18, 2021

What City Observatory did this week

1.  Race and economic polarization.  In the past several decades, racial segregation in the US has attenuated, but economic segregation has increased.  This is nowhere more apparent than in the residential patterns of Black Americans.  A recent analysis by David Rusk looks at the growing economic polarization of urban neighborhoods and its effects on the Black community.  In the heyday of segregation 50 years ago, Black Americans were effectively restricted to Black neighborhoods, regardless of their income.  Consequently, Black Americans were considerably less segregated by income than other Americans (high-income and low-income Black households tended to live in the same neighborhoods).  As the following chart shows, economic polarization for whites has moved upward only slightly since 1970.  In contrast, Black Americans, who were less polarized than whites in 1970, are now vastly more polarized by income.

In recent decades, however, upper- and middle-income Black families have been the ones moving to suburbs and integrated neighborhoods, with the result that low-income Black families are now more segregated by both race and income.  This growing economic polarization is a huge challenge for the nation’s cities and for achieving social justice.

2.  More evidence the Oregon Department of Transportation is lying about its Rose Quarter freeway widening project.  For several years, ODOT has been trying to “woke-wash” its $800 million plan to widen the I-5 freeway to as many as 10 lanes by asserting that the “covers” it will build over the freeway (really just extra-wide overpasses) will somehow repair the damage done to this traditionally African-American neighborhood when the freeway was first built in the 1960s.  A report prepared by an ODOT contractor, obtained by alt-weekly Willamette Week, shows that if the agency is serious about constructing “buildable” caps that could support a building, it should be planning a much narrower freeway project, instead of constructing over-sized shoulders (that would ultimately be converted to traffic lanes).

The oversized overpasses that ODOT is proposing are so weak and expensive that they can’t support buildings without adding hundreds of millions of dollars to the cost of the project–something Oregon DOT almost certainly won’t do.  The revelations here confirm that ODOT is really only interested in a wider freeway, not restorative justice for the Albina neighborhood.

Must read

1.  A Little More Remote Work Could Change Rush Hour a Lot.  Coping with peak hour congestion is one of the key challenges of transportation policy. Transportation systems have long been structured to meet the needs of peak travelers, but when the pandemic shifted many jobs from the office to the home, rush hour traffic appeared to dissipate. The increase in telecommuting during the COVID-19 pandemic has diminished rush hour congestion across U.S. cities; morning rush hour traffic in Washington DC, for example, has seen a profound reduction from pre-pandemic levels.

In this New York Times article, Emily Badger explores the implications of remote work, and whether pandemic-induced commuting changes will persist. Thanks to improving technology and major investments, the higher rate of remote work will likely be maintained as we enter a post-pandemic world. Badger examines the positive impacts on those freed from rush hour hell, as well as on the essential workers most reliant on the transportation system. The increase in telecommuting presents a groundbreaking opportunity to rethink the structure of, and investments in, transportation systems.

2.  Wall Street isn’t to blame for the chaotic housing market.  Investors bought a record $77 billion worth of homes in markets around the United States in the final half of 2020. Accusations have been leveled at these “yield-chasing” institutional investors for buying up single-family houses from consumers as demand increases and prices soar. However, Vox’s Jerusalem Demsas shows that these investors play only a small role in the real estate market. Poor data collection and sloppy argument framing have created questionable narratives that exaggerate the impact of large firms. Demsas argues that this discussion reflects a continued failure to address the core reason for these market trends: the undersupply of housing. Local governments and homeowners blocking new homes from being built helped create this housing shortage, boosting the profitability and attractiveness of the market to large investors. At the same time, low mortgage rates and the increasing entrance of millennials into the housing market combine with this low supply to push prices up. Scapegoating institutional investors fails to address the central problem: an abundance of housing is needed.

3.  NIMBYism and the Externalities of Non-Development. We usually consider zoning a local issue, but that is a flawed point of view. The combined impact of hundreds of local zoning decisions has notable effects on regional housing markets. Restrictive local land use decisions raise housing costs, which spills into labor markets as workers are pushed to less productive cities. In this piece, Will Wilkinson considers these detrimental aggregate impacts of restrictive local zoning regulations, particularly the problem of “homevoters.” Economist William Fischel defines homevoters as risk-averse individuals who participate in public meetings and local government to protect and enhance their property values. Homevoters create reluctance among developers through difficult negotiations and time-consuming processes, causing inefficient allocations of development and a reduction in housing options. Homevoter NIMBYism, backed by strong local control, creates external costs that burden the entire United States economy and exacerbate the affordability crisis. Wilkinson argues for a “rebalancing of authority over land use regulation in the direction of state and even federal government.” State governments have the authority to adjust the jurisdiction delegated to cities, and an adjustment away from local control may be necessary to ameliorate the adverse effects of homevoters and NIMBYism, both locally and nationally.

New Knowledge

Economic benefits of road widening: Discrepancy between outturn and forecast.  In 2014, a 16-mile stretch of the M25, the London orbital motorway, reopened after a widening project. British road-building authorities used a travel demand model called Saturn to predict what would happen to traffic; the model forecast that travel speeds would rise and traffic volumes would increase only modestly, with widening predicted to increase morning peak travel speeds by 6-10 kilometers per hour (roughly 4-6 mph) over time. The model’s results were used to calculate a favorable benefit-cost ratio to justify the project.  But was the model right?

David Metz of University College London compared the outcome of the road widening to those forecasts in this paper. Metz found that three years after the widened motorway opened, there was a substantial increase in traffic but no reduction in travel time. The investment appraisal was based largely on forecast journey time savings, and those time savings simply failed to materialize. The model’s inability to take induced demand into account led to the investment’s approval: the M25’s added lanes increased traffic but didn’t reduce travel times.
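The appraisal logic at issue can be sketched in a few lines. Every number below is hypothetical, chosen only to illustrate how forecast journey time savings drive the benefit-cost ratio, and how induced demand, by erasing those savings, erases the forecast benefits (discounting is omitted for simplicity):

```python
# Hypothetical appraisal parameters (illustrative only, not from the Metz paper)
VALUE_OF_TIME = 15.0        # dollars per vehicle-hour
VEHICLES_PER_DAY = 100_000  # daily traffic on the widened segment
DAYS_PER_YEAR = 250         # weekday-equivalent travel days
YEARS = 30                  # appraisal horizon
PROJECT_COST = 500e6        # construction cost, dollars

def annual_time_saving_benefit(minutes_saved_per_trip):
    """Monetized value of forecast travel time savings in one year."""
    hours = minutes_saved_per_trip / 60
    return VALUE_OF_TIME * hours * VEHICLES_PER_DAY * DAYS_PER_YEAR

def benefit_cost_ratio(minutes_saved_per_trip):
    """Undiscounted lifetime benefits divided by project cost."""
    return annual_time_saving_benefit(minutes_saved_per_trip) * YEARS / PROJECT_COST

print(benefit_cost_ratio(3.0))  # forecast: a few minutes saved per trip pushes the ratio above 1
print(benefit_cost_ratio(0.0))  # outturn: induced demand erases the saving, and the benefits vanish
```

Because essentially all of the claimed benefit flows through the time-savings term, a model that cannot anticipate induced demand will systematically overstate the case for widening.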


David Metz, “Economic benefits of road widening: Discrepancy between outturn and forecast,” Transportation Research Part A: Policy and Practice, Volume 147, May 2021, Pages 312-319

The Week Observed, June 4, 2021

What City Observatory did this week

What ultimately destroyed Tulsa’s Greenwood neighborhood:  Highways.  This past week marked the centennial of the Tulsa Race Massacre.  In 1921, a racist mob attacked and destroyed the Black Greenwood neighborhood, killing hundreds. Greenwood’s residents were resilient, rebuilding a neighborhood that thrived for almost 50 years.

According to a new book on the neighborhood’s history, the death of “Black Wall Street” ultimately came with the construction of the Interstate freeway system, which sliced through the traditionally Black neighborhood.

Must read

1. Can “Opt-out” zoning reform break the political log-jam?  As we’ve argued at City Observatory, allowing more affordable housing is caught in a kind of prisoner’s dilemma: no local government wants to be the first to liberalize because it fears that no one else will follow.  It takes state legislation to produce change, but state legislatures are usually deferential to local control, at least when it comes to land use. Writing at the Sightline Institute, Michael Andersen reviews newly passed legislation in Connecticut that legalizes accessory dwelling units throughout the state.  The new law comes with a catch:  if two-thirds majorities of both a city’s zoning board and city council vote to opt out of the law, they can do so.  There’s an element of “try-before-you-buy” to the law as well: the legalization of ADUs takes effect some months before cities are allowed to opt out, so cities will likely see what ADU legalization looks like before they have to decide.  While there’s a risk that truly exclusionary cities will easily get the two-thirds votes needed to opt out, this approach seems like a worthwhile experiment to prod more cities into allowing this form of affordable housing.

2. The limits of the Fair Housing Act.  Residential segregation and its negative effects continue to persist more than a half century after the passage of the federal Fair Housing Act.  What will it take to achieve fair access to housing for all?  Richard Kahlenberg reviews a new book for the Washington Monthly looking at the past and future of the Fair Housing Act, with an eye to making policy changes.  While the federal law has limited some forms of overt discrimination, the more subtle and pervasive forms of exclusion, like restrictive local zoning, are producing many of the same effects.  The challenge is to craft measures that “affirmatively further” fair housing.  The big challenge is that segregation by class is accomplishing what used to be done by racial discrimination:

While racial segregation is slowly declining, income segregation has doubled since 1970. The Fair Housing Act outlaws discrimination on the basis of “race, color, religion, sex, familial status, national origin, or disability”—but it remains perfectly legal for municipalities to discriminate based on income, per se, by banning the construction of more affordable types of housing, such as duplexes, triplexes and apartments.

Since people of color have disproportionately lower incomes (a product of past discrimination), income segregation perpetuates these fundamentally unfair patterns, and is today’s biggest challenge to equitable housing.

New Knowledge

The Donut effect of Covid. How has the coronavirus pandemic affected urban economies?  A new study from Arjun Ramani and Nicholas Bloom looks at the changes in home values and population levels within and across the nation’s metropolitan areas.

It finds that there’s been relatively little population re-distribution (or value shift) across metropolitan areas.  People aren’t relocating wholesale from some metropolitan areas to others, nor are the contours of real estate values shifting.

The study does find that in the past year there has been a city-to-suburb shift in population movement and home price appreciation, chiefly in the dozen or so largest metro areas.  US Postal Service change of address forms display a relative surge of core-to-periphery moves since the start of the pandemic, while Zillow data show home prices and rents have performed relatively better in suburbs than central cities.  Strikingly, these effects only seem to hold for larger cities like New York, San Francisco, Seattle and others.  This may be explained by the very high proportion of workers in tech and professional service occupations who are more able to work from home in these cities.

Central business districts in very large metro areas showed an absolute decline in home values (with the CBD defined very narrowly as the area within 2 kilometers, or about 1.2 miles, of the center of a city). These trends are far more muted outside of the largest metro areas, with modestly weaker performance in CBDs in the 13th through 50th largest metros, and essentially no difference in smaller metros (those with populations of less than 1 million).

It should be little surprise that in the midst of a pandemic, when large-scale gatherings of people for business, cultural, social or recreational activities were mostly banned, the impact was felt most in central cities.  Providing opportunities for lots of people to get together in close proximity is the raison d’être and competitive advantage of large urban centers.  The really interesting question going forward is whether the pattern seen during the pandemic persists, or reverses as we are once again able to be in close proximity to one another.  One early sign:  rents in urban centers have ticked upward in the past few months.  These trends bear watching.

Arjun Ramani & Nicholas Bloom, “The donut effect of Covid-19 on cities,” National Bureau of Economic Research, Working Paper 28876, DOI 10.3386/w28876.

 

The Week Observed, May 28, 2021

What City Observatory this week

1. Why highway departments can and should build housing to mitigate road damage.  For decades, American cities have been scarred and neighborhoods destroyed by highway construction projects.  Many places are contemplating measures to fix these problems, from freeway removals to pledges of “restorative justice.”  Given that highways directly and indirectly triggered the demolition of tens of thousands of homes in the nation’s cities, the most tangible way a highway department can repair the damage it did is not with vague pledges, but by paying to build more housing.  That’s exactly what some state DOTs, including Texas, Kentucky, and Nevada are doing.  The Federal Highway Administration has recognized as an award-winning best practice a project in Lexington, Kentucky, which is using highway funds to buy 25 acres of land and capitalize a community land trust to build 100 new homes to offset the damage done to an historically Black neighborhood that was decimated by the threat of highway construction.  For decades, highway departments demolished homes; a step toward restorative justice would be putting some of them back.

2.  The toll of Single Family Zoning:  Part 1.  We are pleased to publish a guest commentary from Anthony Dedousis of Abundant Housing LA, exploring the relationship between single-family zoning and socioeconomic indicators across the 88 cities in L.A. County. This policy bans the construction of apartments, which reduces the amount of housing and raises the cost of living. Apartment bans are rampant in L.A. County, with the median city zoning over 80% of the land for single-family housing. Using data from the Southern California Association of Governments and the American Community Survey, Dedousis finds that median household incomes, median housing costs, and homeownership rates tend to be higher in cities with a greater prevalence of single-family zoning.

It is clear that there is a positive association between the city’s socioeconomic makeup and apartment bans. Nearly every city with high median incomes similarly has high rates of single-family zoning. Limiting the stock of housing in a city presents a barrier to renter households and low- to moderate-income individuals from accessing affordable housing in these high-opportunity cities. This analysis provides clear evidence that apartment bans by suburban cities are a principal mechanism for creating and maintaining economic segregation in US metro areas.

3.  The toll of Single Family Zoning:  Part 2:  Single family zoning doesn’t just segregate the population by income, it’s a key ingredient in the persistence of racial and ethnic segregation in US metro areas.  In part 2 of his analysis of variations in zoning and demographics among Los Angeles County cities, Anthony Dedousis examines the correlations between the amount of land a city zones for single family housing and its racial and ethnic composition.  Not surprisingly, he finds that cities that zone more land for single family housing are more highly segregated than those with more land for multi-family housing.

Perhaps the most striking finding is illustrated in this chart, which shows the segregation index (a measure of how racially and ethnically segregated a city is; higher values correspond to higher levels of segregation) and the percent of residential land designated for single family use.  Cities that designate 90 percent or more of their residential land area to single family homes have a segregation index ten times higher than those that designate half or less of their land for single family homes.  It’s a compelling illustration of the connection between single family zoning and segregation.

Must read

1. Business as usual:  The Senate Highway Reauthorization bill:  Ignore the high-minded rhetoric about dealing with climate change and “fix it first” policies for roads: the US Senate is all in on simply pumping even more money into a failed highway-industrial complex.  The latest evidence of this comes from the newly passed highway reauthorization bill emerging from a key Senate Committee this week.  Transportation for America’s Beth Osborne bluntly catalogs the implications:

“The status quo is sending us backwards: This bill attempts to solve the problems of the transportation system with small, underfunded new programs while spending way more to continue to churn out those same problems. It allows states to opt out of lowering carbon emissions and continues to support strategies that are well known to raise them.  We don’t have time for another five years of creating more problems that will take 20 to 50 years to solve.”

While President Biden’s American Jobs Plan, which would at least attempt to tackle climate issues, flounders in a desperate attempt to be bipartisan, this deeply flawed reauthorization measure passed with a unanimous vote.  It’s symbolic of how deeply this malignant highway building pathology is lodged in the body politic.

2.  Weighing in on Infrastructure Policy.  The Washington Post weighs in on the debate over the nation’s infrastructure policy with a helpful, fact-based article looking at the highway maintenance backlog in contrast to state spending priorities.  They report that the nation faces a $435 billion rehabilitation backlog, that a fifth of the nation’s major roads are in poor condition, and that little progress has been made in reducing the maintenance backlog.  At the same time, states have spent nearly a third of their roadway capital on expanding roads rather than maintaining the ones they’ve already built.

States rationalize system expansion on the theory that it will reduce traffic congestion, but as long-time City Observatory readers will know, that’s a false hope.  More capacity begets more traffic, quickly erasing any performance gains.  The Post cites UC Davis transport expert Susan Handy on the implications of induced demand:

For decades, researchers have found that when roads get wider, people tend to drive more, ultimately canceling out any gains in speed. Susan Handy, a professor of environmental science and policy at the University of California at Davis, said traditional tools for forecasting traffic demand to assess the benefits of new construction don’t effectively take that into account.  Researchers noted that traffic eventually increases by about the same percentage a road is widened, so boosting the size of a road by 10 percent will lead to about 10 percent more travel.
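The rule of thumb cited above amounts to a long-run induced-demand elasticity of roughly 1.0: traffic grows by about the same percentage as capacity. A minimal sketch of that arithmetic, with illustrative numbers and a function name of our own choosing (not from the Post article or Handy’s research):

```python
def induced_traffic(base_vmt: float, capacity_increase_pct: float,
                    elasticity: float = 1.0) -> float:
    """Estimate long-run daily traffic after a road capacity expansion.

    With an elasticity of 1.0, a 10 percent capacity increase yields
    about 10 percent more vehicle miles traveled, canceling speed gains.
    """
    return base_vmt * (100 + elasticity * capacity_increase_pct) / 100

# Widen a road carrying 100,000 vehicle-miles per day by 10 percent:
after = induced_traffic(100_000, 10)
print(after)  # 110000.0 -- about 10 percent more travel
```

The point of the sketch is that the "benefit" term in a traditional traffic forecast (travel time saved at fixed demand) disappears once demand responds with an elasticity near one.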

3. Don’t use parking requirements as a bargaining chip for affordable housing.  It is widely recognized that parking requirements tend to drive up the cost of housing, but many housing advocates will often fight to retain parking requirements in order to barter them away to persuade developers to build affordable housing units.  The results of a quasi-experiment in San Diego show that this approach is a bad bargain for promoting affordability.  It turns out that simply eliminating or greatly reducing parking requirements has a more positive effect on housing supply and therefore rental affordability than a complex system of negotiated land use approvals using parking space waivers as bargaining chips.

In 2020, one year after comprehensive parking reform was implemented, there was a fivefold increase in the total number of homes permitted through San Diego’s density bonus program. A record-high 3,283 homes were built using the density bonus in 2020 — nearly half of all new housing permitted in the city that year. Total housing production citywide also rose, by 24 percent. More use of the density bonus program also translated into more affordable units. The program produced over 1,500 affordable homes in 2020 – six times more than in 2019.  Between 2016 and 2019, the program had never produced more than 300 affordable homes in one year.

New Knowledge

Synthetic microdata:  A threat to knowledge.  Each week at City Observatory, we usually profile an interesting or provocative research study.  This week, we’re spending a minute to highlight a potential threat to a key source of data that helps us better understand our world, and especially the nation’s cities: the public use microsample of the American Community Survey (ACS).  The ACS is the nation’s largest and most valuable source of data on population, housing, social and economic characteristics.  While the Census Bureau produces many tabulations of these data, it’s impossible to slice and dice data in a way that bears on every question.  So the Census Bureau makes available what is called a “public use microsample” which allows researchers to craft their own customized tabulations of these data to answer specific questions.  At City Observatory, for example, we’ve used these data to estimate the income, race and ethnicity of peak hour drive alone suburban commuters traveling from suburban Washington State to jobs in Oregon–a question that would be essentially impossible to answer from either published Census tabulations or other publicly available data.

Microdata are valuable because they link answers to different ACS questions–linking a person’s age, gender or race to their income, occupation or housing type, and so on.  But because the microdata are individual survey responses, some are concerned that there’s a potential violation of privacy:  that someone could use answers to a series of questions to deduce the identity of an individual survey respondent.  While that may technically be a possibility, there’s no evidence it occurs in practice.  Still, the Census Bureau is hypersensitive about privacy concerns, and has proposed replacing actual microdata with “synthetic” microdata, in order to make it even more difficult to identify an individual.  Essentially, synthetic data would replace actual patterns of responses with statistically modeled responses.  The trouble is, this modeled, synthetic data actually subtracts information, and makes it impossible for researchers to know whether the answers to any particular question are a product of actual variation, or just a quirk of the Census Bureau’s model.  As University of Minnesota data expert Steven Ruggles puts it, “synthetic data will be useless for research.”

The privacy threat from ACS microdata is a phantom menace.  Ruggles and a colleague at the University of Minnesota have just published a paper showing that attempting to use Census microdata to create individually identifiable records via database reconstruction would produce vastly more random (i.e. false) matches than real ones.  This undercuts the idea that microdata is an actual threat to privacy.

But a proposal to replace PUMS data with synthetic data is a real threat to our ability to better understand our world.  It is like requiring piano players to wear mittens when playing Beethoven sonatas:  the piano will still produce sound, but the result will be noise, not music.

Mike Schneider, “Census Bureau’s use of ‘synthetic data’ worries researchers,” ABC News, May 27, 2021

Steven Ruggles and David Van Riper, “The Role of Chance in the Census Bureau Database Reconstruction Experiment,” University of Minnesota Working Paper No. 2021-01, May 2021. DOI: https://doi.org/10.18128/MPC2021-01

In the News

Bloomberg Business Week quoted City Observatory’s Joe Cortright in its article examining the Biden Administration’s housing policy.

Oregon Public Broadcasting cited Joe Cortright’s analysis of road pricing in its story on highway expansion and tolling legislation currently being considered by the Oregon Legislature.

 

The Week Observed, March 26, 2021

What City Observatory this week

1. How ODOT destroyed Albina.  Urban freeways have been lethal to neighborhoods, especially neighborhoods of color, in cities throughout the nation.  While the construction of Interstate freeways gets much of the attention (as it should), the weaponization of highway construction in minority neighborhoods actually predates the Interstate system.  In Portland, in 1951, the state highway department built a mile long extension of US Highway 99W along the Willamette River which cut the predominantly black neighborhood off from the waterfront.

That was the first of a series of road projects, culminating in I-5 and the Fremont I-405 Bridge, which wiped out much of the neighborhood’s housing and propelled the neighborhood’s decline.  Albina’s population fell by nearly two-thirds from 1950 to 1970.

2. Highway greenwashing: Natick’s diverging diamond:  Highway departments know that their traditional asphalt everywhere solutions play poorly with a public more attuned to climate change and quality of life.  So nowadays, they dress up highway expansion projects with an artistically crafted veneer of green and pedestrian friendly images.  The latest example of this comes from Natick, MA, just outside Boston, where MassDOT is looking to turn an existing 1950s era highway interchange into a much larger “diverging diamond.”

 

Must read

1. Five against I-45.  Paris has Mayor Anne Hidalgo, Houston has County Judge Lina Hidalgo. Both are forceful and articulate advocates of re-thinking urban transportation to create fairer, safer, healthier and more successful cities.  In a concise must-watch eight minute video, Judge Hidalgo makes a definitive five point case against TXDOT’s plan to spend billions to widen I-45 through the middle of Houston.  It’s bad for road safety, for air quality and health, would displace hundreds of people, and wouldn’t solve the highway’s chronic congestion problem.  Instead, she argues, the city ought to be investing in alternatives like bus rapid transit, narrowing the footprint of the project, and prioritizing people over futile attempts to make cars move faster.

If you want a picture of what urban leaders can do in the face of truly retrograde highway widening efforts, just watch this video . . . and share it with friends.

2. Embracing Reduced Demand.  City Observatory readers will be very familiar with the concept of induced demand, the idea that building more highway capacity tends to generate additional traffic, quickly erasing any congestion reduction benefits from wider roads. But the reverse is also true:  reducing road capacity generally causes traffic to disappear.  The phenomenon of “reduced demand,” says CNU’s Robert Steuteville, should be an important precept for transportation planning.  Too often, highway agencies push back against projects to calm traffic, reduce speeds and accommodate bikes and pedestrians, arguing that reduced road capacity will produce congestion or gridlock.  But as Steuteville points out, time after time, cities have made major reductions in road capacity, often by tearing out entire freeways, and rather than traffic getting worse, it largely evaporates.  The trouble with some current freeway removal projects, for example in Syracuse and Buffalo, is that if anything, engineers are still too timid in applying this fundamental lesson.  The boulevards designed to replace freeways are often over-sized, out of an irrational fear of gridlock.  He concludes:

You really don’t believe in induced demand, if you don’t also believe in reduced demand. Traffic engineering/planning is going on a hundred years as a profession. It’s time to learn from history, to believe the science, to get smart about street design, to fully use the idea of reduced demand where it has the potential to improve a city’s economy, society, and mobility.

3. A solution for the diverging diamond blues.  Jeff Speck, author of Walkable Cities, also weighed in on Natick’s proposed diverging diamond interchange.  His critique strikes a frustrated, plaintive tone, asking a series of devastating questions, including:

How can we create a park that nobody will use, and then give it unhealthy air quality?  How do we ensure that this large, valuable parcel never produces revenue? How do we communicate to aliens our plans to abandon earth?

Speck also offered a better solution, one that backs away from expanding the highway, slows traffic, and makes the area at least somewhat better for pedestrians.  Based on some successful examples in Indiana, he proposes that MassDOT consider a kind of elongated traffic circle, a peanut interchange, that naturally slows highway traffic and accommodates multiple turning movements (without consuming a vast land area and creating a disorienting experience for pedestrians).

New Knowledge

Less is more:  How reducing vehicle miles traveled improves the economy.  For decades, there’s been a naive assumption that more movement (meaning more driving) somehow makes us richer.  Turns out that was probably never true, and today, the places that are built in a way that enables people to drive less are more economically successful.  That’s the conclusion of a research review conducted by USC transportation professor Marlon Boarnet and his colleagues. This research emphasizes a theme we’ve long advocated at City Observatory:  urban form that reduces vehicle miles traveled (VMT) saves consumers money that they can spend to make their lives better.  That reallocated spending can produce measurable local economic gains.

The research is conveniently summarized in a new 2-page non-technical publication, “Urban Design that Reduces Vehicle Miles Traveled Can Create Economic Benefits.” This policy brief summarizes studies that look at the economic impacts of low-traffic neighborhoods.  It finds that neighborhood businesses can benefit from walkability, that in some cases street closures or traffic lane reductions are associated with improved neighborhood business activity, and that house prices are higher in places with VMT-reducing urban design.

Boarnet, M.G., Burinskiy, E., Deadrick, L., Gullen, D., Ryu, N. (2017). The Economic Benefits of Vehicle Miles Traveled (VMT)-Reducing Placemaking: Synthesizing a New View. UC Davis: National Center for Sustainable Transportation. Retrieved from https://escholarship.org/uc/item/5gx55278

 

 

The Week Observed, March 19, 2021

What City Observatory this week

1. An open letter to the Oregon Transportation Commission.  For more than two years, City Observatory and others have been shining a bright light on the Oregon Department of Transportation’s proposed $800 million I-5 Rose Quarter Freeway widening project in Portland.  All that time, ODOT has maintained that it is planning a minor change to the freeway, adding a couple of auxiliary lanes to the existing four-lane freeway.  But new documents show the agency has long been planning a 160-foot wide roadway, more than enough for an eight or ten-lane freeway.  It’s apparent now, in retrospect, that agency staff have long known this to be the case, and have willfully concealed this information from the public through a combination of misleading illustrations and outright lies in response to direct questions about the size of the proposed freeway.  In an open letter to the Oregon Transportation Commission, City Observatory’s Joe Cortright calls for the agency to honestly disclose its plans, and to undertake a full and fair environmental impact statement that shows the traffic, environmental and social effects of the actual 10-lane freeway it’s proposing to build.

Cortright_to_OTC_RoseQuarterWidth_17March

2.  Are rents going up or going down?  We’re pleased to publish a guest commentary from Alan Mallach, author of The Divided City, examining the past year’s trends in rents across major US cities.  Using data from Apartment List, Mallach traces out the variations across the country.  About a third of cities have seen declines, but two-thirds have registered increases.

The patterns have been anything but random:  declines have been concentrated in larger cities and in techy superstars, like San Francisco and Seattle.  Smaller and mid-sized metro areas haven’t seen these declines.  Mallach traces the most probable source of these variations to the differences in the demographics of the renter population in different cities.  The cities with the greatest declines are cities where large numbers of renters are young and affluent, a market to whom those cities’ rental developers and landlords have been increasingly catering in recent years. Many of these renters appear to be moving – in part out of these cities, but also in part to homeownership in the same cities.  These fast-growing tech cities have seen a fall-off in net migration in the past year, leading to at least a temporary glut of apartments, and falling rents.  Outside these cities, though, we haven’t seen demand collapse, and rents have continued to increase.

Must read

1. The conservative war against cars. We think of the “war on cars” as a kind of progressive or radical environmental agenda, but a very compelling case against massively subsidizing a transition to electric vehicles comes from a surprising source:  The American Conservative.  Jordan McGillis has a powerful article arguing that vehicle electrification represents car culture capturing climate policy, and, in the process blowing a once in a lifetime opportunity to fix the damage that cars have done to our communities, landscape and environment.  Here’s McGillis:

All of the effort directed towards EV adoption would be better expended on improving our development patterns, bringing them to human-scale and reducing the necessity of the automobile. The obvious reform candidate is zoning. . . .  Zoning exclusively for single-family homes artificially flattens our cities, necessitates daily automobile commutes, and increases our greenhouse gas emissions. . . . Instead of subsidizing new cars, we ought to allow more varied land use so that cars are not so central to our lives. Despite the EV campaigners’ fixation with the old paradigm, the scales of car culture are beginning to fall from Americans’ eyes. Walkability is in vogue and fewer young people today are viewing the car as a ticket to freedom.

Already the automobile industry is working to use electrification as an excuse not to question the hegemony and destructive effects of our over-reliance on the private automobile.  The climate crisis and the lessons of the pandemic (where we’ve dramatically reduced our driving) should teach us that it’s possible to reclaim and rebuild our communities so we’re not so dependent on cars.

2.  Buses, buses, everywhere.  Please.  Farhad Manjoo writing at the New York Times makes the case that if we’re concerned about climate, cities and transportation equity, a first logical step would be doubling the frequency of bus service in the nation’s cities.  The arguments will be familiar to readers of City Observatory.  Too often, discussions of urban transportation and infrastructure get distracted by tech vaporware (like Elon Musk’s habitrail) or self-interested pleas from highway builders to throw hundreds of billions into the dead-end road building of the 1950s.  Manjoo quotes Transit Center’s Steve Higashide, Human Transit’s Jarrett Walker and Transportation for America’s Beth Osborne, all leading experts in this field.  More buses would be equitable, green and fast acting, and would help the nation’s cities rebound from the Covid recession.

3.  YIMBY for New York suburbs.  The Manhattan Institute’s Alan Kober writes about efforts to stem exclusionary zoning in New York State.  While California continues to be a national battleground in the YIMBY (Yes in My Backyard) efforts, the same problems of widespread single family zoning and apartment bans promoting economic segregation are at work in the suburbs around New York.  Kober points to the wide disparities in income between the Bronx and neighboring Westchester County, which are a product in many respects of exclusionary zoning practices in the suburbs.  Kober outlines legislation pending in New York which would require local jurisdictions to allow apartments by right in transit served locations and to legalize accessory dwelling units in most residential zones.

New Knowledge

Jacob Brown and Ryan Enos, America’s deepening political segregation.  An extraordinarily fine-grained look at the extent to which Democrats and Republicans are segregated not just into red states and blue states, but red streets and blue streets.

Using voter files, Brown and Enos take the exact residential location of every voter in the United States and calculate their spatially weighted exposure to their 1,000 nearest neighbors. This allows measures of segregation between partisans at any level of geography.

One of the challenges of computing segregation indices is that data tend to get aggregated into arbitrary geographic units (counties, municipalities, precincts, census tracts) and while useful as a rough approximation, such units impart their own biases to understanding true segregation.  Brown and Enos have overcome that aggregation problem by computing, for every registered voter in the US, the partisan composition of the nearest 1,000 other registered voters.
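The exposure measure described above is straightforward to sketch: for each voter, find the k nearest other voters and compute the share registered with the same party. This is an illustrative toy (the coordinates, party labels, and function name are our own assumptions, not the authors' code; Brown and Enos apply this to roughly 180 million voter records with k = 1,000 and spatial weighting):

```python
from math import dist

# Toy voter file: (x, y) location and party registration.
voters = [
    ((0.0, 0.0), "D"), ((0.1, 0.0), "D"), ((0.0, 0.1), "D"),
    ((5.0, 5.0), "R"), ((5.1, 5.0), "R"), ((5.0, 5.1), "R"),
]

def same_party_exposure(i: int, k: int = 3) -> float:
    """Share of voter i's k nearest neighbors who share voter i's party."""
    xy_i, party_i = voters[i]
    # Sort the other voters by distance and keep the k nearest.
    nearest = sorted(
        (v for j, v in enumerate(voters) if j != i),
        key=lambda v: dist(xy_i, v[0]),
    )[:k]
    return sum(1 for _, party in nearest if party == party_i) / k

# Voter 0 (a Democrat) has two Democrats and one Republican among
# its three nearest neighbors, so exposure is 2/3.
print(same_party_exposure(0))
```

At real scale the brute-force sort would be replaced by a spatial index (for example, a k-d tree), but the exposure definition is the same.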

This approach clearly illustrates the fractal quality of political polarization in the US.  We’re not merely divided into red states and blue states, blue cities and red suburbs, but within cities and neighborhoods, we’re highly sorted as well:  Republicans have mostly Republican neighbors; Democrats have mostly Democratic neighbors.

Sorting at a gross level is no surprise, but choosing different neighborhoods in a city, or different streets or housing within a neighborhood, also appears to be at work.  Choice of housing type and neighborhood amenities could be a factor driving partisan sorting:

While the best available evidence shows that most voters consider the partisan composition of an area to be low on their list of priorities when choosing neighbourhoods, it is still possible that partisan differences in income and lifestyle preferences, such as transportation and type of housing, may drive some voters to select different cities, neighbourhoods and, in some cases, streets or houses within neighbourhoods, even if partisanship is not an explicit criterion for selection.

Jacob Brown and Ryan Enos, “The measurement of partisan sorting for 180 million voters,” Nature Human Behaviour (2021). https://doi.org/10.1038/s41562-021-01066-z

In addition to the article, Brown and Enos have produced detailed maps showing the geography of the partisan divide in every US metropolitan area.

In the News

Next City’s Sandy Smith cited our commentary on the “Fundamental, Global Law of Road Congestion” in his survey of freeway-widening fights around the country.

Planetizen featured our analysis of the impact of Portland’s inclusionary zoning policy on new apartment construction.

In its reporting on the US DOT decision to suspend work on the proposed I-45 expansion project, Houston Public Media cited our analysis showing the failure of the widening of the Katy Freeway to reduce traffic congestion.

 

The Week Observed, March 12, 2021

What City Observatory this week

1. The failure of Vision Zero.  Like many regions, the Portland metropolitan area has embraced the idea of Vision Zero; a strategy of planning to take concrete steps over time to reduce the number of deaths and serious injuries from road crashes to zero.  A key step in Vision Zero is setting—and monitoring—well-defined targets for steady progress over time.  Metro, Portland’s regional planning agency, has just produced a report card for the region’s efforts through 2019, and it’s pretty clear that the result is all “F’s”:  Of 25 Vision Zero indicators, the region is on track to achieve exactly none of them.  Rather than decreasing, as called for in the Vision Zero plan, roadway deaths are increasing sharply.

As the region’s safety planners have known for some time, roadway deaths are highly concentrated on the region’s multi-lane arterial highways.  But rather than fix these, the state highway department, ODOT, is devoting billions to widening major freeways (which are among the region’s safest roads already).  In addition, lower gas prices since 2014 have spurred more driving, more crashes and more deaths. Vision Zero is a noble target, but it’s apparent in the Portland region, that the current plan is failing in every respect.

2. Portland’s apartment market has its “Wile E. Coyote” Moment.  Ever since the implementation of one of the nation’s toughest inclusionary zoning requirements in early 2017, Portland’s apartment market has been like Warner Brothers “Wile E. Coyote”:  motoring on rapidly through clear air with no visible means of support.  Apartment starts and completions—propelled by a land rush of applications filed to beat the inclusionary zoning requirements—have produced increased apartment supplies, rising vacancies, and lower rents.  But finally, the coyote has looked down:  Apartment completions have fallen by roughly two-thirds in the past year, and there’s very little in the pipeline now.

There’s clear evidence developers are put off by the inclusionary zoning requirement:  there’s been a surge in 20 unit (and smaller) apartment buildings, which are exempt from the IZ requirement; meanwhile, no new apartments in the 21-25 unit range have been completed in the past year.  As we warned two years ago, the temporary incentives created by the IZ’s “grandfathering” provisions provided a temporary respite for the housing market, and now the negative impacts of the inclusionary requirements are crushing the development of new apartments.  The negative effects on housing supply will be felt for many years, and will cast a long shadow over housing affordability in Portland.

Must read

1. Are people leaving California?  Natalie Holmes of the Policy Lab at the University of California offers an analysis of credit bureau change of address data to assess migration in and out of the Golden State.  She finds that while there has been a sharp increase in net movement out of the City of San Francisco, there’s no evidence of an exodus from California as a whole.  Most of those leaving San Francisco are moving to other locations in the Bay Area, and to a lesser extent elsewhere in the state, rather than to other states.  As Holmes notes:

San Francisco has seen a 31% increase in departures and 21% decrease in entrances since the end of March 2020. Net exits from San Francisco increased 649%, from 5,200 to 38,800.
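The net-exit figure in the quote is a simple percent change over the raw counts. A quick sketch of that arithmetic (the helper function is ours; the counts are from the Policy Lab analysis, and small differences from the quoted 649% reflect rounding in the published figures):

```python
def pct_change(before: float, after: float) -> float:
    """Percent change from a starting value to an ending value."""
    return (after - before) / before * 100

# Net exits from San Francisco rose from about 5,200 to about 38,800:
growth = pct_change(5_200, 38_800)
print(round(growth))  # roughly 646, in line with the quoted ~649% increase
```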

The credit report data for the past five years show a gradual and continuing increase in the number of out-migrants (blue line) and a flat-lining, and recent slight decline, in in-migrants (yellow line).  The data show a strong seasonal pattern to movement activity.  It’s also important to note that credit report data likely undercount certain groups (international immigrants and young adults, who may not have established credit records; in states like California, this likely significantly undercounts in-migrants).

2. A close look at the limits of NIMBYism.  Toronto’s Globe and Mail architecture critic Alex Bozikovic has a compelling essay on the origins of, and problems emanating from, North American NIMBYism (Not in My Backyard).  In Canada, as in the US, land use decisions are heavily biased toward protecting the status quo, especially in the form of giving single family zoned neighborhoods effective veto power over new development.  The inevitable result has been rising home prices, declining affordability, and intensified economic (and racial/ethnic) segregation.  The solution according to Bozikovic is to allow much more density and variety of housing in cities, with an explicit objective of growing upward and inward, rather than ever outward.  In a few rousing closing paragraphs, he writes:

What good is accomplished by locking down so much of our cities to change – so that you must be affluent to live there? Where would be the harm in allowing more big things next to small things, more houses giving way to apartment buildings? People of different generations, income levels and interests living together? . . .   More social housing would be crucial. Progressives in the world of housing policy are often skeptical of such a bargain. But our current regime does little for the less privileged; it builds segregation and inequality, and encourages middle-class people to settle in car-oriented suburbs.  . . . The policy objective should be simple: All new growth should happen in places that are already built out. No more building on greenfields. No more sprawl.

3. Making Houston safer for pedestrians.  A couple of months back, we took Houston to task for what we called “performative pedestrian infrastructure.”  The Houston-based Kinder Institute’s Andy Olin wrote a thoughtful reply to our analysis.  In the spirit of constructive dialog in which it’s offered, we urge City Observatory readers to consider this article. Olin’s key point is that, as auto-dominated as Houston is (and most other US cities are), you have to start somewhere.

New Knowledge

Even more dangerous by design.  For the past several years, Smart Growth America has been highlighting the safety flaws literally engineered into our current road system:  the way we build roads in the US inherently puts people, especially vulnerable road users on foot and on bikes, in harm’s way. Far from being accidents, the huge and continuing volume of crashes is a reflection of the conscious design choices we’ve made.  Dangerous by Design, 2021 is the latest edition of that analysis.

The report illustrates that our streets are disproportionately deadly for people of color and older Americans.  Hispanic, Black and elderly persons are much more likely to suffer death or serious injury as pedestrians.  There’s also a distinctive geographic pattern to pedestrian death rates; Sunbelt states from Arizona to Florida consistently have the highest rates of pedestrian fatalities.

The growing number of pedestrian deaths is symptomatic of the structural biases in road design:  Engineers routinely prioritize vehicle speed and throughput over making streets safe places to bike and walk.  Wide, multi-lane arterial roads, with sweeping corners and slip lanes encourage high speeds regardless of posted limits and create inherent danger for pedestrians. Smart Growth America argues for “complete streets” policies that prioritize the safety of vulnerable users over higher vehicle speeds:  until we adopt such policies, it’s likely the pedestrian death toll will continue to rise.

Smart Growth America, Dangerous by Design, 2021.

In the News

Strong Towns re-published our essay, “The Fundamental, Global Law of Road Congestion,” detailing international studies that show that additional freeway capacity simply generates a proportionate increase in traffic, meaning road widening is an inherently futile tactic for reducing congestion.

Alan Ehrenhalt, of Governing, gave City Observatory a shout out in his column:  “Where Americans are moving, and why they are really doing it.”  His conclusion:  despite what you may have read, there’s no urban exodus and young adults aren’t decamping from Brooklyn to Mayberry.

 

The Week Observed, March 5, 2021

What City Observatory this week

1. The fundamental global law of traffic congestion.  For years, urbanists have stressed the concept of induced demand, based on the nearly universal observation that widening urban roadways simply leads to more traffic and recurring congestion.  Repeated studies in the United States have confirmed that any increase in urban roadway capacity leads quickly to a proportionate increase in vehicle miles traveled.  A new study extends this analysis to several hundred European cities, and finds the same result:  a one percent increase in road capacity tends to lead in short order to a one percent increase in vehicle miles traveled.

You can’t build your way out of congestion, anywhere.

The study finds two factors that ameliorate this pernicious relationship: tolls and transit.  Cities that charge a toll for the use of urban roadways have a dramatically lower level of induced demand, as do cities with extensive rail transit systems.  In North America, in Japan, and in Europe, all these studies confirm that widening freeways to reduce congestion is inherently futile.  We need to stop wasting money on this discredited and environmentally disastrous transportation policy.

2. A new framework for equity in economic development. We’re pleased to offer a guest commentary from Darrene Hackler, summarizing the results of a new report her firm has published on behalf of Pew Charitable Trusts.  Equity is the big challenge facing community and local economic development. Everywhere, there are some people and some neighborhoods that are left out or left behind no matter how strong the economy.  After surveying development efforts around the country, Hackler outlines three key steps that can guide effective and equitable economic development practices.  Her “Determine-Design-Evaluate” framework addresses how to tap local knowledge and support and how to establish accountability for performance.

 

Must read

1. Alan Durning on the contradiction at the heart of US housing policy.  Housing can’t be affordable and also be a great wealth-building investment.  Sightline Institute’s Alan Durning echoes a theme we’ve long emphasized at City Observatory:  there’s a fundamental and glaring contradiction between our two key housing policy aims. The subsidies we provide to home-ownership (skewed heavily to higher income households) and the restrictions on building more housing in high demand locations, help drive up home prices, which increases the wealth of incumbent homeowners, but simultaneously tends to make housing less affordable for everyone else. If we’re ever going to seriously work on making housing more affordable, we’ve got to confront local land use policies that restrict the supply of housing and drive up rents, and as Durning points out, we’ve also got to re-think the tax treatment of homes and mortgages.  As Durning says:

. . .  in an affordability-first housing economy, it’s a good bet that housing policy would stop subsidizing mortgage borrowing, property taxes, capital gains, ownership over renting, and the financialized mortgage industry.

While these tax breaks have been regarded as politically sacrosanct for decades, recent cuts to the mortgage interest deduction were included in the 2017 tax reform act, and Durning’s political analysis suggests that this may set the stage for further changes.

2. Rebounding rents in superstar cities? One of the most striking real estate market changes during the Covid-19 pandemic has been the sharp decline in apartment rents in some of the nation’s strongest urban centers. Rents have gone down in San Francisco, New York and other large cities, leading some to predict a prolonged decline.  But Apartment List’s latest rent report suggests that markets in these cities are starting to turn around, as evidenced by upticks in rent levels in large cities at the end of 2020.  Cities that saw significant declines in 2020 have seen modest increases in the first part of 2021:  San Francisco rents are up 1.2 percent; Boston’s are up about 3 percent.  Among the 10 cities with the largest year-over-year declines in 2020, only one, New York, recorded declining rents in February, and that was a 0.1 percent decline.

As we’ve noted, housing has been subject to the “K-shape” effect, with higher income households driving up housing prices, while rents have fallen because job and income losses have been concentrated among lower income households. As the vaccine roll out continues, the Coronavirus recession fades, and the economy revives, it is likely that urban housing demand in these superstars will rebound, and rents will make up much of the ground given in 2020.

3. Increasing urban job density on the eve of the pandemic.  The Brookings Institution has new estimates of the increasing density of jobs in the nation’s large metro areas. Their analysis, based on geographically detailed federal employment data through 2018, shows that, in what turned out to be the penultimate year of the last economic expansion, employment grew increasingly dense in large cities.  The biggest gains in density were recorded by “superstar” cities like San Francisco, New York and Seattle, but the pattern of increasing density held for most of the nation’s large metro areas.

There’s a pronounced industrial pattern to job densification; the biggest gains in job density are attributable to the clustering of information technology, professional service and financial jobs in a relative handful of large metro areas. There’s obviously been much speculation about how work-at-home practices may alter these patterns, but these data show that on the eve of the pandemic, there was a strong tendency for activity to concentrate in dense locations in just a few metro areas.

New Knowledge

Lived Segregation.  Most of our understanding of the nature of urban residential segregation comes from Census data on housing, and looks at the extent to which people from different income or racial/ethnic groups live in different neighborhoods.  That’s clearly important, but, significantly, it leaves out how much we mix as we move among neighborhoods and in and around cities on a daily basis (or at least the way we used to, in a pre-pandemic world).  As with so many things, the mass of big data created by our electronic connections creates a new source of information about patterns of segregation.  A new study from Brown University sociologist Jennifer Candipan and her colleagues uses geolocated Twitter data to study patterns of movement in 50 large US cities.

They’ve used this data to measure the extent to which people travel between neighborhoods that have different racial/ethnic compositions.  They divide city neighborhoods into four broad categories (Black, White, Hispanic or Mixed) and measure the extent to which people who live in one type of neighborhood travel to the other types of neighborhoods.  Their summary measure, the “segregated mobility index,” identifies the extent to which people from different races tend to travel only to neighborhoods in their same category.  As the authors stress, this is fundamentally a measure of “neighborhood connectedness” and is not a measure of individual activity.  The following chart shows the relationship between residential segregation—as measured by a traditional black/white dissimilarity index—and segregated mobility (as measured by this study’s new segregated mobility index).  Both indices run from 0 to 1, with values closer to 1 indicating greater degrees of segregation (zero indicates no segregation; one indicates perfect segregation).

This chart suggests that residential segregation has a strong influence on segregated mobility:  Cities with high levels of residential racial segregation also tend to have high levels of segregated mobility.  That said, there’s still considerable variation in segregated mobility among  cities with similar levels of housing segregation.  For example, New Orleans and Detroit have similar levels of housing segregation, but there’s much more neighborhood mixing, as measured by the segregated mobility index, in New Orleans, than in Detroit. This suggests that some cities do a better job of enabling/promoting inter-group mixing outside one’s local neighborhood than others.  Understanding what factors in cities facilitate greater mixing is a next step for this kind of research.
Jennifer Candipan, Nolan Edward Phillips, Robert Sampson, and Mario Small, “From Residence to Movement: The Nature of Racial Segregation in Everyday Urban Mobility,” Urban Studies, 0042098020978965.
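The black/white dissimilarity index mentioned above has a simple closed form: half the sum, across tracts, of the absolute difference between each tract’s share of the city’s Black population and its share of the White population. A minimal sketch, with invented tract counts:

```python
# Sketch of the classic black/white dissimilarity index.
# Tract population counts below are made up for illustration.

def dissimilarity(tracts):
    """tracts: list of (black_pop, white_pop) tuples, one per tract.
    Returns D in [0, 1]: 0 = no segregation, 1 = complete segregation."""
    B = sum(b for b, w in tracts)  # citywide Black population
    W = sum(w for b, w in tracts)  # citywide White population
    return 0.5 * sum(abs(b / B - w / W) for b, w in tracts)

# A perfectly mixed city scores 0; a perfectly split one scores 1.
mixed = [(100, 200), (50, 100), (200, 400)]   # same ratio in every tract
split = [(300, 0), (0, 500), (250, 0)]        # no tract is shared
print(round(dissimilarity(mixed), 3))  # 0.0
print(round(dissimilarity(split), 3))  # 1.0
```

The study’s segregated mobility index is constructed in an analogous spirit, but from flows between neighborhood types rather than residential counts.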

In the News

KATU television news has a story reporting on the previously secret plans by the Oregon Department of Transportation to build a 10-lane wide freeway at Portland’s Rose Quarter. An Oregon DOT official concedes that the report shows that the agency has a “trust deficit.”

 

The Week Observed, January 8, 2021

What City Observatory this week

1. 2021 is when we have to get real about tackling climate change.  We’ve boiled our analysis of the climate challenge down to four key points:

  1. Pledges alone won’t accomplish anything. Saying you support the Paris Accords and plan to emit much less greenhouse gas two or three decades from now doesn’t count for anything without immediate actions now to lower emissions.
  2. We’re falling further and further behind our stated goals of reducing greenhouse gas emissions, principally because since 2014 we’ve been driving more.  That’s true in cities around the country, including our own home, Portland.  Transportation agencies are particularly complicit in this failure, chiefly by rationalizing “pollution as usual” in the guise of climate strategy.
  3. Cities are central to the climate solution: We need to build more great walkable city neighborhoods, and more housing in the great walkable neighborhoods we already have.
  4. Pricing carbon (as well as pricing driving and parking) can go a long way to creating the incentives and leveraging the financial resources and human ingenuity needed to save the planet.

2. If Portland’s going to have a carbon tax, let’s make it apply to the biggest and fastest growing sources of greenhouse gas emissions.  The City of Portland has proposed a $25/ton “healthy climate” fee, which we think is a good idea, but doesn’t go far enough.  The city’s proposal would apply the fee only to about 30 “facilities” around the city that together account for only about 5 percent of the city’s carbon footprint.  In our view, the fee ought to be extended to two of the largest sources of greenhouse gases: travel on the Oregon Department of Transportation’s interstate freeways in Portland, and flights in and out of Portland International Airport. Together, these two sources account for several times more carbon pollution than the three dozen facilities singled out in the city’s proposed ordinance.

3.  A Green New Deal for Portland?  In the wake of the failure at the polls of a multi-billion dollar transportation measure, the region could use new leadership and a new direction for metro Portland’s transportation and climate efforts.  We’re pleased to present a guest commentary from planner and strategist Garlynn Windsong on what the Portland metropolitan area could be doing to seriously address the climate crisis.

Must read

1. Streets before trust.  Alon Levy of Pedestrian Observations has an insightful column about the unfortunate paralysis created by insisting that we make community trust a prerequisite for meaningful changes to the urban environment. In Levy’s view, the way we’re most likely to build trust with dispossessed communities is through tangible actions, not extended, if sympathetic, conversations.

There’s an emerging mentality among left-wing urban planners in the US called “trust before streets.” It’s a terrible idea that should disappear, a culmination of about 50 or 60 years of learned helplessness in the American public sector. . . . The correct way forward is to think in terms of state capacity first, and in particular about using the state to enact tangible change, which includes providing better public transportation and remaking streets to be safer to people who are not driving. Trust follows – in fact, among low-trust people, seeing the state provide meaningful tangible change is what can create trust, and not endless public meetings in which an untrusted state professes its commitment to social justice.

Nothing Levy says denies the real disempowerment that low income and BIPOC communities experience; it acknowledges that actions speak much louder than words. The problem is that too frequently the processes that are used to simulate trust building via public involvement become a substitute for more direct and tangible measures to redress past injustices.

2. Automobile Dependency and Equity.  The irreplaceable Todd Litman has a timely essay looking at the inherent inequity of our automobile-centric transportation system.  One of the key findings:  while owning an automobile can increase a low income household’s access to jobs, on average it ends up costing more in car payments, fuel, insurance, and repairs than it generates in increased income.  As Litman summarizes:

. . .  low-income households that obtained a car were able to work more hours and earn approximately $2,300 more per year, which sounds great, but they spent an additional $4,100 annually on their vehicles, so they ended up with less time and less money overall. For many lower-income people, automobiles are an economic trap: they force people to work harder so they can earn more money so they can pay vehicle expenses to commute to their job, making them worse off overall.

There’s nothing equitable about a transportation system that works well only for those who can afford and operate an expensive private motor vehicle.
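The arithmetic in Litman’s example is worth making explicit; this toy calculation just uses the two figures from the quote above:

```python
# Net effect of car ownership for the low-income households in
# Litman's example: extra earnings minus extra vehicle costs.
extra_earnings = 2_300   # additional annual income enabled by the car
vehicle_costs = 4_100    # annual payments, fuel, insurance, repairs

net = extra_earnings - vehicle_costs
print(net)  # -1800: about $1,800 a year worse off, before counting lost time
```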

3. A big fraction of metro moves appear to be short-term.  A key facet of the “urban flight” stories associated with the pandemic was data on people looking to change metro areas. Chris Salviati, who analyzes apartment web search activity at ApartmentList.com reports that a larger than normal share of movers are looking for short-term leases, suggesting that what movement has been observed may be reversed as the pandemic and recession subside.  For example, there’s been a big uptick in searches for short term leases in places like Honolulu:  Who wouldn’t want to ride out the pandemic there?

According to Apartment List’s analysis of millions of searches, declines in some markets like San Francisco have been driven more by a reduction in in-bound moves than an increase in out-bound activity. It appears that the market is reflecting uncertainty about the immediate economic situation, rather than a permanent, long-term shift in location preferences.

New Knowledge

It’s become increasingly clear in the past decade that induced demand makes freeway widening futile as a congestion reduction strategy.  Building more roads generates more and longer car trips, which show up as an increase in vehicle miles traveled (VMT), air pollution and sprawl. That’s a fact that the existing four-step models used by state highway departments to justify freeway construction and expansion projects simply aren’t designed to incorporate.  An analysis by transportation experts Jamey Volker, Amy Lee and Susan Handy of the UC Davis Institute for Transportation Studies concludes:

. . .  most models do not include all of the feedback loops necessary to represent the secondary effects of capacity expansion. These models were designed to estimate the effect of capacity expansion on travel times for a given population and employment level for the region. . . . Few models feed the estimated travel times back into the trip distribution or trip generation stages of the model, thereby ignoring the possibility that improved travel times will increase the number of trips that residents choose to make or the possibility that they will choose more distant destinations for their trips. Few models feed estimated travel times back into assumptions about the distribution and growth of population and employment that also influence the frequency and length of trips. In short, the models may do an adequate job of accounting for changes in route and shifts in mode, but they underestimate increases in VMT attributable to increases in trip frequencies and lengths that the capacity expansion will induce.

The academic literature on that subject has converged on the idea of unit elasticity, that in dense urban environments, a one-percent increase in highway capacity generates a one percent increase in vehicle travel.  Volker, Lee and Handy have used that analysis to construct an induced travel calculator, calibrated for California freeways, that estimates the additional vehicle miles of travel induced by an expansion of roadway capacity.

The article examines how induced demand has (and usually hasn’t) been addressed in planning for five major highway expansion projects in California.  They find that highway departments usually only address induced demand in response to public comments, rarely apply state-of-the-art modeling to their analysis, and routinely underestimate the effects of induced demand, by as much as an order of magnitude.  If we’re serious about reducing greenhouse gas emissions, highway expansion projects need to be routinely analyzed using models that reflect the best available science, and which explicitly model the impacts of induced demand.

Jamey M. B. Volker, Amy E. Lee, and Susan Handy, “Induced Vehicle Travel in the Environmental Review Process,” Transportation Research Record, Volume 2674, Issue 7 (July 2020), pages 468-479.
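The unit-elasticity finding translates directly into a back-of-the-envelope calculation. The sketch below is a simplification of the idea, not the actual UC Davis calculator, and the lane-mile and VMT figures are hypothetical:

```python
# Induced travel from a capacity expansion, assuming a constant elasticity.
# With unit elasticity, a 1 percent capacity increase induces ~1 percent more VMT.

def induced_vmt(base_vmt, base_lane_miles, added_lane_miles, elasticity=1.0):
    """Estimate the additional annual vehicle miles traveled induced by
    adding lane-miles to an existing network."""
    pct_capacity_increase = added_lane_miles / base_lane_miles
    return base_vmt * elasticity * pct_capacity_increase

# Hypothetical region: 10 billion annual VMT on 5,000 lane-miles,
# and a project that adds 50 lane-miles (a 1 percent expansion).
extra = induced_vmt(10e9, 5_000, 50)
print(extra)  # about 100 million additional vehicle miles per year
```

The point the calculator drives home is that the induced travel scales with the size of the existing travel base, so even a "small" expansion in a big urban network generates a very large VMT increase.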

In the News

Bloomberg CityLab editor David Dudley has a long column, fondly remembering his lengthy pre-pandemic commute to work; in it he quotes our observations about the relative rarity of super-commuting.

 

The Week Observed, March 6, 2020

What City Observatory this week

1. The thickness of the blue line. Robert Putnam popularized the notion of social capital in his book “Bowling Alone,” which he illustrated with a number of indicators of social interconnectedness, like membership in non-profit organizations and clubs, including bowling leagues.  We have our own indices of “anti-social” capital, including the number of security guards per capita, suggesting that we feel like we (and our property) need more protection in some places than others.  This week we’ve computed the number of “cops per capita” in different metro areas.  Some cities have many more (Baltimore and New York are both in the top five) while others have far fewer police per 1,000 residents (Minneapolis, Seattle and Portland).

We have a complete ranking of cops per capita for the 50 largest metro areas, and also find that there’s a strong correlation between the number of security guards in a region (per capita) and the number of police, suggesting that some places really are more concerned about safety and security than others.
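The two calculations involved, per-capita normalization and a simple correlation, are straightforward; the metro figures below are invented for illustration, not the actual City Observatory data:

```python
from math import sqrt

# Invented example data: (police officers, security guards, population).
metros = {
    "Metro A": (9_000, 12_000, 3_000_000),
    "Metro B": (2_500, 3_000, 2_000_000),
    "Metro C": (12_000, 15_000, 4_000_000),
}

# Normalize raw counts to rates per 1,000 residents.
police_rates = [p / pop * 1_000 for p, g, pop in metros.values()]
guard_rates = [g / pop * 1_000 for p, g, pop in metros.values()]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print([round(r, 2) for r in police_rates])          # police per 1,000 residents
print(round(pearson(police_rates, guard_rates), 2))  # close to 1 in this toy data
```

A strong positive coefficient in data like this is what would support the inference that police and private security staffing rise and fall together across metros.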

2. Are Uber & Lyft causing more traffic crashes? A year and a half ago, a research paper suggested a correlation between the advent of ride hailing and increased crash rates.  We questioned the first draft of the paper, and a new version has just come out, which repeats some of the original claims.  We’re skeptical of the case for ride hailing causing more crashes; a better explanation is that the big decline in gas prices–which happened exactly as ride-hailing took off–triggered more driving, which led to an increase in crashes. That’s borne out by the fact that crashes increased even more in rural areas, where ride-hailing is rare.  Finally, the revised paper now has some data on when crashes occur:  the increased rate of crashes occurs both at times when ride-hailing is common (Friday and Saturday nights) and when it is infrequent (weekdays), which suggests that ride-hailing is not the cause of increased crashes.  We still regard the case as “not proven.”

Must read

1. You can’t spell congestion without “con.” Transportation for America has a new report that comprehensively skewers the highway engineer’s conventional wisdom about roads and traffic. Entitled “The Congestion Con:  How more lanes and more money equals more traffic,” the report shows that contrary to common belief, we’ve been building roads faster than population has been growing, but the only result is to generate additional car travel and even more congestion. Our obsession with moving cars faster is at the root of these problems, as the report explains:

Car speeds don’t necessarily tell us anything about whether or not the transportation network is succeeding at connecting as many people as possible to the things they need, as efficiently as possible. Yet a narrow emphasis on vehicle speed and delay underlies all of the regulations, procedures, and cultural norms behind transportation decisions, from the standards engineers use to design roads to the criteria states use to prioritize projects for funding. This leads us to widen freeways reflexively, almost on autopilot, perpetuating the cycle that produces yet more traffic.

The culprit, as City Observatory readers know well, is induced demand, which comes in two waves.  First, in most urban settings, when new capacity becomes available, motorists rapidly change their travel patterns to occupy it–nature abhors a vacuum, and cars fill empty urban road space. Second, and more insidiously, over a period of years, the pattern of land uses becomes progressively more sprawled, as people move to homes further from jobs, and as businesses decentralize. The result:  everyone’s trips become longer and the entire area becomes even more car-dependent. And recurring congestion is then used as an excuse to build even more roads, repeating the cycle at ever larger costs (and scales). It’s a con-game that has hollowed out our cities, lengthened our commutes and menaces our planet.

The Congestion Con is a remarkably comprehensive synthesis of the case against road widening.  It shows that we’ve spent nearly half a trillion dollars on roads, and congestion has only become worse. It explains in detail why engineers’ crude mental models, and their obsession with car speeds, predictably produce strategies that make the problem worse. It dismantles the pseudo-science behind traffic delay measurements that are often used to justify highway widening.  It’s a must, must read.

2. Can hyper-local zoning solve our housing shortage?  Sightline Institute director Alan Durning is continuing his thoughtful and provocative series on the politics of housing, trying to figure a way out of the corner that we’ve painted ourselves into through our dependence on local zoning. The latest installment considers a radical, and in some ways counter-intuitive, solution:  hyper-local zoning.  The idea would be to let the landowners of a single block choose, by some supermajority requirement, to rezone themselves, for example, by increasing the height limit from three stories to six.  The idea is that if everyone on the block can benefit from higher values, they would bear both the costs and the benefits of the action. It’s an interesting thought experiment in how we might change the incentives and rewards in redeveloping neighborhoods.

3. Confused and conflicting definitions of gentrification. Everyone knows what gentrification is, right? In concept, maybe.  But when it comes time to mark down the status of particular neighborhoods, there’s widespread disagreement. Sidewalk Labs’ Eric Jaffe reports on a new paper comparing several different academic studies of gentrification.  It finds that depending on the methodology one uses, you get very different pictures of the extent and pattern of gentrification.  Here are maps of Boston, using the definitions of gentrification from four different studies; red hues indicate gentrifying areas. (All the maps are of Boston; the city names in the upper right hand corner of each panel identify the source of the methodology used to compute gentrification or gentrification risk.)

As Jaffe explains, the four different methods produce very different pictures:

There was very little overlap in terms of gentrification areas, with only seven common census tracts (out of 180 tracts in all of Boston) marked as gentrifying or “at risk” of gentrifying across the four map methods. There was also a very wide range of map coverage: the most conservative map method identified 25 at-risk tracts, while the most lenient identified 119.

This study complements a similar analysis published last year by Rachel Bogardus Drew of Enterprise Community Partners. If we can’t fully agree on what constitutes gentrification, and when, where, and whether it’s happening, it’s difficult to have a useful conversation, and perhaps impossible to reach a well-informed consensus about what to do. Gentrification may be one of those terms that goes from obscurity to meaninglessness with no intervening period of clarity.
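The overlap comparison Jaffe describes is essentially a set intersection across methods; a sketch with made-up tract identifiers (the real Boston comparison found only 7 common tracts out of 180):

```python
# Comparing how much four gentrification definitions agree, treating each
# method's output as a set of census tract IDs. The IDs here are invented.
method_results = {
    "method_1": {"T01", "T02", "T03", "T07"},
    "method_2": {"T02", "T03", "T04", "T07", "T09"},
    "method_3": {"T03", "T05", "T07"},
    "method_4": {"T03", "T06", "T07", "T08"},
}

# Tracts flagged as gentrifying (or at risk) by every method.
common = set.intersection(*method_results.values())
print(sorted(common))  # ['T03', 'T07']

# Range of coverage: most conservative vs. most lenient method.
sizes = sorted(len(s) for s in method_results.values())
print(sizes[0], sizes[-1])  # 3 5
```

Small intersections and wide coverage ranges, as in the Boston maps, are exactly what signal that the definitions are measuring different things.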

New Knowledge

The Young and Restless in Europe.  One of our research interests at City Observatory has been the growing concentration of well-educated young adults in US cities. A new report from the Center for European Reform explores a similar theme for the European Community.

As in the US, there’s a growing correlation between the educational level of the population and the productivity of local economies.  The CER report finds that:

The most important question is: what makes a successful region? With a new regression analysis, we show that high productivity levels in regions are associated with three factors: they are part of – or geographically close to – successful cities; a larger proportion of their workforce are graduates; and their populations are younger. The association of a high share of graduates with productivity levels is also rising over time.

Young, well-educated workers are increasingly concentrating in the capital cities of European countries (which, in general, also tend to be the most populous urban centers in their respective countries). The report concludes:  “Successful city-regions are gobbling up graduates and young people, and this trend seems to be increasing.” The following chart shows the increase in the share of the population with university degrees in each European country (the blue bars) contrasted with the increase in each nation’s capital city (the red squares).  In every case–save Brussels–the capital city has recorded a faster increase in educational attainment than the country as a whole.

And just as in the US, the movement of talent to cities is creating additional demand for housing, with the predictable result that rents are increasing faster in and near large cities than in other places.  The report shows data across several countries, but its most compelling illustration of this effect is data on the change in rents in the UK based on the relative distance from London. The closer one is to London, the more rents have risen since 2010.

This is powerful evidence that the key phenomena we’ve highlighted at City Observatory–the movement of talented young adults to cities, and the collision of increasing demand for urban living with a slowly changing urban housing stock–are producing a kind of shortage of cities that underlies housing affordability issues.

Christian Odendahl, John Springford, Scott Johnson and Jamie Murray, The big European sort? The diverging fortunes of Europe’s regions, Center for European Reform, April 2019.

 

 

The Week Observed, March 13, 2020

What City Observatory this week

Exploding whales and cost overruns. For years, the Oregon Department of Transportation has been pushing a mile-and-a-half long freeway widening project at Portland’s Rose Quarter, telling the Legislature in 2017 that it would cost $450 million.  That number has now ballooned to nearly $800 million, and could easily go over a billion dollars with the cost of making freeway overpasses strong enough to support buildings. Our review of ODOT’s recent large projects shows that overruns aren’t so much a bug as a regular feature, with most large projects being 200 percent or more over budget by the time they’re completed.

Explosive cost-overruns are reminiscent of ODOT’s experience trying to remove a stranded whale carcass from an Oregon beach, an event still ranked as one of the most viewed YouTube videos of all time. Once again, ODOT has miscalculated and it’s raining blubber.

ODOT then: Exploding Whales. ODOT Now: Exploding Budgets

 

Must read

1. Freeway revolts are back.  You may think of freeway revolts–widespread public opposition to building and expanding freeways in urban areas–as a relic of the 1960s or 1970s, but thanks to the dinosaur-like tendencies of state highway agencies, they’re back.  CityLab’s Laura Bliss describes plans to widen freeways in Houston and Portland, and the emergence of citizen efforts to push back, in the name of protecting the climate, health and safety, and urban spaces.  Houston is planning to spend upwards of $7 billion to widen I-45; Portland is looking at $800 million (and possibly more than a billion) to widen I-5.  In both cities, coalitions of citizens and neighborhood groups are fighting these efforts. Houston City Council member Letitia Plummer clearly sees the freeway decision as pivotal to the city’s future:

“This is the moment in Houston’s history where the decisions we make now will affect every single thing we do for the next three to five generations.”

And the idea that Portland, a self-styled environmental leader, would be widening a freeway in the face of climate change is a shocker.

Players in Houston's freeway fight said that they were surprised to hear that the Oregon city is struggling with the same problems. "You'd think in Portland they'd be over it," said Michael Skelly, a local businessman and founder of the Make I-45 Better Coalition.

The institutional inertia of the highway-building bureaucracies, their ability to deftly forget the role their freeways played in decimating cities and neighborhoods decades ago, and their willingness to ignore the threat of climate change mean that bitter freeway battles are likely to play out in more and more places in the years ahead.

2. Privacy, Transparency, and Accuracy challenge big data. The growing avalanche of location-based information gathered by telecommunications companies, financial institutions, and internet providers is a potential treasure trove of information about travel behavior. Alphabet spinoff Replica is selling a sanitized and synthesized version of its location data to cities for use in transportation planning. One of the early adopters is Portland's Metro, which hopes to use the data to plan roads and transit. But the usefulness of the data is clouded by a combination of privacy concerns and the company's secrecy. As Kate Kaye relates at Fast Company, Replica has been far from transparent in explaining where its data comes from. And while the data is anonymized, and is further massaged to create "synthetic" data on trip origins, destinations, and speeds, there are concerns it could be used in ways that violate users' privacy. And because Replica won't share the original data or fully explain how it's modeled, it's difficult or impossible for Metro to independently verify its accuracy: Does it fully reflect trips by every demographic group, and accurately capture trips by non-car modes of travel? Big data could potentially be useful, but not if it comes from a black-box process that makes it impossible to check for incompleteness or bias.

3. Frank talk about climate change and vehicle miles traveled in California. California is leading the nation in thinking seriously about the kinds of changes that will be needed to reduce greenhouse gas emissions. For years, California has regulated vehicle emissions more stringently than the federal government, and is actively promoting vehicle electrification. But despite the hopes of some that EVs will be a magical technical fix, all of the evidence from California suggests that the state will also need to dramatically reduce the number of car miles driven. Air Resources Board Chair Mary Nichols and Transportation Secretary David Kim agreed that while electrification will help, vehicle travel has to be reduced. Kim said:

“Promoting greater use of EVs is clearly a key strategy, but also: reducing VMT and encouraging mode shift. We need to have safe, accessible, affordable, reliable, and frequent ways of traveling. The more people walk, bike, and use shared mobility including transit the better it will be for everyone.”

The state has done its homework on this question. University of California, Davis professor Susan Handy, who's done the math for the state, says that California will need to significantly reduce VMT by a combination of promoting denser development and providing alternatives–transit, walking, cycling–and also by adopting policies that reflect to car drivers the physical, social and environmental costs of driving, through parking pricing and road pricing.  Streetsblog explained:

“It’s pretty clear we will also need to wield a stick,” said Professor Handy. That means making driving less attractive by making it more expensive–“pricing parking, cordon pricing, it all needs to be on the table”–as well as less convenient. One place to start would be replacing parking minimums with parking maximums–that is, instead of requiring the provision of free parking everywhere, capping the number of parking spots that encourage people to drive by giving them a free place to store their car. Another strategy is closing streets to private vehicles, as San Francisco recently did on Market Street. Handy also pointed out that “congestion itself is a deterrent to driving. Clogged roads is an incentive for people to get out of cars.” This is especially true if transit is not stuck in that traffic.

New Knowledge

Social distancing is key to beating the coronavirus:  A tale of two cities.  The novel coronavirus, aka Covid-19, is now a global pandemic.  The US has been caught flat-footed, and we're racing to catch up.  The chief problem now is dealing with "community spread" of the virus. Some of the most insightful work on how to fight a pandemic comes from retrospective research into the Spanish Flu pandemic of 1918.
Researchers compared the progress of the disease in two cities, Philadelphia and St. Louis. The two cities reacted very differently to the outbreak of the flu: Philadelphia allowed large public events, including a parade attended by 200,000 people, to go forward, while St. Louis acted quickly to limit large public gatherings. The net effect was that the death rate in Philadelphia was several times higher than that in St. Louis:
Social distancing in St. Louis radically slowed the rate of the flu's spread through the population, and avoided overwhelming the city's health care system.  Our challenge with the coronavirus is to do everything we can to make sure it follows the dashed line rather than the solid one.
At City Observatory, we're keen to emphasize the role of the civic commons, but this is one of those paradoxical times when close and frequent social contact isn't a good thing. Given our strong tendency to interact with others in our daily lives, it will require considerable effort on all our parts to observe the social distance needed to fight the coronavirus.  We're all in this together.
Richard J. Hatchett, Carter E. Mecher, and Marc Lipsitch, Public health interventions and epidemic intensity during the 1918 influenza pandemic.

In the News

Houston’s Kinder Institute published a version of our commentary on the missing counterfactuals in gentrification research as “Is non-gentrification the real threat to neighborhoods?”


The Week Observed, April 3, 2020

What City Observatory this week

1.  Counting Covid-19 Cases in US Metro Areas.  We've been updating our metro-area tabulations of the number of reported Covid-19 cases on a daily basis. You can find our latest tabulations here.

There’s a mixture of positive and negative developments. Seattle, the metro first hit hard by the pandemic, and among the first to institute social distancing, has seen its rate of new reported cases fall to the lowest level among the nation’s large metro areas–a still high 10 percent daily increase over the past week.

The more troubling developments are in New York, which has by far the nation's highest incidence of reported cases (436 per 100,000 population on March 31, compared to about 47 cases per 100,000 in the median large metro area), and in a number of cities where the rate of newly reported cases is growing faster than in the typical metro.  New Orleans and Detroit have elevated levels of reported cases per capita.  Cases are growing faster than the average large metro in Boston, Philadelphia, Miami and Indianapolis.

You can check back at our home page, www.CityObservatory.org–we’ll produce regular updates of our metropolitan tabulations of reported Covid-19 cases.

2.  An animated view of the spread of the pandemic.  We’ve used our daily metro level tabulations of reported cases per 100,000 population to produce an animated chart showing the growth of the pandemic from March 1 through March 31.  You can view it here:

Must read

Privacy and the public interest in a pandemic.  The New York Times has a provocative article discussing how we balance the personal privacy rights of patients against the public interest in better understanding how to fight a deadly communicable disease. There's wide variation among local health departments in releasing basic aggregated statistical data about the incidence and demographics of those who have contracted the coronavirus. Officials in Santa Clara County, California, for example, refuse to release city-specific data on the number of cases.  Meanwhile, other countries (like Korea) are making pinpointed data on the location of new cases publicly available.  What's the right balance?  As we move from a period of mandatory, nearly universal quarantine to a regime of testing and tracking, we're going to have to reconsider our assumptions about whether privacy should trump releasing data that could help save lives.  For what it's worth, we routinely release pinpoint data on fatalities and injuries from traffic crashes (with personal information removed) to help understand, and try to minimize, traffic deaths.  Should we do any less with a pandemic?

In the News

Urban Milwaukee reported on our analysis of Covid-19 cases by metropolitan area, as did the Milwaukee Journal Sentinel.

Willamette Week reported that the private developers of Portland’s convention center headquarters hotel made a $40 million profit, essentially cashing in on $74 million in public subsidies for the project. City Observatory’s Joe Cortright is quoted as saying the deal was structured to give all of the upside to the private sector, and provided only downside risk for the public sector.

The Week Observed, April 17, 2020

What City Observatory this week

1. Regional Patterns of Covid-19 Incidence.  The pandemic has struck every corner of the nation, but has clearly hit some areas harder than others. We've focused on those metro areas, like New Orleans and New York, that have the highest rates of reported cases per 100,000 population. But stepping back and looking at the national map of metro areas shows that there are some distinct regional patterns to this pandemic. One hotspot is the entire Northeast Corridor, from Washington to Boston.  Six metro areas in the corridor rank among the top 12 metros for incidence of reported Covid cases.  Three Great Lakes cities, Chicago, Detroit and Indianapolis, are also in the top ten. Meanwhile, aside from Seattle, rates in Western metro areas tend to be much lower than elsewhere.

2. Updated Metro Area statistics on reported Covid-19 cases per capita.  We’ve continued the daily updates to our estimates of the number of reported cases per 100,000 population in each of the nation’s 53 largest metro areas.  A key part of our analysis has been tracking the relationship between the incidence of the virus with its rate of growth to show where we’re making progress in fighting the pandemic. Metros in the upper-right hand sector of our matrix have higher prevalence and higher growth; those in the lower left have lower prevalence and lower growth.

3. How useful are Covid-19 case data? According to the New York Times, as of April 14th, there were over 600,000 reported cases of Covid-19 in the US.  But, as we all know, diagnosis of the virus has been hampered by a lack of testing capacity, and there are good reasons to believe that there are many undiagnosed cases.  While that hampers our ability to understand the scale of the pandemic, it's less of a problem for judging its geographic concentration. We take a close look at the correlation between case counts and deaths (which are less influenced by testing capacity), and find that cases are a good proxy for deaths at the metro level.  That gives us confidence that the case data are a useful way of judging the relative severity of the pandemic across metropolitan areas.

Must read

1. The Covid-19 recession will put a big dent in global greenhouse gas emissions.   It's hard to know at this point exactly how big the reductions will be, but we have a range of estimates that ballpark the likely change.  The Carbon Tax Center's Charles Komanoff estimates that the Covid response (and recession) could reduce global carbon emissions by more than 20 percent, enough to lower the atmospheric concentration of CO2 by about 1 part per million compared to what it would otherwise be.


Other estimates suggest a smaller effect–a decline of about 4 percent in greenhouse gas emissions.  Which forecast is correct probably depends on the severity and duration of the coronavirus pandemic and its economic aftermath.

2. Jed Kolko’s statistical analysis of county-level Covid-19 death data.  Indeed economist Jed Kolko has a detailed regression analysis of the correlation between county characteristics and the rate of reported Covid-19 deaths.  He finds that the death rate has been higher in more populous and denser counties, specifically:

Death rates are higher in counties with a higher share of 60-plus, Black, Hispanic, or Asian residents, and in places where March 2020 was colder. Death rates are higher in denser counties and in more populous metros, but also in counties with a lower share of college-educated residents — and education, density, and metro size are all strongly positively correlated.

He reports that much of the effect of density on the observed results is attributable to the large number of cases in New York City and its surrounding counties, but he finds that the death rate still varies along the urban/rural continuum when one excludes New York City.

3. Todd Litman on cities and health. There's a lot of speculation that pandemic-inspired health concerns will propel an exodus from cities. But a careful look at health data shows that cities are conducive to better health.  Todd Litman has, as usual, a methodical summary of the literature and data.  The key takeaway:

In fact, most people are far better off before, during and after a disaster living in an urban area that provides convenient access to essential services and activities than moving to an isolated rural area. Cities are significantly safer and healthier overall, resulting in lower mortality rates and longer lifespans than in rural area . . .  Rural residents have shorter lifespans due to higher rates of cardiovascular, respiratory and kidney diseases, unintentional injuries lung and colorectal cancer, suicide, diabetes, Alzheimer’s disease and birth defects. These urban-rural differences are even greater for poor and minority groups.

New Knowledge

Unless we price them correctly, autonomous vehicles will increase miles driven, pollution and congestion.  There’s a lot of debate about what the effect of autonomous vehicles might be on travel patterns and urban form.  A new report from researchers at the University of California Davis reports that how we price roads and vehicle use will likely have a major impact on the answer to this question.  They used a transportation demand model to test alternative scenarios, and found that the combination of AVs and unpriced roads would trigger an increase in vehicle miles traveled of 11 percent, but that a system of road pricing would actually lead to 7 percent less travel.
Caroline Rodier, Miguel Jaller, Elham Pourrahamani, et al., Automated Vehicles are Expected to Increase Driving and Emissions Without Policy Intervention, March 2020, https://escholarship.org/uc/item/4sf2n6rs

In the News

Portland's KXL radio interviewed City Observatory director Joe Cortright about the likely impacts of the economic downturn triggered by the Covid-19 pandemic.

The Chicago Tribune cited City Observatory’s report “Less in Common” in an article addressing the ways that the pandemic is likely to change our inter-personal relationships.


The Week Observed, April 24, 2020

What City Observatory this week

1. What the Covid-19 Shutdown teaches us about freeways. Everyone knows that speeds are up on urban roadways around the nation because of the stay-at-home orders to fight the pandemic. But there’s a hidden lesson here. In Portland, for example, one of the most regularly congested roadways is  not only moving twice as fast, but is actually carrying more traffic at the peak hour than before the pandemic. The reason? Stay-at-home has worked like a demand management policy to keep the traffic from reaching a “tipping point” where the freeway actually loses capacity.

This shows that actually managing freeways, through policies like congestion pricing–can move more traffic, faster. If highway engineers really cared about congestion, they’d be taking this lesson to heart, and better managing the multi-billion dollar assets they’ve built, rather than wasting billions on unneeded (and environmentally destructive) freeway widening.

2. Is Covid-19 the end of cities? We don't think so. A New York Times story last weekend argued that city living had "lost its allure" and migration away from cities was likely to be accelerated by the pandemic.  We beg to differ on both these points. It's true that cities aren't growing as fast as they were in the middle of the last decade, but as we've pointed out before, this has less to do with diminished allure than with the shortage of housing in cities, and our widespread failure to build enough to meet demand (and keep rents affordable).  So it's not diminished allure so much as unrequited ardor. And as we–and others–have pointed out, simple-minded claims of a connection between density and Covid-19 don't seem to square with the facts. Plenty of extremely dense cities have managed to largely avoid the pandemic, and studies of the US evidence show little reason to believe that urban density is a key cause of its spread.

3. Suburbs aren't immune from Covid-19. As the New York Times story implies, there's a kind of working assumption that if you were to move from a higher-density urban neighborhood to a lower-density suburban one, you could minimize your risk of exposure to the virus. While in the aggregate, per capita reported cases seem to be somewhat higher in central counties than surrounding suburban ones, the difference isn't large.

Our analysis of data for the largest metro areas shows it makes a lot more difference which metro you live in than whether you live in the suburbs or the central county:  there's about a 0.9 correlation between city and suburban Covid-19 rates within large metro areas.  And in the aggregate, the rate of reported cases per capita in suburbs lags about six days behind the comparable rate in central counties.

4. A worthwhile Canadian example. Those who want to blame urban density for vulnerability to pandemics often dismiss the very low rates of infection in highly dense cities like Hong Kong, Taipei, Seoul and Tokyo as an Asian anomaly. But right here in North America, we have a very dense city, with high transit ridership, that has a far below average rate of reported Covid-19 cases:  Vancouver, British Columbia.

We compare Vancouver's experience with its two US neighbors, Seattle and Portland, and find that despite greater density–and a much closer connection to China–Vancouver has the lowest rate of reported cases per capita of the three cities (and Portland's rate, as we've noted, is in the bottom five of all large US metros).

5. The Covid-Corridor?  We've been compiling and closely analyzing data on the incidence of Covid-19 in the nation's 53 largest metro areas.  One alarming pattern is the relatively high incidence of cases in the Northeast Corridor. It's well known that the New York City metro area is the US epicenter of the pandemic, but nearby metros are also suffering disproportionately.

The four metros with the highest number of new cases–New York, Boston, Providence and Philadelphia–are all in the Northeast Corridor.  The corridor also accounts for six of the top eight metros on this measure (adding Hartford and Washington).  Each of these metro areas is reporting new cases per capita at a rate two to three times higher (and in Boston's case, six times higher) than the median large metro area in the US.

Must read

The best visualization dashboard for state-level Covid-19 data.  We've spent a lot of time looking at pandemic data.  There are many different data visualizations out there, but we think the best one for state data is "91-DIVOC."  It was put together by computer science professor Wade Fagen-Ulmschneider of the University of Illinois and uses the county-level database assembled by Johns Hopkins University. It gives you a series of charts that you can easily customize (looking at cases, deaths, and cumulative, weekly and daily data), and it also allows you to highlight and filter the data quickly and easily.  Here's a snapshot showing new daily cases for all 50 states with Oregon highlighted.

Other dashboards contain much of the same information, but it's often hard to pick out particular datapoints in a spaghetti of dozens of lines, and you're often left to simply accept the choices made by the dashboard's designer.  91-DIVOC puts you in control:  it's clean and fast.  Now if only it had a dashboard for metro areas . . .

In the News

Willamette Week wrote about our analysis of the impact of the Covid-19 shutdown on Portland-area traffic in their story, "The Biggest Bottleneck on the West Coast is handling traffic at double the normal rush hour speeds; Covid-19 has shown what could happen if tolls were placed on Portland highways."


The Week Observed, June 12, 2020

What City Observatory did this week

1. Covid-19 rates are spiking in five cities. Stay-at-home policies and social distancing have dramatically slowed the spread of the pandemic in the US, but as many states begin re-opening, there's a concern that the virus could rebound. Looking at the data for the 50 largest US metro areas shows that there's been a noticeable reversal in the general slowing of the virus in five cities: Phoenix, Tucson, Raleigh, Tampa and San Antonio.

While Arizona’s increase is apparent in statewide statistics, the other three cities have accelerating rates of growth that far outpace the statewide average.  We’ll want to pay close attention to what happens in these cities in the next few weeks as a sign of how well we can cope with a second wave of the pandemic.

2. The World Bank’s Sameh Wahba on building inclusive cities after Covid-19. There’s a lot of speculation that the Covid-19 pandemic will undercut the rationale for urban living. But the World Bank’s expert on urban disaster risk management is extremely optimistic that the pandemic will not dim city prospects, and that in fact, the pandemic will be an impetus to building more just places, partly as a way to improve health for all residents, and reduce vulnerability to future viruses.

Cities will continue to attract such footloose population through what they have to offer in terms of livability including quality amenities and services, public spaces and cultural facilities. . . . Cities represent the best of human ingenuity and well-being. That is true today, will be true tomorrow, and well after the pandemic has subsided.

3. Cratering convention centers stick cities with the bill. We're pleased to publish a commentary co-authored by former Seattle Mayor Mike McGinn. For the past couple of decades, cities around the country have been competing against one another for slivers of the convention business, subsidizing convention centers, underwriting huge "headquarters" hotels, and even paying conventions to come to town. With the Covid-19 pandemic, the convention business has seized up entirely, slashing room rates and occupancy in hotels, and wiping out much of the room tax revenue cities depend on to pay subsidies.

Among the worst-hit places is Seattle, which is in the midst of a $1.8 billion expansion of its convention center; unwisely, the city put starting construction ahead of nailing down the financing, and now has a huge hole in the center of town and very little likelihood of being able to sell bonds when room tax revenues have evaporated. It's a real pickle, but it conceals an even deeper problem: the convention center business has been lagging well behind the economy for most of the past two decades.

4. Covid-19 in cities: Segregation, not density. There’s been a knee-jerk tendency since the first outbreak of Covid-19 to blame urban density for the spread of the virus. The density theory has been largely debunked, but there’s another aspect of urban form that’s worth considering.  We look at the connection between racial/ethnic and income segregation and the prevalence of Covid-19.  We’ve known for some time that communities with higher levels of segregation suffer many adverse economic and health effects.  The data also show that metropolitan areas with higher levels of black/white segregation and higher levels of income segregation tend to have higher prevalence of Covid-19 on a per capita basis.

Must read

1. The differential impact of the Covid-19 recession on people of color. PolicyLink has a new report looking in detail at the labor market impacts of the pandemic by race and ethnicity.  Titled "Race, Risk and Workforce Equity in the Coronavirus Economy," the report uses a combination of Internet job posting data and Census occupational data to look at the extent of job loss during the pandemic.  Job losses among workers in occupations classified as "non-essential" have disproportionately fallen on low-income workers.

The report also finds that in a series of occupations, people of color are more likely to be exposed to the virus.  It also finds that job losses in non-essential occupations have been concentrated among people of color.

2. How to respond to the Covid-19 pandemic.  The Center for Community Progress has a new report laying out some clear strategic advice for how cities ought to respond to the pandemic. Written by Alan Mallach (author of one of our favorite books, The Divided City), the report begins by sketching out some educated guesses about key aspects of the Post-Covid world: a prolonged recession, many households behind on rent, much tighter markets for housing finance, and badly strapped state and local governments. Mallach estimates that the shortfall in rent payments could be in the range of $29 to $118 billion this year, with effects cascading from families to landlords to the housing market and wider economy.

Some form of rent relief may be a key to reducing pain and staving off further economic decline. Ultimately, only the federal government has the resources necessary to make a material difference. Assuming we can generate an effective response, Mallach challenges all of us to think more expansively about how we might use this crisis as an opportunity to build better, stronger communities:

Can we see the challenge of recovering after the pandemic as an opportunity to rebuild not the same, but better than we were?

3. Don’t write off downtowns. Seattle Times business columnist Jon Talton is still very bullish about downtowns–at least his downtown, even in the wake of the Covid-19 pandemic.  Cities just have too many economic advantages not to bounce back.  He writes:

Today, downtowns continue to offer unique advantages in efficiency, productivity and innovation. They are the places where “creative friction” happens as ideas are easily shared and serendipitous encounters happen.  They offer public spaces and cultural amenities for all. In Seattle, large numbers of low-income housing units are sustained here, too.  In an era where the greatest challenge is climate change, downtowns served by abundant, frequent and convenient transit are essential to reducing greenhouse gases.

New Knowledge

Generational shifts in homeownership rates. ApartmentList's Rob Warnock carefully dissects homeownership statistics by age, generation, income, education, and race and ethnicity to trace out the shifting patterns of homeownership in the US. The core finding, as we've seen before, is that homeownership rates are declining with each successive generation:

Homeownership in general has been on the decline for at least the last three generations. After adjusting for age, millennials, gen Xers, and baby boomers have all purchased homes at a slower rate than the generation that preceded them.

While that’s been established for some time, this report sheds light on the interesting sub-trends among different demographic groups.  Perhaps not surprisingly, homeownership rates for whites have not declined nearly as much as for other racial and ethnic groups, after controlling for age and generation.

Overall homeownership rates are down far more for the entire population than for any individual racial/ethnic group.  What's driving the decline, then, is the increasing diversity of the US population, and in particular the fact that much of the growth is among groups (Hispanic and Black) who have lower homeownership rates. This suggests that raising homeownership rates (to the extent that is a worthy policy objective) depends on reducing this persistent racial gap.

Rob Warnock, "Homeownership rates by generation: How do Millennials stack up?" ApartmentList.com, March 17, 2020.

In the News

City I/O published its summary of our report, America’s Most Diverse, Mixed Income Neighborhoods.

The Week Observed, June 19, 2020

What City Observatory did this week

1. Youth Movement: Our latest CityReport. America’s urban revival is being powered by the widespread and accelerating movement of well-educated young adults to the densest, most central neighborhoods in large metro areas. Our new report looks at the latest census data and finds that the number of college-educated 25- to 34-year-olds increased in neighborhoods within three miles of the center of the central business district in every one of the nation’s 52 largest metro areas. Not only that, but in four-fifths of these cities, the rate of increase of this key demographic accelerated after 2010, compared to the prior decade. Despite concerns that the pandemic may be dimming urban prospects, real-time data on real estate searches from Zillow and ApartmentList.com confirm that cities are expanding their market share as suburban shares fall.  Our full report has detailed data for each of the nation’s metro areas with 1 million or more population.

2. Covid-19's Lessons for Portland—and other cities. We've witnessed rapid and traumatic change in the past few months, from the coronavirus pandemic to an outpouring of outrage over police violence. In a guest commentary, our friend Ethan Seltzer begins listing the lessons these twin crises pose for our urban future.  While his comments are aimed at Portland, we suspect the issues he touches on will be of interest to urban leaders throughout the nation.

3. CityBeat: Pushing back on the Wall Street Journal. There has been a surge of stories predicting that pandemic fears will provoke an urban exodus. This week, the Wall Street Journal weighed in with its entry, asking whether, once workers could work anywhere, they'd choose to stay in big cities. Like so many such stories, the argument pivots on anecdotes of a New York couple moving to a smaller town in a different state. It's a fair point that mid-career workers with demonstrated expertise might have that option, but it still turns out, especially for young workers, that there's no place like a city to find your way to a career, develop your skills and build a personal and professional network. And work isn't the only reason people choose cities: living in a dense urban environment provides access to more diversity and greater opportunity in the form of social contacts, cultural opportunities and personal interaction.

Must read

1. Tear gas takes aim at dissent.  A new video, "Choking Dissent:  The Truth About Tear Gas," sponsored by Amnesty International and produced by Brooklyn design firm SITU, documents the global use of tear gas as an instrument of repression. The video illustrates the chemical and ballistic properties of these munitions, and shows that far from being non-lethal or less-lethal, the projectiles are routinely used to inflict pain and injure protestors; the gas has a range of adverse health effects, and can even cause death. The documentary combines cell-phone footage taken at protests–like this one in Philadelphia–with a computer-generated image of the surrounding landscape to illustrate how police deployed gas to trap protestors in a particularly vulnerable location.

2. A Bigger City is a Better City.  Alon Levy, transit-expert extraordinaire at Pedestrian Observations has a new essay that unapologetically endorses urban growth and development. Too often, Levy argues, urbanists advance density and development in cities as a necessary evil. That misses the critical point that cities make us better off, making us smarter and more productive, and providing us with opportunities to acquire skills, build networks and enjoy a better life. Levy writes:

Urban development is good

The ability to access more stuff easily is a good thing and there’s a reason both employers and residents pay extra to have it. More and bigger buildings stimulate this kind of access. On the production side, this means thicker social networks for people who work in related industries and can come up with new innovations – this is why the tech industry sticks in San Francisco and environs, and not the bay view or the state of California’s public services. This, in turn, raises wages. On the consumption side, this means more variety in what to buy.  Moreover, this is true down to the neighborhood level. A denser neighborhood has more amenities, because more people is a good thing, because new people stimulate new social events, new consumption, and new opportunities for job access.

It’s a clarion call to champion cities, not shrink from them.

3. Corona Virus won’t kill cities.  Centre for London’s Ben Rogers weighs in with a warning, but also a call for optimism.  The warning is that our initial response to the virus has triggered some anti-urban thinking:

With governments forbidding people from mixing, mayors warning us to avoid public transport and commuters learning they can work from home, you can see why urbanists, city leaders and businesses are worried. It seems all too possible that those who live and work in these cities will vote with their feet – or more likely, their cars. There will be powerful anti-city forces – out-of-town developers, car makers, road builders, oil companies and the champions of conservative, small town, small state values – cheering them on.

But ultimately, Rogers argues, this is exactly the kind of problem that cities have faced in decades and centuries past, and can, given good management, tackle again. Indeed, Rogers thinks that the resilience of cities in the face of this kind of challenge will further accelerate their growth:

The demographic and economic makeup of these cities might change, but people and business will still be jostling for space near the centre.  Cities have always worked particularly well for young people. They flock to them to build up vital social and professional networks, meet their mates and learn how the world works. They are also the ones who will find it easiest to adapt to new ways of moving around and contribute most to online innovation.

New Knowledge

Integration and political affiliation. Attending integrated schools means that white kids are less likely to register as Republicans as adults. Political attitudes are shaped by our families and environments. A new study looks at how differences in school integration shape partisan political identification.

The study, “The Long-run effects of school racial diversity on political identity” by Stephen Billings, Eric Chyn and Kareem Haggag, takes advantage of the implementation of a new school integration plan in Charlotte, North Carolina in the early 2000s.  In place of busing, a federal court ordered a re-drawing of school attendance areas, which meant that kids in some neighborhoods who had been attending segregated schools ended up attending integrated schools.  The authors tracked individual students and observed their pattern of voter registration as adults.  The key finding:  white students who attended integrated schools were significantly less likely to register to vote as Republicans than white students who attended segregated schools.

We find that a 10-percentage point increase in the share of minorities in a white student’s assigned school decreased their likelihood of being registered as a Republican by 12 percent (2 percentage points), and that this impact was not driven by detectable changes in voting registration.

Factors like parental political affiliation also influence a student’s later political affiliation, but attending an integrated school had about one-sixth as much influence on partisan identification as did parental factors.
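The quoted finding mixes relative and absolute changes, which is easy to misread. A quick back-of-the-envelope check, using only the two numbers in the quote (the implied baseline registration rate is our inference, not a figure reported in the paper):

```python
# If a 2 percentage-point drop corresponds to a 12 percent relative
# decline, the implied baseline Republican registration rate among
# these white students is about 16.7 percent (2 / 0.12).
drop_pp = 2.0            # percentage points (from the quote)
relative_decline = 0.12  # 12 percent (from the quote)
implied_baseline_pct = drop_pp / relative_decline  # ~16.7 percent
```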

Stephen B. Billings, Eric Chyn, and  Kareem Haggag, The Long-run effects of school racial diversity on political identity, NBER Working Paper 27302

In the News

CityLab’s Marie Patino reported on the findings of our new report, Youth Movement: Accelerating America’s Urban Renaissance.

The Week Observed, September 25, 2020

What City Observatory did this week

1. Why free parking is one of the most inequitable aspects of our transportation system. There’s a lot of well-founded anger over the inequitable aspects of transportation: the burdens of policing, fare enforcement, and road crashes all fall disproportionately on low income households and communities of color. But a little-recognized and pervasive inequity in transportation is the fact that we charge people vastly more for using transit than we do for parking. Data compiled by Northern Illinois University’s Chris Goodman show that most cities charge their residents 20 or 30 times as much for a monthly transit pass as they do for a monthly street parking permit (when they charge anything at all). That makes no sense from an efficiency or equity standpoint. Those who own cars have higher incomes than those who use (and depend on) transit. Private car storage in the public right of way is the conversion of common wealth to private use without compensation, and those who park on-street deprive other users of the potential use of that space. In contrast, public transit systems provide significant social benefits (less pollution and congestion), and usually have excess capacity. It makes no sense to charge little or nothing for parking, and too much for transit.

2. How to make gentrification worse. Banning new construction is a great way to push up home values and accelerate gentrification. Cities are conflicted and confused about how to protect affordability. “Stop the world – I want to get off” was the title of Anthony Newley’s 1961 musical, but it seems like the core policy vision of a growing number of urban leaders faced with gentrification. An article earlier this month in the Washington Post portrayed building moratoriums put in place in Atlanta and Chicago in order to fend off gentrification. These efforts are certain to backfire. Blocking new development in the face of growing demand for urban space ironically plays into the hands of speculators and flippers by driving up the value of existing housing. In the face of a shortage of great urban spaces, cities need to look for ways to accommodate more people, of all income groups, not simply to block change.

Must read

This week’s must read articles are a reminder of the important political dynamics that surround land use decision-making and the challenges of balancing local interests with broad public concerns like job creation and housing affordability.

1. Why state action is essential to reform local land use laws. Sightline Institute’s Michael Andersen reflects on some of the key policy (and political) lessons from Oregon’s legalization of multi-plex housing in the state’s single family zones. A first key point is that state action helped get the ball rolling: it’s always difficult in a single neighborhood or city to agree to additional development (see below); up-zoning is the classic prisoner’s dilemma. Having a state step in and require every city to allow a range of housing types overcomes this problem. The second point Andersen makes is a subtle and seldom recognized one: creating a bias for action. The biggest obstacle to reform, Andersen notes, is the inertia of the status quo. Once the state sets a deadline for change, it shifts the discussion from whether to do things differently to a more productive question about what change makes the most sense.

2. NIMBYs triumphant in NYC. The New York Times reports that a major redevelopment project at Brooklyn’s Industry City is dead, thanks to local opposition based on concerns about, among other things, gentrification. The proposal involved redeveloping part of Brooklyn’s waterfront and would have created as many as 20,000 new jobs.

3. Ending member privilege. Brandon Fuller ties the demise of the Industry City proposal to a widely shared feature of many local land use processes: “aldermanic privilege.” In cities that elect council members by single member districts, there’s frequently a formal or informal practice of granting individual councilors veto power over land use approvals in “their” districts. A series of studies has shown that cities which have such single member districts systematically allow less housing, and in particular fewer apartments, than cities that don’t have this system. Single member districts escalate the political significance of parochial concerns, and City Councilors are often under tremendous pressure to protect “their” neighborhood from adverse effects. Multiplied across a city, that’s a recipe for NIMBYism.

In the News

City Observatory Director Joe Cortright is quoted in The Oregonian’s insightful article following up on a predicted carmaggedon in Portland.  Half of a major interstate bridge was closed for a week, and despite predictions of gridlock, things were actually fine.

Streetsblog republished our commentary on equity and subsidized private car storage in the public right of way as “It Shouldn’t Cost 31x More To Take Transit Than Park.”


The Week Observed, October 2, 2020

What City Observatory did this week

1. Carmaggedon never comes, Portland edition. It’s a favored myth that any reduction in road capacity will automatically trigger gridlock, and highway engineers regularly inveigh against reallocating road capacity to promote safety or facilitate other users.  But real world experience with abrupt and significant reductions in road capacity shows that traffic declines in response.  Portland just closed half of its key I-5 freeway bridge for a week, and the predicted gridlock and four-mile-long queues simply didn’t occur.  Portland’s Oregonian reported that “forecasts calling for a nightmarish region-wide traffic catastrophe failed to materialize … travel patterns largely followed the normal cycle.”

The reason?  Travelers rapidly change their behavior in response to changes in road capacity, as many road users have choices about when, whether, by what route, and how they travel.  The fact that carmaggedon never comes should remind us of the hollow and self-serving nature of highway engineer forecasts.

2. Gentrification’s big disconnect. We’re pleased to publish this guest commentary from Akron’s Jason Segedy, who stresses the profound disparity between the plight of low income neighborhoods in most of the nation’s cities and exaggerated concerns about the adverse effects of new investment, aka “gentrification.”

Must read

1. Matt Yglesias:  Much more housing. In the wake of World War II, many feared that peace would cause the US to lapse back into the doldrums of the Great Depression.  That fear was a major impetus for big federal subsidies to housing, highways (and sprawl). But the postwar housing boom did help power the economy, and Vox’s Matt Yglesias argues it’s time for another dose of this medicine:

A combination of rental assistance for consumers, capital funding for affordable housing, and regulatory relief for builders of all kinds could unleash a massive boom in new construction, creating countless blue-collar jobs and laying the foundation for a new era of inclusive prosperity.

Yglesias claims that we have a shortage of housing, but in our view, it’s more accurate to say we have a maldistribution of housing:  too much housing in the hands of empty-nesters in suburbs, too little housing in prosperous cities and high-opportunity neighborhoods.  Agree with Yglesias or not, this is a productive debate to be having.

2. A surge in biking, big data from Strava. Many dedicated cyclists use the Strava app to track their trips and measure their performance against other cyclists. With millions of trips in its database, Strava provides one indicator of the growth of cycling in cities across the US. Strava published some of this data, which show a significant uptick in cycling activity during the pandemic. Here are the data for New York City.

Strava has compiled this aggregated data into a new web-based tool called “Metro” that lets planners track cycling data for specific locales.

3. Covid cuts a diagonal slice through cities and suburbs. The inimitable Johnny Sanphillippo of Granola Shotgun has one of his signature photo-laden essays about how commercial and public spaces in cities and suburbs are faring in the wake of the Coronavirus. While many are quick to proclaim the end of cities, Sanphillippo argues that the effects aren’t simple and clear cut, and the pandemic is creating winners and losers in every geography.

There’s a lot of talk on the interwebs these days about the mass exodus from big cities . . . the migration of frightened people seeking safety in the hinterland . . . But the reality is more nuanced than suburbanites or country folks grasp. We aren’t seeing the end of cities and the triumph of the outskirts. Instead we’re experiencing a complex patchwork of winners and losers everywhere.

Parks and other civic spaces in cities are flourishing as people look to social-distance while experiencing the public realm.  In the suburbs, many big box stores, like WalMart and Home Depot, still thrive, but much of the rest of retail, especially the small in-between stores that fill out malls and strip centers, is being decimated.


New Knowledge

Electric cars won’t be enough to fight climate change, and pose their own problems.

One view of climate change–espoused by many highway departments–is that fighting climate change is simply a matter of electrifying the vehicle fleet. Just replace internal combustion engines with battery electric models, and voila, greenhouse gases disappear, with no need to change anyone’s travel patterns or land uses; best of all, highway departments can just keep building more roads, a kind of electric business as usual for them.

The trouble is, converting to an all-electric fleet is a pipedream.  A new study from the University of Toronto tackles the details of what such a transition would entail, and the results are daunting. The authors estimate that fighting climate change would require 90 percent of the vehicle fleet to be electrified by 2050, up from less than three-tenths of one percent of all vehicles today. That’s not plausible: the most optimistic estimates suggest the best we might do by then is reach 50 percent electric vehicles. And the massive ramp-up of electrification has costs of its own.

The biggest one is that we’d need vastly more electricity: the authors estimate that achieving 90 percent electrification would require about 41 percent more electric generation.  Beyond that, that load (with its own peaking and geography) would put a massive strain on the power distribution system and would likely require considerable new infrastructure in both power generation and transmission. In addition, building that many electric cars and their batteries entails significant environmental costs and the mining of scarce materials in many sensitive places.

Alexandre Milovanoff, I. Daniel Posen & Heather L. MacLean, Electrification of light-duty vehicle fleet alone will not meet mitigation targets, Nature Climate Change (2020).

In the News

Strong Towns featured our analysis of the overwrought predictions of Carmaggedon that traffic engineers use to scare the public in its article, “Carpocalypse Never?”


The Week Observed, October 9, 2020

What City Observatory did this week

Let’s fight congestion with a PR campaign.  For decades, when pressed to do something to improve road safety, city and state transportation officials have responded with . . . marketing campaigns. As the federally funded publicity around October’s National Pedestrian Safety Month makes clear, this mostly amounts to shifting blame for dangerous roads to vulnerable road users. We’re told safety is a “shared responsibility.”

Let’s try the costumed super hero approach for congestion.

Our modest proposal suggests that transportation officials ought to apply this same approach to fighting congestion:  A good marketing campaign could make it clear that traffic congestion is a product of lots of poor decisions by travelers, and that we should all take personal responsibility for our contribution to and exposure to traffic congestion. If we used that approach to congestion, the billions we save could be used to dramatically improve safety.

Must read

1. Environmental Justice has to be more than just a procedural step. It’s well known that the nation’s urban freeways sliced through and devastated many communities of color, and that’s prompted requirements that today’s environmental review processes for big transportation projects require a consideration of environmental justice (EJ).  But too often, highway builders treat these EJ requirements as just another check-box to tick off in a largely bureaucratic process. Maryland Delegate Sara Love, who represents a district bisected by freeways, points out that this purely procedural approach to EJ ignores the ongoing damage that roads (and the traffic and pollution they generate) continue to have today.  She writes:

In the 1960s the construction of Interstate 495 cut through the heart of many communities of color and low-income communities without consideration of their existence, their survival or their character and caused them great harm. Here we are, 60 years later and the Hogan administration is doing the same thing with its I-495 highway widening project. . . . The responsibility of the government — and the right thing to do — is not to exacerbate prior injustices, but to reverse them.

Tweeting about a similar dynamic in the Portland metro area, Transit Center’s David Bragdon likens efforts to superficially address environmental justice to asking local residents to design the artwork that will be stenciled on the sides of the napalm bombs the DOT has already determined it will drop on their neighborhood. As long as we treat environmental justice as a procedural step, rather than a measurable, substantive requirement, we’ll get these kinds of results.

2. We’re holding on to cars and trucks longer, and that’s a threat to the environment. One of the key assumptions in many climate strategies is that we’ll start building cleaner vehicles (either electric or lower emission internal combustion cars), and that as consumers replace their existing vehicles, we’ll reduce emissions. Plans that call for all cars to be electric by 2035 or 2040 invariably refer only to newly purchased cars. But consumers are hanging on to their cars longer now than ever, and these older, dirtier cars mean higher levels of pollution, and less impact from new car requirements.  Over the past two decades, the average age of cars on the road has increased by about 30 percent, and since the average is now 11.6 years, many of the cars on the road are more than 15 or 20 years old.  This means that many of the cars sold today will still be on the road in 2035 or 2040.
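To see what that 30 percent increase implies, a quick calculation from the two figures quoted above (the two-decades-ago average is our arithmetic, not a figure from the source):

```python
# If today's average vehicle age is 11.6 years, and that reflects a
# roughly 30 percent increase over the past two decades, the average
# two decades ago would have been about 8.9 years.
avg_age_today = 11.6   # years (from the text)
growth = 0.30          # 30 percent increase (from the text)
implied_age_then = avg_age_today / (1 + growth)  # ~8.9 years
```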

It’s possible we’ll look more and more like Cuba, where 1950s-era Fords and Chevies have been kept on the road indefinitely. Ultimately, we need strategies, like carbon pricing, that discourage the use of high polluting vehicles.

New Knowledge

The real cost of driving. We drive a lot in the US because many of the social and environmental costs associated with driving are shifted to the public at large, rather than paid directly by those who drive. The subsidies are buried in a complex web of taxes and regulations (like parking requirements) that make businesses and homes more expensive in order to keep driving cheap.  We pay for the health and safety costs of driving through a range of general taxes.  How much would it cost to drive if we asked drivers to take responsibility for all the costs their decisions impose on others?

A study from the UK attempts to work out those costs.  The estimates are specific to the UK, but are a good rough indicator of total costs. The bottom line:  the social and environmental costs of driving gasoline powered cars work out to about 11.5 euro cents per kilometer traveled.  Translated into dollars (at 1.18 USD/Euro) and miles (at 0.62 miles/kilometer), that means that the social and environmental costs of driving are about 22 cents per mile.

In the US, of course, gas taxes are just a tiny fraction of that amount:  the federal gas tax is about 18 cents per gallon; the median state has a gas tax of about 30 cents per gallon.  With cars averaging about 20 miles per gallon, that means drivers are paying about 2.4 cents per mile traveled, barely more than 10 percent of the social and environmental costs of driving. In a world where drivers stepped up and took direct responsibility for the costs they impose on others, they’d be paying 8 to 10 times more in road taxes than we currently charge in the US.
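The unit conversions in this section can be checked in a few lines; every input below is a figure quoted above:

```python
# Converting the UK estimate of driving's social/environmental cost
# (11.5 euro cents per km) into US cents per mile.
EUR_CENTS_PER_KM = 11.5
USD_PER_EUR = 1.18
MILES_PER_KM = 0.62

social_cost_cents_per_mile = EUR_CENTS_PER_KM * USD_PER_EUR / MILES_PER_KM
# ~21.9 cents per mile, the "about 22 cents" above

# What US drivers actually pay per mile via gas taxes.
FEDERAL_TAX = 18.0        # cents per gallon
MEDIAN_STATE_TAX = 30.0   # cents per gallon
MPG = 20.0

tax_cents_per_mile = (FEDERAL_TAX + MEDIAN_STATE_TAX) / MPG  # 2.4 cents
share_paid = tax_cents_per_mile / social_cost_cents_per_mile  # ~0.11
```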

David Newbery, Transport Policy for a Post-Covid UK, Cambridge Working Papers in Economics: 2081.  7 August 2020

In the News

GreenBiz republished Joe Cortright’s commentary on the inequity of charging fares for the use of transit while allowing people to use the public right of way for private car storage either for free or at vastly discounted prices.

The Week Observed, October 16, 2020

What City Observatory did this week

1. Covid-19 is now worst in rural areas and red states. Early on in the pandemic, it seemed like everyone attributed the spread of the Coronavirus to big cities and density. It turns out, more than half a year on, that’s not the case. The epidemic is now far worse in the nation’s rural areas, which have 21 percent of new cases but just 14 percent of US population. A broad swath of rural and small-metro America from the Gulf Coast to the Dakotas, the politically red heartland, is now the red zone for the virus.

The gap between urban and rural areas has widened in the past month, and we’re still waiting for the nation’s reporters to start filing stories about rural residents fleeing to the safety of the nation’s big cities, which now have, by far, the lowest new case rates.

2. Equity and Parks. We’re pleased to present a guest commentary from Carol Coletta, President and CEO of Memphis Riverparks.  Her remarks to the International Downtown Association’s 66th annual meeting make a strong connection between vibrant urban spaces, particularly parks, and the kind of social mixing that’s needed to break down the growing divisions in America and broaden empathy.  Downtown parks and public spaces are particularly important because they are the spaces most widely available to people from throughout the community.  As Carol writes:

Successful downtowns increasingly depend on great public space.  And great public space located in a downtown is more likely to be equitable space because of its location, not despite its location.

The pandemic has reminded us of the importance of these public spaces to our daily lives. One hopes that out of this crisis will grow a greater realization of the role they can play in bringing us closer together.

Must read

1. Building more roads won’t help America’s economy.  From “infrastructure week” to presidential campaign shibboleths, the idea that we can revive our economy by building more roads and bridges is a commonplace. While it may have been true in the middle of the 20th Century, it’s flat out wrong today, as Chuck Marohn of Strong Towns argues in a commentary for CNN:

When local governments don’t maintain what they already have, can they credibly justify building more? Instead of pursuing economic growth through system expansion, recovery must be based on a firm commitment to maintaining what has already been built and squeezing higher returns out of these existing infrastructure investments.

2. Improving housing affordability in Boston suburbs by allowing more apartments near transit. An article by a team from Boston and the Brookings Institution makes the case for an upzoning of land near transit stations in the Boston area.  Many of the region’s suburbs allow only single family homes, including in areas adjacent to transit. These highly accessible locations command very high land values, and because one can only build a single housing unit on most lots, the houses are very expensive. Allowing more houses to be built on this expensive, accessible land lowers the land cost of each unit.  The authors estimate that allowing multifamily housing near stations would make the neighborhood affordable to households earning about half the median income of the typical suburb, expanding the opportunity to live in those suburbs to far more Boston metro residents. And while the direct beneficiaries would be middle income households, allowing more suburban housing near transit takes the pressure off urban neighborhoods:

Additionally, a statewide upzoning that allows developers to build in the most affluent, exclusive communities would take some of the pressure off moderate-income neighborhoods such as East Boston and Malden, which have been providing much of the region’s new housing. Allowing new construction in affluent, mostly white communities is one of the most effective ways to reduce the risk of gentrification and displacement in lower-income Black and Latino or Hispanic communities.

What’s true for Boston holds for many other high cost US metro areas:  allowing more housing in accessible, in-demand locations can directly promote affordability in these neighborhoods, and is likely to have spillover benefits for affordability in the rest of the region.

3. Playing politics with transit safety.  The National Association of City Transportation Officials (NACTO) criticizes a new proposal from the Trump Administration to deny funding to “anarchist” cities for efforts to assure transit systems are safe from the Covid-19 virus:

Following instructions from the White House, the Federal Transit Administration (FTA) disqualified transit agencies in New York City, Seattle, and Portland from participating in a new grant program to research methods to slow the spread of coronavirus on buses and trains. This move puts transit operators’ and riders’ safety at risk and sets a dangerous precedent that could undermine future economic recovery efforts.

NACTO was joined in its statement by The Transit Center, the Natural Resources Defense Council and Transportation for America.

New Knowledge

New insight on urban travel patterns. Brookings Institution’s Bass Center for Transformative Placemaking has a fascinating new report on the connections between land use and travel patterns. The report taps data from cell phones and smart devices assembled by Replica to profile travel patterns in six US metropolitan areas:  Birmingham, Chicago, Dallas, Kansas City, Portland and Sacramento.

The study reports that cell phone data show that the average trip taken in these metro areas was about 7.3 miles, and that the typical resident traveled about 21 miles per day. But those averages mask significant variations across metro areas, neighborhoods and individuals.  Some metro areas have shorter trips:  Portland’s average trip is 6.2 miles; Kansas City’s is 8.2 miles.  Trip distances also vary significantly across neighborhoods within metro areas, with urban neighborhoods generally having shorter trips than suburban ones.  And finally, the data show that most trips are actually shorter than the average.

A majority of trips in most metro areas are shorter than 4 miles; the seven-mile average is the product of a relatively small number of much longer trips.
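A handful of long trips pulling the mean well above the typical trip is a classic skewed-distribution pattern. A minimal illustration with made-up trip distances (these numbers are hypothetical, not from the Brookings data):

```python
# A few long trips pull the mean far above the median, so "most trips
# are shorter than the average" -- the pattern described in the report.
import statistics

trips = [1, 2, 2, 3, 3, 4, 5, 40, 50]  # hypothetical trip miles

mean_trip = statistics.mean(trips)      # ~12.2 miles
median_trip = statistics.median(trips)  # 3 miles
```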

This new Brookings study adds to the growing body of work tapping the stream of big data from cell phones and other smart devices to provide new data about transportation behavior.  While it’s tantalizing in its detail, we’ll want to develop more experience with the data to better understand how it relates to other sources of data on travel patterns.

Adie Tomer, Joseph Kane, Jennifer Vey, Connecting people and places: Exploring new measures of travel behavior, October 2020. Anne T. and Robert M. Bass Center for Transformative Placemaking, Brookings Institution.

In the News

Bike Portland called our analysis of the proposed $5 billion transportation bond measure in Portland “a scathing takedown” and a “serious critique” of the measure.

The Portland Mercury also cited our analysis of the proposed Portland Metro transportation bond measure in its election editorial.

The Week Observed, December 18, 2020

What City Observatory did this week

1. Want lower rents?  Build more housing!  A new study from Germany provides more evidence that the fundamentals of economics are alive and well in the housing market. The study looks at how increments to housing supply affect local rents, and finds that a one percent increase in the number of new homes constructed is associated with a 0.4 to 0.7 percent decrease in rents. Even more importantly, the study finds that rents decline across the board, both in the higher priced and lower priced tiers.  A key reason: new market rate construction triggers a chain of moves that ultimately creates vacancies and downward price pressures throughout the market.

The study also provides a quantitative estimate of “how much” new housing is needed to hold rent increases in check. In most German cities, an increase in new construction of between 10 and 20 percent above current levels would push rent inflation to zero.
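The elasticity finding above can be sketched in a few lines. This is illustrative only, using the 0.4 to 0.7 range quoted in this summary; whether a given construction boost pushes rent inflation to zero also depends on each city's baseline rent growth:

```python
# Band of rent changes implied by the study's quoted elasticities:
# a 1 percent increase in new homes built is associated with a
# 0.4 to 0.7 percent decrease in rents.
def rent_change_range(new_construction_increase_pct):
    """Return (largest, smallest) implied percent change in rents."""
    return (-0.7 * new_construction_increase_pct,
            -0.4 * new_construction_increase_pct)

# e.g. a 15 percent boost to construction (the middle of the study's
# 10-20 percent range) implies rents 6 to 10.5 percent lower than
# they would otherwise be.
low, high = rent_change_range(15)
```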

2. Sustainability is about more than electrification. We’re pleased to publish a guest commentary from Kevin DeGood of the Center for American Progress.  In a recent series of tweets, DeGood profiled Soleil, a cutting edge housing development outside Salt Lake City, festooned with solar panels and substantial battery storage.

But the development itself is cut off from the surrounding neighborhoods, and is far from any common non-residential destinations, like shops, cafes and parks. It is car-dependent, with a Walk Score of zero, and much of the site is given over to roadways and car storage. As DeGood points out, that kind of development simply is not sustainable, no matter how many solar panels it has.  Like 1,800-space “LEED certified” parking structures that provide free parking, and net zero homes with three-car garages, auto dependency is the hallmark of un-sustainability.

Must read

1. What will Post-Covid cities look like? The Atlantic’s Derek Thompson looks at the tea leaves for cities in the wake of the pandemic.  We’re all hoping that 2021 will see the widespread distribution of an effective Covid-19 vaccine. Thompson predicts that we’ll see a resurgence in city center office employment, but it will be slowed for a while by our new skill at distance-work. He also predicts that the time we spent at home and in our neighborhoods will lead to a greater interest in promoting “15-minute living.”  Most optimistically, Thompson predicts a boom year for economic growth, once we shake off the pandemic:  the pent up demand for services and experiences coupled with hundreds of billions of forced savings could fuel a strong economic rebound.

2. More highway boondoggles. Bigger and more wasteful.  US PIRG and the Frontier Group have published the sixth volume chronicling wasteful highway projects around the country.  On the cover (shown below) is San Antonio’s Loop 1604 Expansion, which is joined by similarly grandiose and destructive projects in Cincinnati, Birmingham, Chicago, and Charleston, South Carolina.  In every case, states are putting hundreds of millions of dollars into roadways which will aggravate air pollution and climate change by encouraging more driving.  And as the report notes, due to induced demand, they won’t even achieve their stated objective of lowering traffic congestion.

These projects are the poster children of a highway building complex that at last count squandered $26 billion a year on futile and destructive expansion projects. The report and its accompanying website also have updates on the boondoggles—like the I-5 Rose Quarter freeway widening in Portland—and are a great resource for tracking freeway fights across the country.

3. Why electric vehicles aren't a pollution panacea. Many electric vehicle (EV) advocates would like you to believe that electric vehicles will somehow cause auto pollution to disappear, but a new analysis by OECD scientists, reported by Kea Wilson at Streetsblog, is a reminder that even "clean" cars create enormous volumes of fine particles from tire wear and brake linings. These fine particles, including microplastics, are now nearly ubiquitous in the environment and have serious health effects. Recent research from Washington State shows that just one chemical compound in tire residue is responsible for killing adult Coho salmon. Heavy vehicles driven long distances inevitably produce larger amounts of these kinds of pollutants, even if they're powered electrically.

4.  The perils of bi-partisan agreement on infrastructure. The New York Times editorialized fawningly over the prospects of bipartisan agreement on increased infrastructure spending. But the nostalgia for highway building makes no sense in an era of soaring carbon dioxide levels and a global climate crisis. As Robert Liberty observes, even the illustration chosen for this story reveals the problematic nature of the current glib nods to infrastructure:

It is telling that the photo chosen to accompany this article about consensus politics is a gigantic freeway interchange. This is the symbol of a bi-partisan policy that gutted the cities, enabled white flight into segregated suburban sprawl, polluted the air, changed the climate and destroyed natural resources and the diverse, low-income communities where freeways were built. And little appreciated or understood is that many of these freeway projects were big wastes of money compared to other, smarter projects and that they inevitably failed to fix the congestion that their construction amplified. "Bi-partisan" does not equal "good."

We’ve been down this road before—and it’s why we’re facing a climate crisis.  We need a new path, not a repetition of past failures.

New Knowledge

Cities aren’t just about work, they’re about consumption and social interaction, too.  The subtext of much of the rumination about the future of cities in a post-Covid world is the notion that, if technology let us work remotely, no one would want to live in a city. Hence predictions that we’ll all decamp to suburbs or rural areas, now that we’ve all figured out how Zoom works.

As we’ve argued at City Observatory, people value living in cities for many reasons, and access to employment is just one.  A new study from the London School of Economics looks at the comparative importance of work versus social aspects of cities (consumption and leisure) and comes to a surprising conclusion.  Even if all of the productive advantages of urban locations disappeared, their advantages in consumption would still be a powerful force keeping people in cities.

Technically, economists call the benefits of living in dense cities “agglomeration economies” and they come in a couple of different flavors.  Agglomeration economies in production stem from labor market pooling, supplier specialization and knowledge spillovers, making workers and firms in cities more productive and profitable.  The second, and less studied economic advantage of cities comes from their rich array of consumption opportunities, including the number, diversity and proximity of businesses (like restaurants) and their opportunities for social interaction.

Four economists from LSE—Gabriel Ahlfeldt, Fabian Bald, Duncan Roth, and Tobias Seidel—have created a dynamic equilibrium model of urban location to tease out the relative contribution of these two factors. They've used this model to perform a kind of post-Covid thought experiment, looking to see what happens to cities under two different alternatives (one where cities' productive advantages are taken away, and a second in which their consumption advantages are erased). It turns out that both of these alternatives reduce the attractiveness of cities to people and businesses, but the effect of losing the social advantages is far more consequential.

More broadly, the new method the authors have developed for estimating the value of quality of life differentials across places suggests that the traditional approach (which assumes frictionless migration, and which doesn't model divergent and idiosyncratic preferences among population groups) substantially understates the value people attach to quality of life. The implication of this work is that quality of life and local public goods make a considerably larger contribution to well-being than estimated by standard hedonic models. For example: the new method finds that the consumption amenities associated with city size are a bigger factor in increasing welfare (and attracting and holding population) than the production efficiencies of agglomeration.

The authors have a non-technical summary of their work at the LSE US Centre; their full academic paper is available at LSE.

Gabriel M. Ahlfeldt, Fabian Bald, Duncan Roth, and Tobias Seidel, "Quality of life in a dynamic spatial model," Centre for Economic Performance Discussion Paper No. 1736, December 2020.

In the News

Thanks to StreetsblogLA for highlighting our commentary on how our “free” roads are essentially paying people to drive.

 

 

The Week Observed, December 11, 2020

What City Observatory did this week

1. The only reason many people drive is because we pay them to. There's an important insight from recent applications of tolling to urban highways. When asked to pay even a modest amount for using a fast (and expensive) asset, many drivers vote with their feet/wheels and choose other routes or forego driving at all. Case in point: Seattle's new $3 billion SR 99 tunnel under downtown. Tolls cover less than 10 percent of the cost of the project, but as soon as the state started charging tolls, nearly a third of the traffic in the tunnel went away, and about half of those trips didn't divert to other streets; they simply disappeared.

The plain message here is that many people will drive on these expensive new urban roadways only if someone else pays all (or nearly all) of the costs. In effect, that means that much of the traffic (and traffic congestion) we experience is a result of paying people to drive. If they were asked to pay even a tiny fraction of what it costs to provide the roadway, there'd be no congestion.
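The arithmetic behind the tunnel example is simple enough to sketch. In this toy calculation, the base traffic, toll level, and diverted share are hypothetical round numbers, not the actual SR 99 figures:

```python
# Toy illustration of tolling response: a modest toll removes a large
# share of traffic. All figures are hypothetical round numbers, not the
# actual SR 99 tunnel data.
daily_trips_free = 75_000   # daily trips with no toll
toll = 2.0                  # dollars per trip
share_diverted = 1 / 3      # trips that vanish once tolling starts

daily_trips_tolled = daily_trips_free * (1 - share_diverted)
daily_revenue = daily_trips_tolled * toll
annual_revenue = daily_revenue * 365

print(f"Trips after tolling: {daily_trips_tolled:,.0f}")
print(f"Annual toll revenue: ${annual_revenue:,.0f}")
```

Even with these generous made-up numbers, annual toll revenue is a small fraction of the capital cost of a multi-billion dollar project, which is the commentary's point about who really pays for driving.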

2. CityBeat: More anecdotes about urban flight. The number of such stories is waning, but there’s still a strong impulse to spin narratives  proclaiming an “urban exodus” based on fears of the Coronavirus. The latest of these comes from John Burns Real Estate Consulting, who conducted a survey of institutionally managed single family home rentals, and found that 59 percent of new tenants had moved from urban areas. While they judge that to be evidence of city flight, the data are missing some important context; specifically, there’s no indication of whether 59 percent is an increase or decrease from pre-pandemic patterns. Moreover, the urban flight theory makes almost no sense today because—unlike in the Spring—Covid rates are now higher in less dense areas.  The death rate from Covid-19 in recent weeks has been twice as high in the nation’s rural areas as in metro areas. Statistically, leaving the city increases your risk of catching Covid.

Must read

1. Fixing US housing policy. Brookings Institution's Jenny Schuetz has some measured and direct advice for the Biden Administration on how to address housing affordability and availability, and redress lingering wealth disparities. Schuetz points out the failures of promoting homeownership as a wealth building strategy: our subsidies are heavily skewed to higher income households, and for many, homeownership is an inappropriately risky bet. The collapse of housing prices a decade ago disproportionately affected lower income households and communities of color. Schuetz advocates eliminating the mortgage interest deduction entirely (it's now only available to higher income households and is extremely regressive), and putting in place some combination of individual development accounts and baby bonds that would enable all Americans to build wealth, both to have the financial security to weather short-term crises, and to accumulate the savings needed for major investments like education and housing. It's sensible, if understated, advice.

2. Rolling Coal: Illegally disabling pollution control devices results in hundreds of millions of dollars in added health costs. The most visible pollution cheating scandal was Volkswagen's test-beating software, which enabled pollution controls on its diesel vehicles only when they were being tested. In the US, there's widespread DIY cheating on diesel emissions: it's possible to buy chips or hardware that disable the pollution controls on diesel vehicles. Nationally, EPA estimates half a million pickups and SUVs have had their controls disabled. Among US states, California has done the best job, through regulation, inspection and public education, of discouraging cheating. Streetsblog NYC's Charles Komanoff estimates that if other states could equal California's record for compliance, the reduction in diesel emissions would save $10 billion in health costs. Where's the law and order crowd when it comes to following laws that save people's lives?

3. Falling toll revenues are telling us something important about transportation projects. Writing in the Seattle Times, reporter Heidi Groover has a terrific example of data-driven journalism that shines a light on our transportation economics. Groover tracks the traffic levels and associated toll revenues for a series of major transportation projects in Washington State, including the multi-billion dollar Highway 520 floating bridge replacement and the new Highway 99 tunnel bored under downtown Seattle. In short, the financial outlook for several of these projects is grim. Groover writes:

Traffic is down by about half on the Highway 520 bridge and in the Highway 99 tunnel, after steeper drops earlier in the year. Neither the bridge nor the tunnel is expected to meet pre-pandemic budget projections, and toll increases are likely on both routes next year. “We’re really in crisis management mode for the 520,” Deputy Treasurer Jason Richter told state lawmakers Nov. 30.

It's a reminder that toll-based projects are only as financially sound as the projections on which they're based, and it's all too common for states to over-estimate how much revenue they'll collect. When toll revenues fall short, state DOTs start cannibalizing revenue that could otherwise be used for maintenance and safety, or are forced to further increase tolls, which can lead to a kind of financial death spiral for a tolled roadway. The truth is, as we opined this week at City Observatory, many big expensive highway expansion projects are only used because we massively subsidize people to drive on them.

New Knowledge

Travel limitations reduced the spread of the Coronavirus in the Spring. A new study from Philadelphia Federal Reserve Bank economists looks at the effectiveness of reduced travel on the spread of the Coronavirus. Using cell-phone data to track the overall level of travel, and also to focus in on travel to and from more affected counties, Jeffrey Brinkman and Kyle Mangum find that lessened rates of travel in the Spring significantly reduced virus exposure. Their study focused on five large metro areas.

Travel patterns changed in the U.S. during the coronavirus outbreak. People adjusted their travel patterns based on available information about the number of cases locally. Not only did people reduce overall travel but they avoided locations with a prevalence of cases. This significantly decreased exposure to and, in turn, reduced the spread of the virus.

Statistically, they estimate that overall exposure in the U.S. at the end of April was about half what it would have been if people had not reduced their travel and avoided locations with more cases.

While the results show people traveled less (total travel was down about 60 percent) and in particular, people avoided traveling to counties with high infection rates, the study doesn't discern how much of the travel reduction was attributable to edicts like "Stay at Home" orders, and how much was due to voluntary decisions to reduce travel (and self-interested choices to avoid places with high levels of cases). Either way, though, it's clear that reducing travel had the effect of slowing the spread of the virus.

Brinkman, Jeffrey, and Kyle Mangum. “The Geography of Travel Behavior in the Early Phase of the COVID-19 Pandemic,” Federal Reserve Bank of Philadelphia Working Paper, forthcoming.  A pre-publication summary of these results appears in Economic Insights.

In the News

Streetsblog republished our commentary, Climate Hypocrisy in Phoenix, calling out the disconnect between pledging to meet the Paris Climate goals someday in the distant future, while plunking hundreds of millions of dollars into freeway widening.  Sadly, Phoenix is not the only offender.

 

 

The Week Observed, November 6, 2020

What City Observatory did this week

1. Achieving equitable transportation: Reallocate road space and price car travel. New York has recorded a kind of "Miracle on 14th Street." By largely banning through car traffic, it has sped up bus travel times by 15 to 25 percent, with virtually no effect on traffic on adjacent streets. Buses now run faster, attract more passengers and are more efficient. And in effect, New York has priced car travel on 14th Street: you'll pay a $50 fine for driving there, increasing by $50 with each subsequent violation. This miracle illustrates why reallocating and pricing road space is inherently equitable. First, by making buses run faster (and carry more people), it helps all those who can't afford to travel by car, who tend to be lower income.

2. Institutionalized racism in car insurance. It's long been recognized that the real estate and mortgage lending industries did considerable damage to many neighborhoods through the process of red-lining, which dates back to the 1930s. It's now largely illegal to use geographic classifications to effectively discriminate against neighborhoods based on their racial composition. But a very similar practice still persists in automobile insurance. Auto insurance is generally mandatory, and roughly as costly as vehicle fuel, but the amount you pay depends significantly on where you live. A new study from Insurify shows that residents of Black neighborhoods pay considerably more than those in white neighborhoods; so much so that drivers with clean records in Black neighborhoods pay more than bad drivers in white neighborhoods.

Insurance rating schemes have a higher up-charge for neighborhood characteristics than for bad driving: the 5 percent of drivers classified as "aggressive" pay about $350 more than safe drivers; those who live in Black neighborhoods pay about $700 more for their insurance. States could make automobile insurance more equitable by requiring insurers to use larger geographies that correspond to actual transportation markets, rather than using zip codes to effectively price discriminate.

Must read

1. Cities and Restaurants in the Wake of the Covid-19 pandemic.  New York Times economics columnist Eduardo Porter has a wide-ranging, well-informed and provocative analysis of how the pandemic is affecting the restaurant business, and why this is critically linked to the health of city economies. He provides a good overview of the economic literature demonstrating how urban amenities, particularly the opportunities for social interaction provided by bars and restaurants, have been a key to drawing well-educated workers to cities.

The New York Times

The need for social distancing to fight the spread of the Coronavirus has devastated restaurant sales, leading to widespread closures. With the advent of a third wave of infections, and the waning of federal stimulus payments, many restaurants are closing for good. Porter explores how this is likely to affect city economies, and contemplates how long it will likely take to repair the damage.

2. A 15-minute city is going to require rethinking a lot of our policies. Paris Mayor Anne Hidalgo has turned heads locally and around the world with her efforts to lessen urban car use and promote greater livability. Her headline initiative, "the 15-minute city," is getting considerable interest. Seattle's Mike Eliason has an essay at Publicola pointing out that this will require mayors and other city leaders to move beyond simply rhetorical flourishes. It's easy to embrace the consumption convenience and clever illustrations of the 15-minute city messaging.

But realizing this vision will require bolder, broader and faster action. While cities like Seattle are implementing some slow streets, few if any are in commercial hubs. And simply including the 15-minute criterion in the long list of items to be addressed in future rounds of land use planning lacks scale and immediacy. Moreover, as Eliason argues, realizing actual 15-minute living implies a radical decentralization of a range of activities, from health care to retailing, and will probably only be achieved with considerable increases in urban density and, if equity is to be achieved, much more social housing. It's good to have a compelling goal and vision for urban growth, but we need to back it up with this kind of comprehensive thinking.

New Knowledge

Shocks vs. Fundamentals. In the wake of the Covid-19 pandemic there’s widespread speculation about the future of cities. The need for social distancing, coupled with much broader adoption of remote work has led some to speculate that urban locations will either decline or revive slowly.

How do cities respond to shocks like pandemics? The historic record suggests that even severe shocks are mostly transitory, and that once the initial shock has subsided, the underlying fundamentals that shape urban growth (or decline) again predominate. Economist Amine Ouazad looks at two particular examples of severe shocks to city economies: the 2001 terrorist attacks on New York City, and the 1989 Loma Prieta earthquake that damaged the San Francisco Bay Area. In both cases the shocks produced considerable property damage, and for a time changed consumer and investor perceptions of these markets.

Ouazad uses detailed statistics on metropolitan growth over the past few decades to analyze the relative impact of negative shocks—like natural disasters and civil disturbances—and long term fundamentals. He finds no significant impact of either disasters or disturbances on growth. In contrast, fundamentals, like the educational attainment of the population, the industrial composition of the economy, and whether a region suffers from racial segregation, tend to be consistent and significant predictors of future growth.

Housing market data for New York and San Francisco show that shocks do have observable short-run effects, driving up vacancy rates and holding down prices.  But vacancies and prices tend to quickly revert to pre-shock trends and levels.

Ouazad concludes:

. . . over the span of four decades, metropolitan areas are remarkably resilient to shocks – fundamentals rather than short-run shocks drive long-run population trends. Such resilience of urban housing markets suggests that the benefits of agglomeration play a key role in residents’ welfare; sharing, matching, and learning are key motives that explain the desirability of urban living. These benefits have, over the long run, arguably been greater than the negative externalities of agglomeration. High levels of education, a diversified industrial composition, and racially integrated neighborhoods are keys to the resilience of metropolitan areas.

Amine Ouazad, Resilient Urban Housing Markets: Shocks vs. Fundamentals, Center for Interuniversity Research and Analysis on Organizations (CIRANO), Montreal, Cahier Scientifique 2020s-53, October 2020. https://cirano.qc.ca/files/publications/2020s-53.pdf

 

The Week Observed, November 13, 2020

What City Observatory did this week

1. Seven reasons you should be optimistic about cities in a post-pandemic world. There's widespread pessimism about the future of cities. With the pandemic-induced advent of work-at-home, many people reason that soon there won't be any reason to go into the office, or have offices, or even cities. We disagree. Not only are cities about more than just access to jobs, the pandemic is masking (sorry!) a deeper truth about how labor markets work. As long as everyone has to work remotely, no one is at a competitive disadvantage in the workplace; we're all in the same Zoom-limited boat. But as we gradually return to work, even a few days a week, those who are present will have a competitive advantage: better able to contribute to the organization, tap into informal communication, demonstrate commitment, and build networks. Competition is the first of seven "c's" that we think are key reasons to believe that cities will not just survive, but actually flourish again in the post-pandemic world. The full list is as follows:

  • Competition: Zooming in works when everyone has to do it, but if you work remotely while others are in the office, you are at a competitive disadvantage in contributing to and advancing in your work, especially if you are early in your career.
  • Consumption: Cities are about more than work.  They provide us with varied, abundant and diverse experiences and opportunities for social interaction and consumption. 
  • Couples: Young people are drawn to cities because they are  the best place to find life partners.  Once partnered, cities offer more opportunities for both partners to pursue their careers.
  • Careers: Cities are still the best place to find your way in life and build skills, networks and a reputation that enable you to be as successful and fulfilled as possible.
  • Creativity: The serendipitous interaction that happens most and best in cities is what fuels our knowledge-based economy.
  • Camaraderie and Commitment:  Being there matters. Face-to-face in place shows you care, and you’re committed. It’s about being in the room where it happened, not in the Zoom where it happened.
  • Civic commons: Cities are still the place we come together for collective experiences, from concerts and celebrations to rallies and protests. The pandemic has rekindled our awareness of how important public spaces are for enabling us to connect.

2. Why—and where—Metro's $5 billion transportation tax measure failed. Portland voters resoundingly defeated a ballot measure that would have raised about $5 billion from increased payroll taxes to pay for a slate of transportation projects, including an extension of the area's light rail system. Strikingly, the measure failed on the same ballot where voters approved billions in additional taxes for parks, libraries, pre-school, and school construction. We look in detail at the geography of voter sentiment.

 

The measure passed chiefly in close-in urban neighborhoods that also voted for an unsuccessful progressive mayoral challenger. The measure was rejected in the neighborhoods that would have been served by the proposed light rail extension, and was also rejected in East Portland, a relatively low income area where the measure’s proponents argued that pedestrian and safety improvements would redress long-term inequities.

Must read

1. The geography of the 2020 presidential election. The standard 50-state red/blue maps used to illustrate the geography of voter preferences are wildly misleading, principally because they make big, sparsely populated (and mostly red) areas seem much more important than they are (electorally).  The Washington Post has a fine antidote for this kind of visual lie:  a map that shows the results for every county, with a dot corresponding to the size of the electorate in each county.

Mapping election returns by population, rather than by acreage, presents a much more realistic picture of the weight and distribution of political opinion across the landscape.

2. The economic geography of the 2020 presidential election. Our friends at the Brookings Institution have revisited a theme they explored four years ago: How does the distribution of the presidential vote compare to the distribution of Gross Domestic Product (GDP)? Preliminary data show that the counties that voted for Joe Biden and Kamala Harris account for 70 percent of the nation's gross domestic product.

That's an increase from 2016, when the counties voting for the Democratic ticket accounted for about 64 percent of the nation's GDP. The increase reflects both the growing importance of urban economies to national economic output, and also the shift of a couple of key metro areas, like Phoenix, into the blue side of the ledger. But the message is the same: voters in the nation's most productive communities voted Democratic.

3. Rent control in a complex and dynamic market.  The District of Columbia is considering an ordinance to expand its system of rent control to housing built in the last 15 years. (The city currently has rent control for units built before 1976).

The DC Policy Center's Yesim Taylor has a thoughtful analysis of how the expansion of rent control is likely to affect the housing market. Taylor points out that over time, rent control tends to lead to the removal of housing from the rental market, either through condominium conversion, shifts to other uses, or demolition. Historical experience in DC shows that in the years immediately following the imposition of wider rent control, about two-and-a-half percent of previously rented units moved out of the rental pool each year.
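That two-and-a-half percent annual attrition compounds quickly. A minimal sketch (the 2.5 percent rate is the historical DC figure cited above; the 100,000-unit starting stock and the time horizons are hypothetical) shows how the rental pool shrinks:

```python
# Back-of-envelope: compound attrition of rent-controlled units.
# The 2.5% annual loss rate is the historical DC figure discussed above;
# the 100,000-unit starting stock is a hypothetical illustration.
def remaining_units(stock, annual_loss_rate, years):
    """Units still in the rental pool after compounding attrition."""
    return stock * (1 - annual_loss_rate) ** years

stock = 100_000
for years in (5, 10, 20):
    left = remaining_units(stock, 0.025, years)
    print(f"After {years:2d} years: {left:,.0f} units remain "
          f"({left / stock:.0%} of the original pool)")
```

At that pace, more than a fifth of the original rental stock is gone within a decade, which is why Taylor's seemingly small annual figure matters for long-run supply.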

Rent control also has long-term effects on the marketplace: with diminished incentives to maintain housing and limited incentives to build more units, it can lead the overall housing supply to shrink, driving up rents for everyone. In DC, it's likely that expanded rent control will drive up rents in the "shadow" rental market (where homeowners rent out single-family homes or condominiums).

Over time, rent control would also tend to reduce local property tax collections as it drives down the market value of rental housing. On its face, rent control seems like a simple policy, but as Taylor’s analysis shows, it will have widespread, enduring and counter-productive effects on housing affordability.

New Knowledge

A trend to watch: Declining central city apartment rents. At City Observatory, we firmly believe the steady, long-term increase in the rents commanded by central city apartments relative to those in the suburbs is a key indicator of the growing demand for cities (and a signal we're not building enough housing in desirable dense urban neighborhoods). So we keep a close eye on current market indicators of rent.

One of the best sources of data on market trends is ApartmentList. Rob Warnock, an economist for Apartment List, has a new research report comparing city and suburban apartment rent trends in large metro areas for the first nine months of the year. His data show that across the 30 largest markets, apartment rents have continued to soften in central cities, while they have rebounded in suburbs. Since January, suburban rents are up slightly (about half a percent), while rents in their central cities have declined about 5 percent. (It's worth remembering that "principal cities" are usually the single most populous municipality in a metro area, and encompass the entire city limits, not just downtown or even dense urban neighborhoods.)

This pattern holds in nearly all of the 30 markets, where suburban rents have either increased more or decreased less than rents in the central cities they surround. This suggests, in the summer and early fall at least, that urban markets are not growing as robustly as suburbs.  Warnock offers some reasons as to why this may be taking place, partly having to do with the recession, partly to do with pandemic, partly having to do with the mix of apartments in cities compared to suburbs:

There are also differences in the types of apartments available in each type of city; dense urban centers are more likely to contain newer, more-expensive, more-luxurious apartments that are positioned to see more vacancies and steeper rent drops during an economic recession. Meanwhile, suburbs tend to have a greater share of cheaper, lower-density homes that remain in high demand even as renters look to cut costs.

It’s important to note that city apartments still generally command a substantial rent premium compared to suburban apartments, but this divergence from the longer term trend of relatively increasing urban rents bears close watching. It may be, at least for the moment, that apartment supply in cities is no longer being outstripped by demand (deliveries of new apartments, typically years in the making, don’t change as quickly as demand for housing).  And as Warnock notes, some of what we may be observing may reflect a short term movement of renters to ownership spurred by low interest rates.

Warnock’s report has detailed data and interactive charts for each of the 30 largest US metro areas, showing rental trends for central cities and their suburbs from January through September, 2020.

Rob Warnock. “The suburban rent rebound,” ApartmentList.Com, November 10, 2020

In the News

City Observatory's Joe Cortright is quoted in a Portland Business Journal article, "The departed," chronicling the demise of many long-established local restaurants.

 

 

The Week Observed, November 30, 2020

What City Observatory did this week

Black Friday, Cyber Monday, Gridlock Tuesday?  The day after a nation celebrates its socially distanced “Zoom Thanksgiving” we’ll look to see how the pandemic affects the traditional “Black Friday” shopping spree. It seems likely that more retail sales than ever will gravitate to on-line shopping. That’s got many self-styled smart city futurists predicting gridlock from more and more delivery vehicles.

But paradoxically, buying stuff on-line and having it delivered actually reduces vehicle miles traveled, because each mile traveled by a delivery truck wipes out 30 (or more) miles driven by shoppers in their cars. And delivery density increases the more packages Amazon or FedEx or UPS deliver: their trucks travel shorter and shorter distances between deliveries, making them greener and more energy efficient.
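The substitution claim is just arithmetic, and a short sketch makes it concrete. All of the inputs here (route length, packages per route, and the length of an average shopping trip) are hypothetical illustrative numbers, not figures from the commentary:

```python
# Sketch: VMT from one delivery route vs. the shopping trips it replaces.
# All inputs are hypothetical round numbers for illustration only.
route_miles = 50          # one truck's daily delivery route
packages_delivered = 150  # stops served on that route
shopping_trip_miles = 7   # round trip a shopper would otherwise drive

truck_miles_per_package = route_miles / packages_delivered
car_miles_avoided = packages_delivered * shopping_trip_miles

print(f"Truck miles per package: {truck_miles_per_package:.2f}")
print(f"Car miles avoided: {car_miles_avoided}")
print(f"Each truck mile replaces {car_miles_avoided / route_miles:.0f} car miles")
```

With these inputs each truck mile replaces about 21 car miles, and adding more stops to the same route (higher delivery density) pushes the ratio higher, which is the density point above.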

Must read

1. Brookings ideas on climate policy for the Biden Administration. As the presidential transition proceeds, the policy experts at the Brookings Institution are offering their thoughts on what the federal government might do to reinvigorate the nation's climate efforts. We were struck by suggestions from two Brookings scholars, Jenny Schuetz and Adie Tomer. Schuetz made a strong case for reforming land use and making it easier to build more housing in cities, where living is greener. For too long, federal policy has encouraged sprawling development that's only made greenhouse gases increase:

Each year, we see more evidence of the devastating financial and human costs of building homes in the “wrong” places—and yet we continue to do it.

Adie Tomer echoes this diagnosis:

Neighborhoods designed at a human-scale—ones with greater proximity between homes and destinations—lead to far shorter trips. The problem is most of metropolitan America stopped building these neighborhoods long ago.

And, as Tomer notes, subsidizing electric vehicles would just repeat the mistakes of past federal policies, incentivizing more sprawl and long-distance trips. Instead, we need to look for ways to build more human scale neighborhoods.  That, and not a technical fix, is a critical choice for the climate.

2. A transportation parable: Gas lines.  Michael Manville of UCLA takes a historical event, America’s gas lines from the “Energy Crisis” of the early and late 1970s, and deploys it as a parable of why we have traffic congestion, and what we can do to solve it.  For a time in the seventies, America’s energy policy was stymied by the imagined political impossibility of allowing gas prices to rise in the face of global oil shortages. Rather than allow prices to rise, we resorted to official and unofficial rationing, everything from alternate-day purchases, to gallon limits, to simply having long lines at the pump.

Briefly, the reason we have traffic congestion today is the same reason that, back in the 1970s, we had gasoline lines. We fixed the price of gas too low, so people had to line up to get it.  Since 1979, we haven’t had price controls on gas, and when there have been price increases, we haven’t had shortages, because demand quickly adjusted. When we price something correctly, people make different choices and figure out ways to use less (or in the case of traffic congestion, travel at different times). As Manville argues, we should be applying the same lesson to our streets:  If we price peak hour travel, we can eliminate congestion.
3. Transit for all.  Yonah Freemark, famously of Transport Politic and now with the Urban Institute, starts the conversation about what the Biden Administration might do to improve the nation’s transit systems with a back-of-the-envelope set of estimates of what it might cost to increase transit service in all of the nation’s cities to the levels deployed in relatively high performing places.  Freemark uses the National Transit Database to estimate the number of hours of transit service provided per capita in every city, and then works out what it would cost to get every city up to, say, Chicago levels of transit provision.  The short answer is, not that much.  Freemark estimates that it would cost $16.7 billion to provide every city of 100,000 or more with that higher level of transit service.  (Hours of service is a bit of a gloss on the actual quality of service provided, principally because densely populated places get much better service per hour of transit operation than do sprawling ones, but even with that qualification, this is a great way to begin thinking about what we might do to improve transit everywhere.)
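
A back-of-the-envelope estimate of this kind can be sketched roughly as below; the benchmark level, the example cities, and the cost per service hour are all hypothetical placeholders, not figures from Freemark's actual analysis:

```python
# Sketch of a Freemark-style estimate: the cost of raising every city to a
# benchmark level of transit service hours per capita.  All numbers here are
# illustrative placeholders, not figures from the actual analysis.

def cost_to_benchmark(cities, benchmark_hours_per_capita, cost_per_hour):
    """Sum the cost of closing each city's service-hour gap to the benchmark."""
    total = 0.0
    for name, population, hours_per_capita in cities:
        gap = max(0.0, benchmark_hours_per_capita - hours_per_capita)  # no credit for surplus
        total += gap * population * cost_per_hour
    return total

cities = [
    ("City A", 500_000, 0.8),  # below the benchmark: needs more service
    ("City B", 250_000, 1.5),  # already above it: contributes nothing
]
# Hypothetical benchmark of 1.2 annual service hours per capita at $150/hour:
print(round(cost_to_benchmark(cities, 1.2, 150.0)))  # 30000000
```

Note that cities already above the benchmark add nothing to the total, which is why the national price tag in an exercise like this ends up smaller than one might guess.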

 

The Week Observed, October 23, 2020

What City Observatory did this week

1. Now we are six. We marked City Observatory’s sixth birthday this week, and took a few moments to reflect back on the journey, and to thank all those who helped us on our way, and to look forward to the vital role that cities will continue to play in tackling tough national problems.

2. The amazing disappearing urban exodus. The meme that urban density causes or aggravates the Coronavirus, and is leading people to flee cities, is every bit as persistent as the pandemic itself. New data on postal change-of-address filings show that there’s been at best a minor blip in people moving—up just 2 percent since the virus first spread.  In fact, over the past three months for which data are available, the rate of permanent moves is actually lower than it was in the same three months in 2019. Despite the fact that the data show no significant change in migration, press accounts insist on repeating myths about people leaving cities.

Must read

1. Elon Musk’s hyperloop vaporware. The Boring Company’s claims that its hyperloop technology is going to disrupt urban transit are evaporating even before its Las Vegas demonstration project is complete. What was pitched originally as an exciting new automated underground system is now just Teslas in a tunnel, probably with human drivers. Tech Crunch reports that not only will individual vehicles carry five passengers at most, but the logistics of loading passengers will likely slash the system’s actual capacity to as little as a quarter of what Musk’s team claimed.  As Human Transit’s Jarrett Walker always reminds us, these car-based solutions run up against barriers of fundamental geometry that can’t be disrupted.

2. The urban interstate and the damage done (Indianapolis edition).  Across the nation in the 1950s and 1960s, highway engineers demolished vibrant, walkable urban neighborhoods, chiefly to speed suburban car commuters through an increasingly sprawling landscape. Daniel Bradley of WRxx-TV has a terrific essay relating the story of the construction of Interstates 65 and 70 through Indianapolis, the destruction of neighborhoods, and the rending of the urban fabric they caused.  More than 8,000 buildings were demolished and 17,000 persons displaced by the freeways. And the freeways dealt a body-blow to exactly the kind of urban spaces cities are struggling today to re-create:

“They don’t really realize it was a huge network of neighborhoods,” said [Paula] Brooks, an environmental health senior associate with the Hoosier Environmental Council and a Ransom Place neighborhood advocate. “It was a truly mixed-use urban neighborhood, the kind of neighborhood these young urbanists are fantasizing about now. You could get everything you needed. There were grocery stores and dry cleaners. Restaurants and beauty shops. For recreation, people could go to the park, a bowling alley or a skating rink.”

You can tell the same stories for scores of other US cities.  It was a monumentally expensive and unthinking commitment to reshaping the places we live for the convenience of those driving through, and we’re still struggling to overcome the damage today.

3. Traffic safety: Something is 94 percent wrong. Dan Kostelec, writing at StreetsblogNYC, debunks the oft-repeated claim that 94 percent of traffic deaths are due to “human error.” This framing of the causes of “accidents” is a favorite of both highway engineers and autonomous vehicle advocates.  The latter imagine that eliminating humans from driving will, automatically, eliminate nearly all crashes. The highway engineers rely on the claim to deflect responsibility for designing and building roads that create dangerous conditions and that are unforgiving of the likely “errors” they cause. For example, wide straight roads with few pedestrian crossings prompt speeding and give those on foot few options.  The trouble, Kostelec argues, is that the 94 percent number is a fabrication:

Simply put: It’s not true. Crashes are more complex than that and we need to understand all those factors to stand a chance at reducing traffic deaths in the United States.

New Knowledge

Urban density and Covid-19. A recent paper on the timing and severity of the Covid-19 pandemic, and its connection to urban density, dispels the key myths about the Coronavirus. Particularly in the early days of the pandemic, the highest numbers of cases were recorded in large metro areas, like New York City; that observation led many to conclude that urban density itself was a cause (or aggravator) of virus spread.

But this paper from economists at the London School of Economics takes a more nuanced view of the spread of the virus.  The authors look not just at overall death rates from Covid-19, but also at the timing of outbreaks.  Their key insight is that more populous and denser metropolitan areas were more likely to be hit early in the pandemic, and that smaller, less dense, and less populated metro areas, and rural areas, were more likely to be hit later.  The timing of the disease reflects the breadth and depth of connections that large metro areas have with the rest of the country and the world; the vector of the virus is contacts, and large metro areas have more, and more diverse, contacts than smaller metros and rural areas.

Ultimately, though, we are all connected, and the virus—as we’ve noted at City Observatory—has spread virtually everywhere.  As this paper observes, the rate of deaths from the disease, once the virus has reached an area, is essentially no different in higher density areas than lower density ones.  The key chart from the paper is this one, which shows the death rate, per 100,000 population, 45 days after the first death from Covid-19.  Data are for counties, with the death rate shown on the vertical axis, and the county’s population density on the horizontal axis.

The slope of the regression line is almost perfectly flat, suggesting that there is no correlation between county density and death rates 45 days after the first recorded Covid-19 death in a county.

This is a compelling analysis of the data because it offers a plausible explanation for the commonplace observation that the disease was worse first in several big cities, as well as the fact that the disease has, in the past several months, become increasingly prevalent and severe in smaller metros and rural areas.

Felipe Carozzi, Sandro Provenzano, and Sefi Roth, “Urban Density and Covid-19,” CEP Discussion Paper No. 1711, August 2020.

 

In the News

Oregonian columnist Steve Duin cited our analysis of the proposed $5 billion Portland Metro bond measure in his commentary, “Your father’s transportation plan.”  Duin called our analysis “pointed and persuasive,” writing:

Cortright argues the wage tax “is unrelated to transportation, effectively taxing those who use the system least, and subsidizes those who drive and pollute the most.” It “cannibalizes the principle source of funding for transit operations” at a time when TriMet is in crisis.

And even though Metro trumpets its imperative to “reduce greenhouse gas emissions and prepare for a climate-changing future,” Measure 26-218 does virtually nothing, Cortright says, to reduce emissions.

Our analysis also figured prominently in the Oregonian’s coverage of the debate over the ballot measure.

The Week Observed, October 30, 2020

What City Observatory did this week

Equity and Metro’s $5 billion transportation bond. This week, Portland residents are voting on a proposed $5 billion payroll tax/bond measure to fund a range of transportation projects. A favorite talking point of advocates is that the measure advances equity, because it will expand transit to black and brown communities.  But the measure’s single largest project, the $3 billion expansion of light rail, chiefly serves some of the region’s wealthiest and whitest neighborhoods, and has as its destination a tony suburban shopping center.

The bulk of the package essentially subsidizes car travel by paying for projects located in highway corridors. While the measure does some good things, like subsidizing transit fares for low income populations, it may also jeopardize the use of the payroll tax, which for a half century has underwritten transit operations.  A quest to build shiny new capital projects may undercut the essential work of maintaining bus service, which is very much at issue in the post-Covid era. Fraught and subjective claims about “equity” are muddying the discussion of transportation policy. In the absence of a clear definition of what we mean by equity, and specific metrics for measuring progress, the term is clouding public debate rather than advancing it.

Must read

1. Driving down emissions. An important new report from Smart Growth America and Transportation for America puts changing the way we get around front-and-center in our efforts to tackle climate change.  Excessive driving is the key reason for America’s growing greenhouse gas emissions. The report lays out a clear, cleverly illustrated explanation of how we got into this mess: building more and more road capacity has induced more driving, and progressively led to the sprawling of homes, stores, and jobs that has made us more car-dependent.

The focus on transportation as transportation distracts us from a more fundamental truth: we’ve made it uneconomical and illegal to build the kinds of places where people don’t need a car to live:

Yet the deep irony is that there is huge unsated demand for communities where it’s safe and convenient to take transit, walk, and bike to get around, but policy decisions that prohibit building or adding housing to those types of places and require streets designed for cars to move quickly have artificially constrained the supply of these places.

The report lays out five key steps that federal, state and local governments should be keeping foremost in mind, and it also punctures the notion that we can solve our greenhouse gas problems solely with the technical fix of electrifying cars.

  1. Getting onerous government regulations out of the way of providing more homes where people naturally drive less;
  2. Making safety the top priority for street design to encourage more short trips; 
  3. Instituting GHG reduction and less driving as goals of the transportation system; 
  4. Investing heavily in other options for getting around; and 
  5. Prioritizing access to destinations.

2. Los Angeles delays a freeway expansion. Like many state DOTs, the California highway department has advanced plans to widen the 5 and 605 freeways in Southern California. It’s now apparent that the highway widening will lead to the demolition of hundreds of homes, and that has prompted Metro, the region’s transportation planning body, to call a time out. Importantly, the Metro Board directed the highway planners to specifically add a transportation system management/demand management alternative to the project’s environmental impact statement.  Making better use of the existing roadway can lower pollution, lessen costs, and avoid destroying even more of the adjacent neighborhoods. It’s an alternative that should get much higher priority in every city.

3. More institutional racism in transportation: insurance rates. (Hat-tip to Streetsblog USA for highlighting this study). Auto insurance website Insurify has a study looking at differences in rates charged to drivers by race. It concludes that:

Cities and towns with majority black residents experience among the highest quote prices compared to cities of any other racial makeup, regardless of how clean their driving record is. A driver with a clean record living in a majority-black neighborhood pays almost 20 percent more for car insurance on average than a driver living in a majority-white neighborhood who has prior driving offenses.

What’s perhaps most offensive about these disparities is the fact that who your neighbors are has a bigger impact on your insurance rates than how safe you are as a driver.  Drivers with good records in majority-Black neighborhoods pay about $2,100 per year, or $800 more than the $1,300 paid by similarly safe drivers in majority-white, non-Hispanic neighborhoods.  By contrast, as Insurify notes, drivers with clean records pay only about $400 less on average than drivers with tarnished driving histories. It’s a system that puts a bigger premium on where you live than how you drive, and does so in a way that systematically penalizes people of color.
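
The arithmetic behind that comparison, using the rounded dollar figures quoted above, is simple enough to lay out directly:

```python
# Comparing the "neighborhood penalty" with the "driving-record penalty"
# using the rounded Insurify figures quoted in the text above.
clean_record_majority_white = 1_300  # avg annual premium, clean record
clean_record_majority_black = 2_100  # avg annual premium, clean record
record_penalty = 400                 # avg extra paid for a tarnished record

neighborhood_penalty = clean_record_majority_black - clean_record_majority_white
print(neighborhood_penalty)                   # 800
print(neighborhood_penalty / record_penalty)  # 2.0: neighborhood matters twice as much
```

In other words, by these figures, moving between neighborhoods moves your premium about twice as much as the difference between a clean and a tarnished record.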

New Knowledge

Distressed Communities Index.  Which cities and neighborhoods are doing well and which are struggling?  The Economic Innovation Group has a useful new tool that ranks the nation’s communities from healthiest to most troubled on a series of socioeconomic indicators. They’ve created a versatile web-based tool that lets you quickly access data for a range of geographies.

The EIG tool groups communities into five broad categories, from “prosperous” and “comfortable” to “at-risk” and “distressed.”  The categorizations are based on seven indicators, including income, poverty rates, educational attainment, and employment and establishment growth.  The tool allows users to view data for counties, congressional districts and zip codes. The zip code level data are the most illuminating, covering 25,000 zip codes that include 99 percent of the US population.

The report’s findings are summarized in a live map of the nation, and you can drill down to specific geographies. Blue colors are most prosperous, reds are most distressed.

The Distressed Communities Index is a convenient, and transparent tool for quickly visualizing geographic patterns of neighborhood economic performance.  Here, as an example, are two metro scale maps showing Seattle and Cleveland.

Nearly all of Cleveland’s core zip codes are classified as distressed; none of the zip codes in the Seattle area are distressed.  In both cities, most suburbs are classified as either prosperous or comfortable; much of Seattle’s close-in area is likewise classed as prosperous. These data clearly illustrate the different economic geographies of thriving and struggling metro areas.

In the News

Oregon Public Broadcasting‘s Jeff Mapes has an in-depth analysis examining how the Trump Administration’s policies have affected the Oregon economy.  City Observatory’s Joe Cortright is quoted as pointing out that despite claims that tax cuts and deregulation would fuel growth, Oregon’s economy continued on the same trajectory it established during the Obama years, at least until the mishandling of the Coronavirus epidemic produced the sharpest and deepest downturn in a century.

The Week Observed, September 18, 2020

What City Observatory did this week

1. Lived segregation in US cities. Our standard measure of urban segregation, whether people reside in different neighborhoods, doesn’t really capture the way people from different racial and ethnic groups interact in cities on a daily basis. A new paper uses data gathered from cell phone records to look at the extent of mixing and isolation in the nation’s metro areas. It finds we’re generally much less segregated during the middle of the day than we are in the evenings, and that a range of public and commercial spaces, including parks, restaurants and entertainment venues, have much higher integration than residential neighborhoods.

The data also show which cities have the highest—and lowest—levels of “experienced” integration.  We have rankings for the largest US metro areas; Portland, Seattle and Minneapolis have the least “lived” segregation; Milwaukee, Detroit and St. Louis have the highest levels of  segregation by this measure.  The patterns of experienced segregation tend to parallel patterns of residential segregation.

2. Vancouver’s Columbian:  “Suburban commuters matter.”  After more than a year of working with the Oregon Department of Transportation on the proposed $800 million I-5 Rose Quarter freeway-widening project, a key Black community group pulled out of the project, prompting many local political leaders to withdraw their support. Who jumped in to support the project? The editors of the Vancouver Columbian, a suburban newspaper, who finally said the quiet part out loud:  The real beneficiaries of the freeway widening are not local residents, but car commuters from Washington State, who will now be further inconvenienced because the road won’t be widened to allow even more cars to drive faster through this historically African-American community.

Freeways, and freeway widening, privilege wealthier, whiter suburban commuters over urban residents, who are disproportionately lower income, people of color, and who travel by means other than cars. Plus, the $800 million Rose Quarter widening is just a stalking horse for reviving an even larger multi-billion dollar freeway project to add more car capacity across the Columbia River. The racism of freeways isn’t some dim, historic vestige; it’s a current and continuing reality in urban areas.

3. Covid is now a rural, red-state pandemic.  Early on, people were quick to equate the Covid-19 pandemic with cities, and to incorrectly attribute the virus’s spread to density. That’s been disproven again and again.  It’s now the case that the pandemic is worse in rural areas than in large cities.  We compile analyses from three different sources (Brookings’ Bill Frey, Indeed’s Jed Kolko, and the Daily Yonder) showing how rural and red the pandemic has become.  What’s striking now is that, regardless of city size or urban/rural status, the pandemic is worse in red states than blue ones.  What we’re waiting for now is a swarm of media stories chronicling the exodus of people from small towns and red states to avoid the virus.

4. City Beat: No, people are not fleeing Portland for its suburbs. It’s becoming a favorite storyline in the media: frightened by the prospect of the Coronavirus, homebuyers are supposedly fleeing cities for the suburbs or surrounding hinterlands. The fact that the virus has spread to rural and small town America puts the lie to the premise, but it’s still being repeated, most recently by the Portland Oregonian.  Stories like these are always documented with anecdotes of one or a handful of households moving to the suburbs, or further afield, but since people are always moving in (and out of) cities, this is meaningless. The data show that urban markets, including Portland, remain robust.

Must read

1. SUVs and Climate Change.  The sport utility vehicle (SUV) has become a symbol of climate change, but according to new data reported in the Guardian, it’s more than a symbol; it’s a tangible cause of increasing greenhouse gas emissions. Data from the International Energy Agency show that SUVs added more than 500 million metric tons per year to carbon emissions over the eight-year period from 2010 to 2018, making them the second largest source of the increase in greenhouse gases.

Part of the growth in SUV emissions reflects the substitution of SUVs for standard cars in the world’s vehicle fleets; emissions from non-SUV cars actually decreased, but were overwhelmed by the increase in SUV emissions.

2. An international perspective on America’s housing affordability problems. It may seem like the price of housing is a purely domestic concern, but an article by political scientists Sam Winter-Levy and Bryan Schonfeld in the journal Foreign Affairs reminds us that the rest of the world has mostly figured out how to deal with housing. The US (and the UK) are virtually alone in the world in allowing NIMBYs virtually unlimited sway over whether and where to build new housing. Japan, which had a housing crisis two decades ago, implemented national policies to legalize denser housing almost everywhere.  As Foreign Affairs explains:

. . . the national government passed a series of reforms assuming control over land use and reducing the ability of local opponents to block new housing construction. The government then eased planning restrictions in Tokyo, allowing taller and denser buildings. Since then, the city’s rate of housing construction has risen by 30 percent. In 2014, construction started on more new houses in Tokyo than in the entire state of California or in all of England. While the average price of a home in San Francisco and London increased by 231 percent and 441 percent, respectively, between 1995 and 2015, in Tokyo it remained essentially unchanged.

Underlying the US problem is the notion that housing ought to be a great investment, which, as we’ve pointed out at City Observatory, is a recipe for policies and politics that automatically lead to higher home prices and less affordability. Other nations have lower rates of homeownership and much better housing affordability. Perhaps, as the authors argue, it’s time for a little less American exceptionalism.

3. Driving less, dying more.  The Coronavirus has created a powerful transportation safety experiment.  The volume of traffic, measured by vehicle miles traveled, has fallen nearly 20 percent.  But total traffic fatalities, according to data just released for the first six months of 2020, have barely budged.  That means that the per mile rate of fatalities has increased significantly:

The National Safety Council (NSC) estimate of total motor-vehicle deaths in the first six months of 2020 is 18,300, up 1% from the preliminary 2019 estimate of 18,200. . . .  The estimated mileage death rate for the first six months is 1.37 deaths per 100 million vehicle miles traveled, up 20% from the revised 2019 rate of 1.14. Estimated vehicle miles traveled for the first six months of 2020 indicate nearly a 17% decrease compared to last year from 1,595 billion to 1,331 billion.

The most likely reason:  Fewer cars on the road prompts more speeding and reckless driving by those remaining. It’s a reminder that some level of traffic congestion actually saves lives by slowing vehicles and reducing the number and severity of crashes.
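
The NSC’s mileage death rates can be reproduced directly from the figures in the block quote above:

```python
# Reproducing the NSC mileage death rates from the figures in the quote:
# deaths per 100 million vehicle miles traveled (VMT).
deaths_2020 = 18_300     # first six months of 2020
vmt_2020 = 1_331 * 1e9   # 1,331 billion miles
deaths_2019 = 18_200     # first six months of 2019
vmt_2019 = 1_595 * 1e9   # 1,595 billion miles

rate_2020 = deaths_2020 / vmt_2020 * 1e8
rate_2019 = deaths_2019 / vmt_2019 * 1e8
print(round(rate_2020, 2))  # 1.37 -- matches the NSC estimate
print(round(rate_2019, 2))  # 1.14 -- matches the revised 2019 rate
print(round((rate_2020 / rate_2019 - 1) * 100))  # 20 (percent increase)
```

The arithmetic makes the point of the item concrete: deaths held roughly flat while miles driven collapsed, so the per-mile rate jumped about 20 percent.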

New Knowledge

Road safety:  Driver error or bad design? You’ll often hear claims that most crashes, and the associated injuries and fatalities, should be chalked up to “driver error.” The implication is that the cause is essentially random and uncontrollable.  But what if drivers are actually responding to visual cues in the built environment about how fast to drive, and how carefully to look out for other road users?  If such cues are important, then crashes aren’t random or unavoidable, or even necessarily the driver’s fault, but are really attributable to bad road design.

A new study published in the Journal of Planning Education and Research addresses exactly this issue.  The full study itself is gated here, but Beth Osborne has a terrific summary of the research published at the State Smart Transportation Institute.

. . . the design of the built environment primes drivers’ expectations about potential conflicts or hazards and to what risks they should be prepared to respond. The embedded conditions in the built environment can cause “inattentional blindness”: situations in which drivers look but still do not see a conflict type that the design doesn’t prepare the driver to watch out for (usually pedestrians, bicyclists or motorcyclists).

The key point here is that we need to evaluate traffic safety from a cognitive and behavioral perspective, recognizing that context and the physical environment condition the way drivers act. Many road design standards that provide more room for error (wider lanes, clear zones, wide rights-of-way, and building setbacks) also send drivers unmistakable cues that they can and should drive faster, and that this space is exclusively for their use.  Other design choices may prompt safer behavior: we all instinctively tend to drive more slowly and carefully on narrow roads, with limited visibility and many obvious indications that other users are present.  As places work to achieve Vision Zero, it’s worth keeping in mind that rethinking the behavioral cues given to drivers may be more effective in protecting vulnerable road users than added doses of infrastructure.

Eric Dumbaugh, Dibakar Saha & Louis Merlin, “Toward Safe Systems: Traffic Safety, Cognition, and the Built Environment,” Journal of Planning Education and Research. August 2020. doi:10.1177/0739456X20931915

In the News

Flossin Magazine published Joe Cortright’s essay “Restorative Justice: Will ODOT’s $800 million freeway widening restore Albina, or just repeat past injustices?”

Thanks to Greater Greater Washington for directing its readers to our commentary pointing out that much of what is called “pedestrian infrastructure” is needed only because of the prevalence of cars, and is really infrastructure that benefits car travelers, not people walking.

The Week Observed, September 11, 2020

What City Observatory did this week

1.  Manufacturing consent for highway widening. In the early days of freeway battles, state highway departments were power blind and tone-deaf, and citizen activists often triumphed in the court of public opinion.  In the past several decades though, highway builders have become much more adept at manipulating the process of “citizen involvement,” to project an image of widespread support and to stifle, outlast and ignore opposition.

2. Why one long-time Portland transit leader is opposing Metro’s $5 billion transportation plan.  We’re pleased to publish a guest commentary from GB Arrington, one of the veteran architects of Portland’s transit system. He argues that Portland Metro’s proposed bond measure, to be voted on in November, is a wasteful and ineffective investment, guided largely by political considerations rather than sound planning objectives.  Arrington writes:

TriMet and Metro lost sight of their job to deliver livability to the region’s citizens. Going along with political backslapping and road building is not what made the region stand out as a global model of livability. Pursuing political business as usual is a sure-fired recipe for making Portland more like sprawling auto-oriented Houston than the Portland we have long worked and aspired to be.

3. Is there anything smart about “smart cities?” One of the most popular urbanist buzzwords is “smart cities,” a coinage and a line of thought that implies that there’s no urban problem that can’t be tackled by some combination of technology and big data. Color us skeptical.  Policy makers routinely ignore the unambiguous “small data” that’s already at hand, and some of our worst problems stem from a lack of imagination and ambition, rather than a lack of data.  To the contrary, “big data” is often wielded by the existing institutions (like state highway departments) to create an illusion of scientific validity for projects that make our transportation problems worse (and our cities less livable).

Glib and self-serving claims about new technologies (we’re looking at you Elon) often become an excuse for policy makers to ignore tangible if prosaic challenges (like running buses and legalizing housing) in favor of hopes that computer renderings will magically materialize and obviate these problems.

Must read

1. How cities come back from disaster. In the midst of a pandemic and widespread protests about police brutality and racism, it’s hard to see a clear way forward. Fear and pessimism are leading many to predict (falsely, in our view; see above) an urban exodus. Writing at The Atlantic, Derek Thompson takes a longer and more optimistic view. Cities have experienced cataclysms and disasters before, and rather than extinguishing urban settlement, they’ve generally prompted innovation and change. The epidemics of the 19th Century led to much improved sanitation, urban parks, and more robust public health systems. Thompson argues that Covid-19 will be another catalyst for innovation. He has his own theories about what those innovations will be, and we might even end up making cities even better than they were before the pandemic:

Altogether, this is a vision of a 21st-century city remade with public health in mind, achieving the neat trick of being both more populated and more capacious. An urban world with half as many cars would be a triumph. Indoor office and retail space would become less valuable, outdoor space would become more essential, and city streets would be reclaimed by the people.

2. Still missing in Minneapolis: Missing Middle Housing.  Last year, with great fanfare in housing policy wonkdom, Minneapolis “abolished” single family zoning and allowed the construction of triplexes in most areas zoned for single houses. Nearly a year on, CityPages reports the fruit of the policy breakthrough:  three new triplexes. So far, at least, not the housing supply avalanche that proponents hoped and opponents feared.  Writing at Strong Towns, Daniel Herriges downplays the shortfall, reminding us that the rezone was “necessary, but not sufficient” to get more housing. It’s still the case that triplexes are subject to the same setback requirements and lot coverage limits that applied to single family homes, which may leave precious little room to squeeze in a triplex or even a duplex.

Still missing in Minneapolis.

Portland’s new Residential Infill Project anticipated these challenges, creating incentives for building more units. The Minneapolis experience is a reminder that single-family zoning is much more than the headline in the zoning code: there are myriad details, from height limits, to setbacks, to floor-area ratios, to parking requirements, to design review that effectively determine what can be built.

New Knowledge

The power and persistence of long-distance migration in driving economic prosperity. Our studies at City Observatory have stressed the important role that talented young people moving to cities play in regional economic success. A new study that looks backward at patterns of migration within the US confirms that places that attract new residents, particularly from long distances, experience significant and enduring economic gains. Viola von Berlepsch and Andrés Rodríguez-Pose studied the patterns of internal migration in the US in the late 19th Century and traced out the economic performance of regions through the 20th Century. They find that counties that attracted high numbers of migrants, particularly from far away, tended to have stronger economic performance than counties that attracted relatively few migrants from distant locations. Specifically, levels of per capita income in 2010 are positively correlated with the distance that migrants traveled to a county in 1880, controlling for a wide range of other factors.

The idea that openness to newcomers can catalyze growth is not a new one; what’s striking about this study is how the patterns of openness a century ago are still reflected in the variations in economic performance today.

. . . internal migrants having crossed state-lines between their birth state and destination exert a significant and positive long-term impact on the economic performance of the receiving regions. They leave a trace which is still evident more than 100 years after the settlement took place. Counties that attracted a large share of domestic migrants around the turn of the 20th century have become and remain more prosperous in 2010 than those largely bypassed by internal out-of-state migration streams . . . the bigger the distance travelled, the greater the positive long-term economic legacy. Counties that drew a large number of long-distance migrants around the turn of the 20th century have been more dynamic over the next century.

von Berlepsch, Viola and Rodríguez-Pose, Andrés (2019). The missing ingredient: distance, internal migration and its long-term economic impact in the United States. Journal of Ethnic and Migration Studies.

In the News

Thanks to Streetsblog and Strong Towns for republishing our essay, “Why most pedestrian infrastructure is really car infrastructure.”  The Overhead Wire reports that it was their most read story on September 8.

The Week Observed, September 4, 2020

What City Observatory did this week

Why most pedestrian infrastructure is really car infrastructure. One of the most misleading terms you’ll hear in transportation is “multi-modal,” which in practice means a highway for cars and trucks, with largely decorative provisions for pedestrians and bicyclists. We look at a couple of examples of pedestrian overpasses in Florida and Georgia, which, while ostensibly built for the benefit of people walking, are really only necessary because of a car-dominated environment. In addition, unlike highway “improvements,” pedestrian overpasses almost always mean longer, more arduous journeys for pedestrians, and serve mainly to free drivers from having to slow down or watch out for non-vehicular travel.

Pedestrian infrastructure?

Pedestrians don’t need infrastructure so much as they need vibrant urban places, with closely spaced destinations, something that’s both rare and illegal in most of the nation’s suburban areas. Expensive pedestrian crossings like these are at best remedial measures, and don’t do what’s needed to make places truly walkable.

Must read

1. Facebook:  I’ll take Manhattan. Despite all the doom-saying about the future of work in dense cities, Facebook is moving forward with plans to lease 700,000  square feet of office space in the Farley Post Office Building near Penn Station in New York City.  The reason? It’s all about access to talent.

The RealDeal reports that even in software, that quintessential computer-using business, innovation and productivity are all about close, face-to-face human interaction:

“So much of what we do is collaborative,” Rosenthal said. “[Software is] like writing a book together where all the plots have to connect and make sense and there are thousands of authors. It’s really hard to do if you’re not co-located in the same space and it’s important to even be able to see each other in the same space.”

2. Covid is now worse in rural areas and smaller metros than in big cities. Indeed economist Jed Kolko has been tracking county-level Covid-19 case data by his urban/rural county typology. While the pandemic was famously worse in some of the nation’s big cities—notably New York—in the spring, with the second wave of the virus, the pattern has reversed. It’s now the case that rates of reported cases are higher in smaller metros and rural areas than in the densest urban counties of large metro areas.

And in the past month, there’s been a notable further divergence in reported cases per capita, with rural and small metro areas collectively seeing the largest uptick. The highest and fastest rising rates are in non-metropolitan areas and the smallest metros; the lowest rates are now in the central counties of metros with a million or more population. While there are important qualifications for any urban/rural county classification system, these data show that the notion that vulnerability to Covid-19 was primarily a product of urban density is, shall we say, a “rural myth.”

In the News

Writing at Michigan Future, Lou Glazer makes a strong case for the future of downtowns, quoting our “Youth Movement” CityReport:  “High density, high amenity neighborhoods are not going away.”

BikePortland directed its readers to our commentary, “The case against Metro’s $5 billion transportation bond.”

The Week Observed, August 28, 2020

What City Observatory did this week

The case against Metro’s $5 billion transportation bond. Portland’s regional government, Metro, is asking voters to approve a $5 billion package of transportation improvements, to be funded by borrowing against an increase in payroll taxes. We take a close look at the proposal, and conclude that it’s a bad deal on a number of grounds: It emphasizes investments in corridors, mostly highway corridors, rather than on bolstering the dense, walkable centers the region’s transportation plan has emphasized for years.

In practice, this is what a “corridor” plan for cycling and walking can look like.

By the agency’s own estimates, it does virtually nothing to reduce greenhouse gas emissions, even though increased driving is the single biggest source of carbon pollution in Portland.  By paying for road improvements with a payroll tax, rather than a user fee, like the gas tax, it implicitly subsidizes driving, with the result that the proposal both insulates drivers from facing the costs of their decisions, and leads to more driving, pollution and crashes. We estimate that the subsidy to driving offsets by a factor of 50 the gains Metro claims from lower carbon emissions from its $5 billion investment package.

Must read

1. Jerry Seinfeld explains why New York (and other cities) aren’t “over.” A New York comedy club owner has picked up and moved to Florida, but not before leaving a trash-talking note for his former city. Jerry Seinfeld takes offense, and explains why leaving the city says more about the club owner than the city. New York and other cities will thrive, Seinfeld argues, for the same reasons they always have. We won’t all be decamping to the suburbs or rural America and “zooming it in.”

There’s some other stupid thing in the article about “bandwidth” and how New York is over because everybody will “remote everything.” Guess what: Everyone hates to do this. Everyone. Hates.

You know why? There’s no energy.

Energy, attitude and personality cannot be “remoted” through even the best fiber optic lines. That’s the whole reason many of us moved to New York in the first place.

You ever wonder why Silicon Valley even exists? I have always wondered, why do these people all live and work in that location? They have all this insane technology; why don’t they all just spread out wherever they want to be and connect with their devices? Because it doesn’t work, that’s why.

Real, live, inspiring human energy exists when we coagulate together in crazy places like New York City.

It’s an entertaining but well informed argument about why cities matter, and why they’ll persist even in a post-Covid world.

2. How biased traffic models enshrine highway expansion. Vice’s Aaron Gordon has a detailed article illustrating how state highway departments have used biased transportation modeling to justify ever wider roads and freeways.  Drawing on examples like Louisville’s massively overbuilt Ohio River Bridges project, Gordon shows that state highway agencies developed multiple, conflicting forecasts, exaggerated ones to justify building the project, more accurate ones to sell bonds.

As Gordon explains, the problems with the “four-step” models have been well-documented for decades, but they continue to be used because they’re structured in a way that produces just the answers that state DOTs want: build, baby, build. As we’ve noted at City Observatory, these models routinely overestimate future traffic and overstate congestion in a “no-build” world, and at best they become self-fulfilling prophecies, where building additional road capacity induces additional travel and congestion that wouldn’t have occurred if the project hadn’t been built.

As Transportation for America’s Beth Osborne trenchantly observes, highway builders use the models to conceal who’s making the decision:

“This is the fundamental problem with transportation modeling and the way it’s used. We think the model is giving us the answer. That’s irresponsible. Nothing gives us the answer. We give us the answer.”

3. Parking’s dominance of Kansas City. When you look closely, it’s astonishing how much of the urban realm is dedicated to private car storage. Writing at StrongTowns, Daniel Herriges lays out the key findings from an Urban Three analysis of land use patterns in Kansas City. This map tells much of the story—everything in red is dedicated to parking; all the arrows point to parking structures.

Between parking and the roadway itself (with much of that right-of-way dedicated to parking as well), moving and storing cars is the city’s dominant land use. On a per capita basis the city devotes more land to parking than it does to buildings. How did Kansas City end up with so much parking, you ask?  For the answer, you’ll want to read all five installments of Herriges’ series on Kansas City, which chronicles how abandoning streetcars and embracing freeways (and sprawl) led to today’s fiscally challenged “Asphalt City.”

New Knowledge

A carbon price tied to net zero emissions. Economists, ourselves included, love the idea of pricing carbon. The reason we pollute so much is that everyone gets to treat the atmosphere as a free dumping ground for greenhouse gases. If we charged polluters for the damage their emissions do to the atmosphere, they’d switch to less polluting or non-polluting activities. The trick is setting the right price for carbon emissions.

In a perfect theoretical world there’s an ideal price for carbon that equilibrates the amount charged to the exact value of eliminating one more pound of carbon. But in reality, it’s devilishly hard to compute that price, inasmuch as it depends on complex (and largely insoluble) questions of long-term interest rates, technology trends, and inter-generational equity. Consequently, estimates of a fair and equitable “social cost of carbon” are all over the map.

Economists Noah Kaufman and his colleagues propose to cut through this Gordian knot of complexity and disagreement by working backwards from a desired level of carbon emissions, and estimating what price of carbon would be required to reach a specific goal; in their proposal, net zero carbon emissions by 2050.

This is, of course, a bit of a dodge, inasmuch as choosing a particular goal implies that the goal corresponds to a social optimum (and it implies values for all those hard-to-estimate factors that underlie a social cost of carbon). But it is a way to truncate the debate and move quickly to action. Their modeling hinges on the accuracy of their technological forecasts and their estimates of the responsiveness of consumer demand, investment and innovation to particular carbon prices. But if they’re right, the solution to climate change would be within reach with 2030 carbon prices of $77 to $120 per ton.

Also, they’re not pricing purists: prices can be accompanied by selected regulations and investments. Banning coal or fracking while paying compensation to affected workers and communities, and investing in low-carbon research and development, would make sense. Pricing helps reinforce these policies.

Kaufman, N., Barron, A.R., Krawczyk, W. et al. A near-term to net zero alternative to the social cost of carbon for setting carbon prices. Nat. Clim. Chang. (2020). https://doi.org/10.1038/s41558-020-0880-3

In the News

City Observatory’s Joe Cortright is quoted extensively in Aaron Gordon’s Vice article on the biases in traffic forecasting models (see “Must Read,” above).

Planetizen offered a detailed summary of our ranking of the nation’s least and most segregated cities.

The Week Observed, August 21, 2020

What City Observatory did this week

America’s most and least segregated cities. Residential racial segregation is a fundamental and persistent aspect of systemic racism in the United States. Segregation cuts off disfavored groups from economic and social opportunity, and cities with higher levels of segregation tend to have lower levels of intergenerational economic mobility. In general, American cities have made progress—albeit slow progress—in reducing levels of racial segregation.

This week, City Observatory presents data on the white/non-white dissimilarity index to show the level of racial segregation in the central, urban counties of the nation’s largest metro areas. The dissimilarity index measures come from the Federal Reserve Bank of St. Louis’s voluminous economic database, and rank cities from least to most segregated. Portland, Oregon has the lowest level of white/non-white dissimilarity of any large US metro area. That represents a dramatic change from 1970, when Portland was more segregated than the typical large US metro area. We have data for the 50 largest US metro areas as of 2018, the latest year for which the underlying census data is available.
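For readers who want to see what’s under the hood, the dissimilarity index is straightforward to compute: take each census tract’s share of the county’s white population and its share of the non-white population, sum the absolute differences across tracts, and halve the total. The result runs from 0 (every tract mirrors the county’s overall mix) to 1 (complete separation), and can be read as the share of either group that would have to move to achieve an even distribution. A minimal sketch in Python, using invented tract counts rather than actual census data:

```python
def dissimilarity(white, nonwhite):
    """White/non-white dissimilarity index for parallel lists of tract counts."""
    W, N = sum(white), sum(nonwhite)
    return 0.5 * sum(abs(w / W - n / N) for w, n in zip(white, nonwhite))

# Three tracts whose racial mix exactly matches the county's: index = 0.
even = dissimilarity([100, 200, 300], [10, 20, 30])
# Two tracts with complete separation: index = 1.
split = dissimilarity([600, 0], [0, 60])
print(even, split)  # 0.0 1.0
```

Real rankings, like the St. Louis Fed’s, apply exactly this formula to census tract counts for each county.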

Must read

1. Portland adopts the nation’s most aggressive re-write of single-family zoning. Sightline Institute’s Michael Andersen has an extremely comprehensive examination of Portland’s new residential infill project, which essentially allows four-plex, and even six-plex, homes in the city’s single-family zoned neighborhoods. The policy has been years in the making, and got a big push last year from the Oregon Legislature’s mandate that most cities allow four-plexes. Portland’s residential infill policy goes one better: it allows six-plexes in single-family zones, provided half the units are set aside for households with incomes below 60 percent of the area’s median household income. The plan also makes important supporting changes: it eliminates nearly all off-street parking requirements for these new residential developments, and allows buildings with more units to have greater floor space, incentivizing developers to build additional units rather than McMansions.

Sightline Institute

It started slowly, but the measure built a strong political following of activists who recognized that the key to addressing the city’s housing affordability problems is getting more units of housing built.  And this measure leads the nation in showing how, technically and politically, this can be done. Plus, Andersen knows whereof he speaks: he and his colleagues at Sightline have been deeply involved in this effort for years.

2. Police-involved shootings by city. The murder of George Floyd by police has triggered a rising awareness of police violence. Sadly, it’s neither isolated nor uncommon. The website Cops in the Hood has tabulated data on “police-involved shootings”—itself a tortured and evasive term—and computed the per capita incidence of such shootings in major US cities.

3. Why can’t we make cars safer for the people outside of them? CityLab’s David Zipper notes that we’ve invested considerable energy and regulatory emphasis on making cars safer for their occupants, with things like crush zones, seat belts and airbags. But in the US, little if any attention has been paid to making cars safer for people who aren’t seated inside their vehicle. In fact, the growing weight and height of cars, especially of trucks and SUVs has made them deadlier to others in collisions, especially to those on foot or cycling.

There are a whole range of technologies that could make cars safer for non-occupants, most notably using geographic positioning system information to limit vehicle speeds. That technology is already in place on 35 pound, $500 electric scooters (which pose negligible risks to non-occupants), but isn’t required on two-ton, $50,000 vehicles that are much more dangerous to others. The disparity speaks volumes about the deeply embedded privilege that drivers enjoy relative to those who don’t travel by car.

New Knowledge

The coming senior housing sell-off. Inexorably, baby boomers are getting older, with more and more turning 65 every year. As they age, boomer homeowners are increasingly likely to want or need to sell their homes. And the generations behind them are numerically smaller, less wealthy, and in many respects less enamored of low-density living than boomers. The result, according to the University of Arizona’s Arthur Nelson, is that in the coming decade there’s going to be a glut of sellers and a dearth of buyers for large suburban homes, with predictable impacts on home prices.

Nelson marshals an impressive array of data about the tenure patterns of different generations, and contrasts these with the generational shift in preferences for smaller homes in more central locations. In his view, the market demand for housing will shift from three-quarters of increased demand coming from households in their peak housing consumption years, to three-quarters of household growth being among those looking to downsize.

Nelson argues that public policy ought to insulate seniors from any erosion in the value of their homes. Given that baby boomers have more wealth than any other generation, that seems like a stretch. Some softening of home prices will, by definition, make housing more affordable to younger generations who’ve had a hard time becoming homeowners because of high housing prices. The mismatch between a growing demand for urban locations and a glut of suburban housing seems like a very likely future.

Nelson, Arthur C. (2020) “The Great Senior Short-Sale or Why Policy Inertia Will Short Change Millions of America’s Seniors,” Journal of Comparative Urban Law and Policy: Vol. 4 : Iss. 1 , Article 28, 470-525.

In the News

Bloomberg Quint’s Peter Coy draws from City Observatory’s most recent report, Youth Movement, in his article about urban inequality.

CityLab’s Laura Bliss quoted City Observatory Director Joe Cortright in her article examining Portland’s new residential infill policy.

 

The Week Observed, August 7, 2020

What City Observatory did this week

1. Is it random, or is it Zumper? Are rents going up or down in your city?  Listicles showing which places have the biggest jumps (or declines) in rents are a perennial media favorite, but as we’ve warned before, when it comes to data on rent changes, caveat rentor.  We once again sift through Zumper’s claims about rent trends, and find that their methodology (which is subject to composition effects) produces inexplicable deviations in trends for one-bedroom and two-bedroom apartments in the same market in about a fifth of metro areas.

It’s implausible to suggest, as Zumper’s data claims, that rents on two bedroom apartments in Columbus are down 5 percent while rents on one bedrooms in the same market are up 15 percent over the same time. There are better behaved and more reliable sources of rent data.
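To see how a composition effect can generate these deviations, consider a stylized example (the rents below are invented for illustration, not Zumper’s actual data). If every listing re-lists 5 percent cheaper, but the mix of active listings shifts from mostly cheap units to mostly expensive ones, a median-of-listings index reports a huge rent increase even though every individual rent fell:

```python
from statistics import median

# Month 1: three cheap listings at $1,000 and one expensive listing at $2,000.
month1 = [1000, 1000, 1000, 2000]
# Month 2: both segments re-list 5% cheaper ($950 and $1,900), but the mix
# now skews expensive: one cheap listing and three expensive ones.
month2 = [950, 1900, 1900, 1900]

m1, m2 = median(month1), median(month2)
print(m1, m2)  # 1000.0 1900.0 -- the "median rent" soars despite falling rents
```

Repeat-listing or same-unit indexes (the rental analogue of the Case-Shiller approach for home prices) avoid this problem by comparing the same units over time rather than whatever happens to be listed each month.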

2. Self-proclaimed climate leaders shouldn’t spend $5 billion on transportation to reduce greenhouse gases by 5/100ths of one percent. Portland’s regional government, Metro, will ask voters this fall to approve a $5 billion transportation measure, but despite the fact that it claims to care about climate change, and that transportation is the biggest and fastest growing source of greenhouse gases in Portland, this package does essentially nothing to reduce emissions.

At a time when we all recognize the growing climate emergency, dedicating a huge chunk of revenue to projects that don’t move the needle on greenhouse gas emissions is a tragic mistake.

Must read

1. The Covid-19 Recession is hitting big cities harder. Jed Kolko, chief economist for Indeed, has parsed BLS job-loss data by metro size to show the differential impact of the recession. The key takeaway: job losses have been proportionally greater in the largest metros. Metros with 5 million or more population have lost more than 11 percent of their jobs, compared to 8 to 9 percent for metros under a million.

The reason likely has a lot to do with the knock-on effects of the varied ability of people to work from home in large metro areas. While the people who work from home are able to keep their jobs (and paychecks), the businesses that depend on them purchasing services—think restaurants that cater to the lunch trade, or business travelers—are harder hit in these metros. Looking at Indeed’s own data on new job postings, Kolko explains:

In sectors that serve local customers in person, job postings have fallen more in metros where people are more likely to have jobs that can be done from home, allowing them to practice more physical distancing. Job postings in these in-person service sectors — retail, food preparation, sales, and beauty & wellness — have fallen more in metros where people are more likely to work from home, like San Francisco, Washington, Boston, and Seattle.

There’s an important lesson about economic linkages here: While work-at-home is good for those who have the opportunity, it doesn’t insulate other workers in service industries from the downturn. That’s why comprehensive policies, like enhanced unemployment insurance benefits, are critical to keeping the economy from augering downward.

2. Europe agrees to implement a carbon border tax. The member states of the EU have agreed to implement, in 2023, a carbon border “adjustment fee” that effectively adds the price of embedded carbon emissions to goods imported from other countries. A domestic carbon tax is a useful first step, but to make sure that a nation doesn’t disadvantage local producers, and simply end up exporting its carbon footprint, the border adjustment fee makes sure that consumers see the price of carbon in goods regardless of where they’re made. The fee will increase incentives to decarbonize within and beyond the EU, as those seeking to tap the European market will want to reduce carbon intensity to lower their likely fees. Some worry that this disadvantages lower-income countries, but nations imposing the carbon fee could easily use at least some of the revenue to finance more generous foreign aid, rather than forcing less developed nations to depend on dirty industries to escape poverty.

3. Looking for patterns in the rent slowdown. Since the onset of the pandemic in the spring, rental markets across the nation have softened, and Zillow reports that nationally rents are down about one and a half to two percent in the past few months. Zillow focuses on the city/suburb variation in these patterns and reports that the rent slowdown has been more pronounced in urban areas than in their surrounding suburbs, although so far the differences are generally small, and the pattern doesn’t hold for all metro areas. Rents have decelerated most sharply in the centers of New York, San Francisco and Dallas compared to their surrounding suburbs. Nationally, urban centers have seen roughly a 2 percent deceleration; suburbs about a 1.5 percent deceleration. But some cities show the reverse pattern: in 14 of the 34 markets Zillow studied, the city deceleration was less than in the suburbs, including Philadelphia, St. Louis, Baltimore, Portland, Cleveland, and Minneapolis.

New Knowledge

The costs (and benefits) of inclusionary housing requirements.  MIT economist Evan Soltas has a new paper estimating the costs and benefits of inclusionary housing policies which should be of tremendous interest to everyone interested in affordable housing. Cities around the country have a variety of policies that incentivize (or require) that developers include a certain portion of affordable housing units in new developments.

Voluntary programs offer developers financial incentives to participate, and it’s possible, using variations in the terms of those programs over time and space, to compute just how much subsidy one has to provide to get developers to participate. Soltas uses New York’s property tax exemption program (called 421a) to measure how much it costs (in foregone tax revenue) to prompt developers to build an additional unit of affordable housing. City-wide, Soltas finds that the city has to forego about $1.6 million in property tax revenue for each additional affordable unit.

But there’s much more to the story. The cost varies widely across neighborhoods in the city, with even higher subsidies needed in expensive locations, and far smaller ones in lower-income neighborhoods. And cost isn’t the only important factor at work here: one objective of inclusionary programs is to create more opportunities for low-income households to live in high-opportunity neighborhoods. Soltas taps the Equality of Opportunity project’s estimates of the impact of neighborhoods on the lifetime earnings of low-income kids to estimate how much they gain from living in different neighborhoods. Even though it’s cheaper to subsidize units in some neighborhoods, the long-term cost to kids in terms of lower lifetime earnings makes this a bad bargain. And while the most expensive neighborhoods (in Manhattan) produce greater benefits, they aren’t worth the costs. Soltas does identify some “opportunity bargains”: places where it’s relatively cheap to add affordable units, and which provide better lifetime opportunities for kids.

All this suggests a couple of key findings:  First, inclusionary housing programs aren’t free, or even particularly efficient, at a city-wide cost of $1.6 million per additional unit. Second, it makes a huge difference to both the costs and benefits of inclusionary programs where in a city these units are built.  Few strategies have given much thought to how to tailor incentives to minimize costs while maximizing benefits.

Evan Soltas, The Price of Inclusion:  Evidence from Housing Developer Behavior, July 2020.

In the News

City Observatory director Joe Cortright presented the results of our recent study, Youth Migration, Accelerating America’s Urban Renaissance, for a webinar sponsored by the Manhattan Institute.  Full video available here:

The Week Observed, July 31, 2020

What City Observatory did this week

1. The abject failure of Portland’s Climate Action Plan. Last month, Portland issued the final report on its 2015 Climate Action Plan. It emphasizes that the city took action on three-quarters of the items on the plan’s checklist, but glosses over the most important measure of results:  the fact that Portland’s carbon emissions have actually increased in the past five years.

The City’s adopted 2015 Climate Action Plan called for putting Portland on a path to reduce greenhouse gases by a cumulative 25 percent from 1990 levels by 2020, but instead, emissions increased, and the city is stuck only 15 percent below 1990 levels. A newly issued “final” progress report on the 2015 plan mostly papers over this failure, and the reasons for it (increased driving due to cheaper gas). Portland has now adopted a “Climate Emergency Declaration” that sets an even more ambitious goal (a 50 percent reduction from 1990 levels by 2030), but hasn’t done the math or spelled out the steps that will be needed to achieve that goal. Instead, it mostly promises to convene a new conversation about climate which will be more equitable and inclusive, starting in the fall of 2020.

2. Auto industry experts forecast a permanent decline in driving and car sales.  KPMG, management consultants to auto makers and their suppliers, has taken a close look at changed behavior in the wake of the coronavirus pandemic.  Many more of us are working at home, and shopping on-line, trends that KPMG expects to translate into permanent changes in behavior even after the pandemic subsides. For example, they estimate that car travel associated with shopping could decline by 10 to 30 percent, eliminating 40 to 130 billion miles of vehicle travel each year.

Less car driving means lower car sales—and less lucrative car repair—but it also means fewer crashes, injuries and deaths, and less pollution. KPMG is warning its clients to prepare for a smaller auto market in the years ahead. Another group that should pay attention to these forecasts is highway builders: less driving means less need for new and wider roads. Hopefully, lower gas tax receipts are already sending that message.

Must read

1. Exclusive communities deepen metropolitan inequality.  Brookings Institution’s Jenny Schuetz has a timely look at the way that exclusionary practices in some suburban jurisdictions drive metropolitan inequality and segregation. Schuetz explores the relationship between housing affordability and racial and ethnic composition of neighborhoods in cities in four large metro areas. She finds that exclusive suburban jurisdictions effectively exclude minorities by driving up rents. As she explains:

Whether or not expensive communities intend to bar entry to lower-income households, high housing costs are as prohibitive a barrier as more overt forms of discrimination. As long as there are substantial racial gaps in income and wealth, expensive communities will effectively be off limits to most Black and Latino households—as well as to renter households of all races.

The result is that the most expensive cities in any metro area tend to have far fewer people of color than the overall metro area. Here are data for four large metros: Dallas, Detroit, Los Angeles, and Washington.

2. The high (and inequitable) cost of city parking. Writing at the Washington Post, Ike Brannon, channelling his inner Donald Shoup, calls out the high social and environmental costs and the inequitable impacts of the district’s under-priced street parking. Residents pay just $35 a year for a street parking permit, well below the market value of parking (as high as $2,000 to $3,000 per year in the Adams-Morgan neighborhood). Car owners are, on average, much wealthier and much more likely to be white than the average DC resident, meaning that this subsidy is delivered chiefly to higher-income households, and is unavailable to those who can’t afford (or choose not to have) a car. But it isn’t just that people without cars don’t benefit from under-priced parking; they pay an added cost in terms of worse transit service.

 . . . the ubiquity of parked cars slows down buses. For example, the L2 from Woodley Park to Farragut has five stops where a handful of parked cars can delay the bus by one to two minutes. The result is that thousands of bus riders lose five to seven minutes each day (and that’s one way) to accommodate 20 to 25 car owners.

In addition, the obsession with parking supply is a chief reason for NIMBY opposition to new housing. More residents would mean more competition for free or under-priced parking, so existing residents push to preserve plentiful parking by blocking more housing. The result is that everyone pays higher rents because not enough housing is built to accommodate all those who would like to live in urban neighborhoods. Free and under-priced parking is the scourge of healthy, livable and fair cities.

New Knowledge

Why the ward system leads to higher rents. Many cities elect their governing bodies by single-member districts. Voting rights advocates have pushed for single-member districts as a way to secure representation for geographically concentrated minorities, and courts have ordered some cities to replace their “at-large” city councils with members elected by district. But single-member districts have the pernicious effect of amplifying the “NIMBY”—not in my backyard—dominance of local land use decisions. Unlike city councilors elected city-wide, members elected by ward tend to be especially sensitive to the most localized aspects of decisions. By rule or custom, many ward systems grant each member a personal veto—in Chicago, “Aldermanic privilege”—over all land use decisions in their district.

While this effect is well-known in practice, there’s been little systematic investigation of the cumulative effects of ward systems on housing supply—until now. Evan Mast of the Upjohn Institute has assembled a unique database of cities and their housing markets, tracing housing supply changes in cities that did, and didn’t, change from at-large to single-member district representation.

The key finding: as expected, cities that shifted to single-member districts saw a significant slowing in the rate of approval of new housing construction, compared to otherwise similar cities that retained their at-large systems. Overall, cities that switched to the ward system saw a 21 percent decrease in new housing production after doing so. The decreases were disproportionately concentrated in multifamily housing, where new production declined 38 percent (single family homes were down too, but only about 11 percent).

Ward-based systems amplify the political power of NIMBY opposition to housing, and thereby limit expansion of the housing supply, and likely drive up rents and home values, which may be a gain for incumbent homeowners, but means less housing affordability for everyone in a city.

Evan Mast, Warding Off Development: Local Control, Housing Supply, and NIMBYs, W.E. Upjohn Institute for Employment Research, July 10, 2020

In the News

Now is the wrong time for a multi-billion dollar transportation package for the Portland metropolitan area, City Observatory Director Joe Cortright told Willamette Week.  The region would be better advised to wait and see how the decline in driving caused by the pandemic (and predicted to continue by auto industry experts) changes the need for transportation investments.

The Week Observed, July 24, 2020

What City Observatory did this week

The exodus that never happened. You’ve probably seen stories bouncing around the media for the past few months claiming that fears that density makes people more susceptible to the pandemic are prompting people to leave cities in droves. While an enterprising reporter can always find an anecdote on which to build such a story, the data don’t support that thesis. The latest bit of evidence comes from ApartmentList.com, which has done a detailed analysis of where people are searching for new apartments. Contrary to the media hype, interest in cities is actually up, with the market share of searches for cities outpacing suburbs and rural areas.

Even in the New York City metro area, the center of the pandemic (until recently), search activity in the five boroughs was up, while searches in surrounding cities and suburbs declined. ApartmentList has detailed data for the top 50 metro areas; all but a handful follow the national trend—toward, and not away from, dense cities.

Must read

1. Are Big Oil’s days as a political force numbered? Falling oil prices, a declining stock market, and growing adoption of renewable energy sources have dealt a significant economic blow to the fossil fuel industry. Coal is mostly dying, and oil and gas companies are troubled. Sightline Institute’s Alan Durning speculates that this economic decline likely presages a parallel decline in political power—but with a lag.  As he explains:

. . . politics lags economics. Political change lags because beliefs solidify early in politicians’ careers. It lags because enduring relationships among power players outlast the realities in which they form. It lags because institutional advantages persist in law, policy, and custom.

For cities, the political power of the fossil fuel industry has largely been a negative. Cheap gasoline has fueled sprawl, and many of the costs and externalities of car dependence have been borne largely by cities and their residents. While Durning’s thesis seems fundamentally correct, dying or even declining industries often double down on political strategies in an effort to stave off the inevitable, so we shouldn’t look for this industry’s political influence to disappear any time soon.

2. Challenging the “out-of-scale” myth. Listen to a controversy about virtually any new building or development, and you’re almost certain to hear this objection: what’s proposed is “out-of-scale” with the surrounding community. It’s a coded way of objecting to more density (and more neighbors, and often, a more integrated neighborhood). The implication is that all new development has to resemble what went before. Strong Towns’ Daniel Herriges challenges both the aesthetics and the logic of the “out-of-scale” argument, and points out that the variation we observe is a natural part of cities:

Buildings are developed at different times, and in a rapidly changing city with permissive zoning rules, eclecticism will be the natural result of simple randomness. A place with rising land values will see some plots of land redeveloped at a higher density, while others—maybe right next door—are not, perhaps for no other reason than the owner has not wished to sell.

Perhaps the reason we hear this argument so much is that it’s an inevitable truism: when houses or stores were first built on a previously vacant site, they were, by definition, “out-of-scale” with their surroundings. The process of development is always about some buildings having a different scale than others.

Out of scale! (Strong Towns)

3. As the cars come back, the buses slow down. As America gradually, and fitfully, reopens in the wake of the pandemic, auto traffic, which had fallen by more than half in nearly all US cities, is growing again. And in New York, where the virus hit hardest, rebounding traffic has had a measurable effect on bus travel speeds.  Streetsblog NYC reports bus speeds are down citywide:

Manhattan buses fared the worst in the slowdown, going from 7.6 miles per hour in May down to 6.8 miles per hour in June, a 10.5-percent drop in bus speeds. Brooklyn and Queens buses also faced steep percentage declines in the same time period, going from 8.5 miles per hour to 7.7 miles per hour (down 9.4 percent) and 10.8 miles per hour to 9.8 miles per hour (down 9.2 percent), respectively.
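The percentage figures quoted are simple before/after speed ratios; here is a quick sketch of the arithmetic (small discrepancies—Queens works out to 9.3 rather than the reported 9.2 percent—reflect rounding of the published speeds):

```python
# Percent decline in bus speeds, using the May and June figures
# quoted from the Streetsblog NYC piece.

def pct_decline(before_mph: float, after_mph: float) -> float:
    """Percentage drop from before_mph to after_mph."""
    return (before_mph - after_mph) / before_mph * 100

boroughs = {
    "Manhattan": (7.6, 6.8),   # May -> June, miles per hour
    "Brooklyn":  (8.5, 7.7),
    "Queens":    (10.8, 9.8),
}

for name, (may, june) in boroughs.items():
    print(f"{name}: {pct_decline(may, june):.1f}% slower")
```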

It’s worth noting that slower bus speeds penalize the transit dependent, lowering the attractiveness and raising the cost of public transit. As we’ve pointed out before at City Observatory, slower bus speeds are inherently inequitable, given the lower incomes and more limited choices of transit-dependent riders. In all of the brouhaha over whether we’re adequately applying an “equity” lens to transportation policy, there’s been remarkably little questioning of how the unlimited and essentially free use of streets by cars produces inequitable outcomes for bus riders. As with so many things, the pandemic revealed the costs that our auto-dependent transportation system imposes on everyone; it’s an opportunity to learn and fix things, if we choose.

4. More highway to hell. The road lobby is in complete denial about climate change. Latest case in point: a new report from TRIP, a highway lobby group, calling for hundreds of billions of dollars of additional spending on highways. No acknowledgement that highways have produced pollution and sprawl, eviscerated cities, and produced a car-dependent transportation system that is fundamentally inequitable. And no mention at all of climate change; the word “greenhouse” doesn’t appear anywhere in its report; “climate” appears exactly once: in reference to the impact of moisture on road wear.

Pavement deterioration is caused by a combination of traffic, moisture and climate. Moisture often works its way into road surfaces and the materials that form the road’s foundation.

The fact that transportation accounts for 40 percent of US greenhouse gases, and is increasing:  not TRIP’s problem. And tragically, as we’ve pointed out before, the Transportation Research Board, an arm of the National Academies of Science, is being used as a shill by the highway lobby to promote more road spending, more car dependence, and more greenhouse gas emissions, putting us all on the highway to hell.

New Knowledge

Slow streets are safe streets. The National Association of City Transportation Officials has a new guide to creating safer city streets: City Limits: Setting Safe Speed Limits on Urban Streets. For decades, car-oriented highway departments have used the pseudo-scientific “85th percentile rule” to let speeding drivers set the legal speed limit on urban streets, with predictable results for traffic safety, especially pedestrian deaths. While the rest of the industrialized world has made steady progress in reducing traffic deaths, the US has not.
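The rule works mechanically like this: measure free-flowing traffic speeds, take the 85th percentile, and post a limit near that speed—so the faster people drive, the higher the limit gets. A minimal sketch in Python (the speed observations are made up for illustration):

```python
# The "85th percentile rule": post the limit near the speed at or below
# which 85 percent of free-flowing drivers travel (nearest-rank method).
# The observed speeds below are hypothetical.

def percentile_85(observed_mph: list) -> float:
    """Smallest observed speed that at least 85% of drivers
    travel at or below (nearest-rank percentile)."""
    speeds = sorted(observed_mph)
    rank = -(-85 * len(speeds) // 100)  # ceil(0.85 * n), 1-indexed
    return speeds[rank - 1]

observed = [24, 26, 27, 28, 28, 29, 30, 30, 31, 31,
            32, 32, 33, 34, 35, 36, 37, 38, 40, 45]  # 20 drivers
print(percentile_85(observed))  # -> 37
```

Note the perverse feedback NACTO criticizes: a handful of speeders at the top of the distribution drags the "appropriate" limit upward for everyone.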

In addition to making the case that slower city streets are safer for everyone, the NACTO manual lays out clear how-to guidelines for implementing slower speeds, with methods and results drawn from member cities. In the wake of the Covid-19 pandemic, there’s a new appreciation of how car-dominated the urban realm has become.  This manual provides practical advice on how to change that, and make our streets safer.

 

The Week Observed, July 17, 2020

What City Observatory did this week

Dominos falling on Portland’s Rose Quarter freeway widening project. In the space of just a few hours two weeks ago, local political support for an $800 million freeway widening project collapsed, after local African-American groups pulled out of the project’s steering committee. We trace the rapid-fire chronology of local leaders bailing on the project, and then dig deeper into the reasons why support for this massive project evaporated so quickly.

Portland’s Mayor and Metro President were clinging to the Albina Vision to justify the I-5 Rose Quarter Freeway widening–no more. (Concept: Jonathan Maus/ Bike Portland – Illustration & Copyright: Cloe Ashton, May 2019).

There are a host of environmental, financial and social reasons to oppose this flawed project, but its apparent death-knell was the continuation of the Oregon highway department’s decades-long cavalier attitude toward community concerns. While the project is not yet completely dead, it is so out-of-touch with our understanding of urban transportation, the need to seriously combat climate change, and the long-delayed effort to make meaningful reparations for the damage done to the historically Black neighborhood, that one can only hope it expires soon.

Must read

1. Why reducing car travel and prioritizing buses is inherently equitable. Our experience with Coronavirus-induced stay-at-home orders has shown how “less-car” cities work better for everyone. Some of the best evidence comes from New York City, where lessened car traffic has accelerated bus speeds. Streetsblog NYC points out:

The decrease in car traffic during the coronavirus pandemic shows exactly what the mayor could do to help bus riders. For example, the M22 — which stops at the corner of Worth and Centre streets — inched along at just 4.8 miles-per-hour in June, 2018, according to data from the MTA. But during the height of the city’s lockdown, when car use dropped by 78 percent, bus speeds on that same route went up to 7.4 miles-per-hour — yet the city has made no moves to ensure that fewer private cars block MTA buses, whose riders are mostly people of color with an average income of $28,455 a year.

If we’re looking to have a fairer and more sustainable transportation system, one of the best things we can do is discourage private car use in cities.

2. Covid-19 trends and foot traffic by city. The American Enterprise Institute’s Housing Center has assembled a very clever visualization showing the trends in foot traffic at stores, restaurants and other destinations (gathered from anonymized cell location data) and the rate of reported cases of Covid-19 in the nation’s 40 largest metro areas.  The data show the disparate patterns of the disease in different places, and how our collective willingness to stay at home blunted the spread of the virus—for a while. This chart shows foot traffic (blue) normed to pre-pandemic levels and new cases per 100,000 population per week (red), from mid-March to early July.

New York City, as is well known, bore the brunt of the pandemic early on, but has brought new case counts down sharply.  Meanwhile, in other cities, notably Miami and Phoenix, reopening in mid-to-late May triggered a dramatic rise in cases.  Strikingly, in the past two weeks, foot traffic has tapered off almost everywhere, in part due to the July 4 holiday, but also, apparently, due to increased reluctance to venture forth as the pandemic’s second wave hits.

3.  Death to the Park and Ride.  For a long time, transit projects have been conceived as ways to lure commuters out of their private cars, for a portion of their journeys, anyway. That’s why many new and expanded high capacity transit projects feature expensive, publicly subsidized parking garages. It’s increasingly clear that this approach is counterproductive to environmental and equity goals.

A bleak Toronto park and ride lot. (Matt Pinder)

Matt Pinder, blogging at Beyond the Automobile, writes that now is the time to close the book on this auto centered strategy. It’s led us to build stations in the wrong places, turned station areas into desolate car-dominated spaces, and is fundamentally inequitable:

The park-and-ride is also a problematic public asset because it privileges the wealthy. A commuter on the [Toronto’s] GO Train whose household makes more than $125,000 per year is nearly twice as likely to drive to the station compared to someone with a household income of less than $40,000 per year. When parking is free, the cost to build, operate, and maintain it is bundled into the cost of the transit fare. In other words, when park-and-ride is free, the poorer riders subsidize the wealthier ones.

Pinder argues that the wake of the Covid-19 pandemic is a good time to think about repurposing park-and-ride lots as dense, mixed-use areas. Rather than catering to a twice-daily weekday flood of commuters, transit ought to enable walkable places and car-free living. The result would be more sustainable and more equitable.

New Knowledge

Housing shortages are behind gentrification pressures.  The Urban Institute has a new study looking at the connection between housing supply and gentrification.  The study uses data from home mortgages to look at the income levels of home buyers, and compares them to the income level of the neighborhood in which they’re purchasing homes. The study operationalizes “gentrification” as higher-income buyers representing a disproportionately large share of home buyers in a neighborhood relative to its current income. The authors find a strong relationship between expensive housing markets (attributable to a shortage of housing) and the degree to which high-income mortgage borrowers buy homes disproportionately in low- or moderate-income neighborhoods.
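A stylized illustration of that metric may help: compare the high-income share of recent buyers to the high-income share of current residents, and flag neighborhoods where buyers are heavily over-represented. The function and the numbers below are hypothetical, not the Urban Institute's actual formula:

```python
# Stylized version of the gentrification indicator described above:
# a neighborhood is flagged when high-income mortgage borrowers make up
# a much larger share of recent buyers than of current residents.
# Thresholds and figures are illustrative only.

def gentrification_ratio(high_income_buyer_share: float,
                         high_income_resident_share: float) -> float:
    """Ratio > 1 means affluent buyers are over-represented
    relative to the neighborhood's current income mix."""
    return high_income_buyer_share / high_income_resident_share

# Hypothetical moderate-income neighborhood in a supply-constrained metro:
ratio = gentrification_ratio(high_income_buyer_share=0.40,
                             high_income_resident_share=0.10)
print(f"ratio = {ratio:.1f}")  # affluent buyers heavily over-represented
```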

This chart summarizes the data for the 74 large metro areas included in the analysis, and shows that, by this metric, gentrification is greater in markets with a shortage of housing.

Our examination reveals that, in many MSAs, high housing costs—resulting from a lack of available housing—cause affluent buyers to look for homes in low- and moderate-income (LMI) neighborhoods. That means cities’ housing supply can determine how fast gentrification may occur. Boosting the supply of housing can slow the pace of new buyers moving into lower-income neighborhoods.

The implication of this study will seem counter-intuitive to many. It’s commonplace to argue that new market-rate housing is a cause of gentrification, but in fact the reverse is true: the way to lessen gentrification pressure is to build more housing.

It’s also worth noting that the usual caveats about eliding the difference between “gentrifying” and “gentrified” apply here. This study, like many others, describes a neighborhood as “gentrifying” if the proportion of high income purchasers exceeds the current high income share in the neighborhood. These “gentrifying” neighborhoods can remain much more income-diverse than the typical neighborhood in a metropolitan area.

Laurie Goodman, Ellen Seidman & Jun Zhu, To Understand a City’s Pace of Gentrification, Look At Its Housing Supply, Urban Institute, June 24, 2020

In the News

 

The Week Observed, July 10, 2020

What City Observatory did this week

CityBeat: NPR urban flight story. The pack animals of the media have settled on a single, oft-repeated narrative about cities and Covid-19: that fear of the virus will lead people to move to the suburbs. The latest iteration of this trope is an NPR story, focused on a couple moving from New York City’s Upper West Side to suburban Montclair, New Jersey. There are several problems with the story (starting with the fact that the couple was already planning to move prior to the pandemic). More importantly, the risk of contracting the virus is about one-third greater in the suburb they’ve chosen than in Manhattan. More broadly, the data don’t bear out this claim: well-educated young adults are moving to cities in increasing numbers, and real estate search data from the height of the pandemic show no decline (and even a slight increase) in interest in urban locations.

Must read

1. Stop building more roads. In an Op-Ed published by The New York Times, engineering professor Shoshanna Saxe and a co-author argue against further road expansion:

Most developed countries already have effective road systems; they can be maintained, but the economic benefits of expansion are marginal and the downsides significant. Road construction is environmentally destructive, and it promotes urban sprawl, congestion, air pollution and inequality.

The Hippocratic oath requires that physicians “First, do no harm.”  It’s advice that ought to apply with equal, if not greater force to engineers. Our transportation system, and the sprawl and climate crisis it has spawned, are the product of spending too much on transportation, not too little.

2. And take the cars off the streets of Manhattan. Farhad Manjoo has a long-form on-line essay (with singing and dancing graphics) in The New York Times, arguing that now is the time to ban privately owned cars from Manhattan. The pandemic has given city dwellers everywhere a better sense of what life would be like with fewer cars, and how we might better use our scarce public road space to serve a wider variety of uses (and users). Manjoo makes a detailed and comprehensive case for the safety, environmental and livability benefits of greatly reducing the number of cars on the island. Drawing on the work of a number of urbanists, Manjoo sketches out what life might be like with fewer cars in New York City:

Underscoring all of the reasons for a ban is simple equity.  Most people in New York don’t own cars, and don’t use them to travel to work or other destinations; it’s simply unfair to prioritize the needs of a generally wealthier few over the many, especially when cars degrade everyone’s quality of life.  To quote Manjoo:

The amount of space devoted to cars in Manhattan is not just wasteful, but, in a deeper sense, unfair to the millions of New Yorkers who have no need for cars. More than half of the city’s households do not own a car, and of those who do, most do not use them for commuting. . . . New York’s drivers are essentially being given enormous tracts of land for their own pleasure and convenience.

Streets ought to give priority to the people who live in the city, and who want to be there, rather than those who simply want to travel through it (or use the public realm for free storage of their personal vehicles). No city is better positioned to lead the way in this revolution than New York.

The Week Observed, June 26, 2020

What City Observatory did this week

When NIMBYs win, everyone loses. Two land use cases from different sides of the country are in the news this week. In both cases, local opponents of new housing development have succeeded in blocking the construction of new apartments in high demand neighborhoods. The high profile case is in Palo Alto, California, near Silicon Valley, where local homeowners managed to subject a 60 unit senior affordable housing project to a local referendum, and killed it.  Now the site is home to newly built $5 million mini-mansions. With fewer and more expensive homes, the area will continue to be an enclave for the wealthy.

Meanwhile in working class Inwood, in Northern Manhattan, local residents have won a first-round court victory challenging an upzoning that would have allowed more market-rate apartments, as well as a set-aside of affordable units. The court found that the city had failed to contemplate the impacts on the neighborhood’s racial makeup. What that misses, in our view, is that not approving the upzoning in Inwood will also have a significant impact on the neighborhood’s affordability, and thereby, its demographic composition. Blocking new housing doesn’t keep a neighborhood affordable; in fact, just the opposite: the more constrained the housing supply, the worse the affordability problems, the greater the likely displacement, and the more sweeping the demographic changes. That happens whether the NIMBYs at work are wealthy homeowners or struggling renters.

Must read

1. Whose streets? Cars’ streets.  In the wake of the growing public debate on the structural racism embedded in so many of our institutions, it seems like a good time to revisit this excellent analysis of our transportation system written by Ben Ross in 2014.  The way we’ve built our roads, particularly in suburbs, where a growing number of low income families and people of color are living, creates an environment that is deadly or dangerous to those who can’t drive or don’t own cars. Suburban roads incentivize and privilege high speed car travel, with few cross walks and sparse transit service, and when they are killed in collisions, pedestrians are often blamed.

The full weight of ninety years of car-first engineering bears down [on the suburban poor] as they make their way to and from decaying apartment complexes and aging tract houses. Long walks to the main road, unprotected dashes across wide highways, and perilous waits at bus stops on unpaved shoulders are a daily routine. A landscape created for affluent motorists becomes an oppressive burden in its decline.

The problem is compounded by car-oriented laws that are routinely used to harass and intimidate people of color. “Jay-walking” becomes a code-word for blaming pedestrians hit and killed by cars, and a reason police can use to detain or arrest citizens: Michael Brown was gunned down by a policeman in Ferguson, Missouri after being stopped for allegedly jay-walking. As Ross relates, the pedestrian hostile suburban environment is no accident: it’s literally mandated by a host of engineering rules and standards. If we’re going to overcome racism, we’ll need to change these rules of the road.

2. Cities will survive. Richard Florida weighs in on the long-run implications of the pandemic for cities.  There’s lots of hyperventilating and pessimism about urbanism, but Florida isn’t having any of it.  Cities have weathered similar challenges in the past and emerged even stronger; the decisive agglomeration advantages of being in cities overwhelm the perceived disadvantages of density. Florida is also quick to note that today’s unrest over racial injustice is very different from that of the 1960s: it’s a racially and economically diverse group of protestors fighting for more just cities, and that amounts to a powerful reason for hope, not a polarizing paroxysm of fear.

New Knowledge

More evidence that density has little to do with Covid-19 prevalence. New York City tested all women who delivered children in NYC hospitals, providing a comprehensive sample of women from different parts of the city.  Researchers looked at the correlation between the likelihood of testing positive for Covid-19 and various neighborhood socioeconomic characteristics.  They found a positive link between housing over-crowding and the odds of being infected, but a negative relationship between the number of units in a building and the odds of being infected.  People who lived in buildings with more units were less likely to have Covid-19.

The study found no statistically significant relationship between population density and the probability of having a Covid-19 infection (i.e., women who lived in higher density neighborhoods were no more or less likely than women from lower density neighborhoods to test positive for the virus).

There was no statistically significant association between SARS-CoV-2 infection and population density (interdecile OR, 0.70 [95% CI, 0.32-1.51]) or poverty rate (interdecile OR, 2.03 [95% CI, 0.97-4.25]).
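For readers unfamiliar with the notation: the bracketed figures are 95 percent confidence intervals, and an interval for an odds ratio that spans 1.0 means the association is not statistically significant. A small sketch reading the two results quoted above:

```python
# Reading the odds ratios quoted above: a 95% confidence interval (CI)
# that includes 1.0 means we can't rule out "no association."

def significant(ci_low: float, ci_high: float) -> bool:
    """True if the 95% CI for an odds ratio excludes 1.0."""
    return ci_low > 1.0 or ci_high < 1.0

results = {
    "population density (interdecile OR 0.70)": (0.32, 1.51),
    "poverty rate (interdecile OR 2.03)":       (0.97, 4.25),
}

for label, (lo, hi) in results.items():
    verdict = "significant" if significant(lo, hi) else "not significant"
    print(f"{label}: CI [{lo}, {hi}] -> {verdict}")
```

Both intervals include 1.0, which is why the authors report no significant association for either density or poverty in this sample.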

Ukachi N. Emeruwa, Samsiya Ona, Jeffrey L. Shaman, et al., Associations Between Built Environment, Neighborhood Socioeconomic Status, and SARS-CoV-2 Infection Among Pregnant Women in New York City, JAMA, published online June 18, 2020. doi:10.1001/jama.2020.11370

In the News

The Pittsburgh Business Times wrote about our report Youth Movement, and its relationship to the market for new apartments in that city.

Greater Greater Washington pointed its readers to the Youth Movement report.

The Week Observed, June 5, 2020

What City Observatory did this week

1. Covid-19 and Cities:  An uneven pandemic. We’ve been following the progress of the Covid-19 virus in the nation’s metropolitan areas for the past three months, and with the benefit of hindsight we can now trace out some key facts and trends. Overall, it’s apparent that the pandemic has been far worse in some cities than others. Across the 50 largest metro areas, the prevalence of Covid-19 cases varies by a factor of 20 between the hardest hit and least affected large metro areas. There have been both geographic patterns of infection (the Northeast Corridor has suffered from Washington to Boston), as well as isolated and severe outbreaks in metro areas that run counter to larger regional trends (New Orleans).  Most strikingly, some cities that largely avoided the virus early on have seen substantial increases in cases, while initially hard hit cities have damped down the spread of the virus.

Two months ago, Seattle was the hardest hit metro, and Minneapolis the least affected; today Minneapolis has a higher rate of prevalence than Seattle, and is experiencing the highest growth in cases of any large metro area.  So much for our speculation that Minnesota nice would translate into more effective social distancing.

2. Whitewash for Oregon DOT’s Freeway Widening project. For several years, Oregon’s highway builders have been pushing an $800 million freeway widening project near downtown Portland with the incredible claim that it will–unlike every other freeway ever built–lead to less air pollution and greenhouse gases. The agency’s claim has been subject to withering public criticism, which it has largely ignored. Last month, the agency convened a six-person “peer review” panel to look at its environmental work. But the panel held all of its meetings behind closed doors, didn’t hear from project critics, or apparently even review the lengthy and detailed technical critique of the project’s key traffic projections–which largely determine the pollution estimates. It’s a cynical ploy to create yet another false talking point in favor of the freeway widening project.

3. Advice to the Governor on recovering from the pandemic.  City Observatory’s Joe Cortright has served as Chair of the Oregon Governor’s Council of Economic Advisors.  On May 29, the Council met with Governor Kate Brown to discuss how the state should prepare to rebound from the pandemic and its associated recession.  We have a short policy memo outlining the major points presented in this briefing, which are likely applicable to other states as well.

Must read

1. Police Killings by Metropolitan Area.  There’s little or nothing we can add to the anguish, anger and outrage over the killing of George Floyd by Minneapolis police.  We’ll simply note that police killings are a regular occurrence, and that their victims are disproportionately Black Americans and other people of color. The research collaborative “Mapping Police Violence” has assembled a wide array of data from across the nation showing the extent and frequency of police killings.  Solving any problem begins with understanding its scope and severity—and this is a good place to start.

2. Streetsblog NY weighs in on the CDC’s anti-urban advice for fighting Covid. The Centers for Disease Control appeared to be having its very own “Marie Antoinette” moment last week, in the form of a recommendation that everyone drive alone to work, and that employers ought to subsidize parking.  It’s a plainly impossible “solution,” both for those who don’t own, or can’t drive, cars, and also for cities, which have no room on their streets or parking space to accommodate everyone driving. In response to powerful and immediate reactions from the likes of Streetsblog, the National Association of City Transportation Officials (NACTO) and Smart Growth America, CDC backpedaled. As Streetsblog relates:

“This was an important recognition by CDC of the ways that their previous guidance actually contradicts decades-long work within their own organization to help address health by encouraging more walking, more biking, and more transit use in metro areas across the country,” said Steve Davis of Smart Growth America.

3. Ed Glaeser on the Pandemic and the Future of Cities.  (Actually a “must-listen”). Harvard economist Ed Glaeser, author of “Triumph of the City,” is interviewed on LA public radio station KCRW about how the Covid-19 pandemic is likely to affect urban living. One bit of history: this is not the first health challenge cities have faced. Glaeser notes that the plague even afflicted classical Athens. But in almost every instance, cities have rebounded. The 19th century parks and open space movement was one example of how cities changed to become healthier. But success is not foreordained. Glaeser:

History teaches us that this can go two ways:  urban areas managed to rebound wonderfully from the influenza epidemic of 1919; the twenties was one of the great city-building decades in American history, where we built things like Rockefeller Center and the Empire State Building.  But the plague of Justinian that struck Constantinople in 542 AD led to 800 years of de-urbanization in Europe.

Ultimately, though, Glaeser is optimistic about cities:  The fact that we are social beings, and that we grow smarter by being close to other smart people creates a powerful city-centered dynamic in a knowledge-driven economy. Cities will be the crucibles that overcome the pandemic.

In the News

The Salem Statesman-Journal quoted City Observatory director Joe Cortright’s comments to the Governor’s Council of Economic Advisors.

Oregon Public Broadcasting reported on Cortright’s analysis of the Oregon Department of Transportation Peer Review report on the Rose Quarter I-5 freeway widening project.

The Week Observed, May 29, 2020

What City Observatory did this week

1. LA Covid correlates with overcrowding and poverty, not density. City Observatory is pleased to publish a guest analysis and commentary from Abundant Housing LA’s Anthony Dedousis.  Los Angeles County has released detailed geographic data on the incidence of the Covid-19 pandemic, and Anthony offers a series of charts, maps and a regression analysis that explore the common characteristics that are related to the outbreak. He finds that poverty and housing overcrowding are positively correlated with the prevalence of Covid-19 cases within LA County.  But like other studies that have looked at neighborhood geographies, he finds essentially no correlation between housing density and Covid-19 cases.

2. City Beat: Pushing back on the claim that a city is “uniquely vulnerable” to the pandemic.  We look closely at a recent Portland Oregonian article claiming that Multnomah County, home to Portland, is somehow “uniquely vulnerable” to the coronavirus because of its size, diversity and density. While it’s clear that the area’s large population and greater diversity than the rest of the state make hiring culturally adept contact tracers a larger and somewhat more complex task, there’s little to indicate Portland’s underlying problem is worse than or different from elsewhere in the state. Multnomah County’s rate of cases per capita is about half that of the Salem area, and is only slightly higher than in adjacent suburban Washington County. Most egregiously, the article asserts (without any evidence) that Portland’s density makes Covid-19 worse.

Must read

1. Henry Grabar, writing at Slate, warns cities not to repeat their errors of the past in kowtowing to the automobile. Sure, we have some work to do to get transit systems back on their feet in a post-Covid world, but cars aren’t the solution:

Another carpocalypse is looming as coronavirus shutdowns ease. Traffic is rebounding but mass transit is not—and won’t for some time, if the experience of cities in Asia and Europe are any guide. Once again, city leaders will be under enormous pressure to accommodate drivers.

We’ve been down this road before: For most of the 20th century, planners were convinced that faster, bigger roads and ample free parking would halt “decentralization” and save the centers where people worked. The results speak for themselves: Cities with overgrown highway networks and plenty of parking are, contrary to theory, now the ones that few people want to come to. Cities cannot beat suburbs at their own game. But they can destroy themselves in the process.

2. The prescription for avoiding the next pandemic (and saving the planet) is to clean up urban transportation.  In an Op-Ed for the Boston Globe, Dr. Gaurab Basu makes an explicit connection between the coronavirus, climate change and the deeply unfair characteristics of our current transportation system.  Low income people are not only poorly served by our auto-dominated transport system and feeble mass transit; they also bear the brunt of its negative health and environmental effects. Low income communities are, for example, disproportionately exposed to fine particulates from vehicle exhaust and tire wear, and the respiratory problems that exposure creates make them more susceptible to Covid-19.  As Dr. Basu eloquently puts it:

My oath as a doctor is to first do no harm. But our transportation system does active harm to my patients by polluting the air and destabilizing the climate. We need to stop describing the big problems of our time, and instead act with conviction to solve them.

3. Coronavirus is not a reason to abandon cities.  Writing at Vice, Aaron Gordon argues that in a post-Covid world, public policy will be critical to the future of cities. While (as noted elsewhere in today’s Week Observed) the density argument is essentially a red herring, the pandemic itself creates an unsettled and changeable political environment in which we’ll be seeking solutions that assure us we won’t be susceptible to future outbreaks. For some, that’s framed as an “escape to the suburbs.”  The danger is that, as in the past, public policy will be tilted in a way that hurts cities.  Gordon notes:

Just as the widespread abandonment of American cities in the 20th century was the result of very clear policy choices made at all levels of government that incentivized people on nearly every level to buy a house in the suburbs, so too will whatever happens with American cities next be the result of people responding to incentives put before them, not a vast array of individual choices about how they feel about density. Much of this rests on the federal government’s shoulders, but cities and states have leeway to determine their own futures.

New Knowledge

A close look at housing density and Covid-19 in New York.  The Citizens Housing and Planning Council, a non-profit advocacy group in New York, has compiled a terrific analysis of pandemic data for New York, compared to surrounding jurisdictions and other cities worldwide.
The report carefully teases out the important differences between building density and housing over-crowding, which are frequently elided or ignored in many press accounts. It has a zip-code level regression analysis looking at the correlation between housing overcrowding and rates of Covid-19 cases in New York: there is a positive correlation between overcrowding (which itself is closely associated with poverty) and Covid cases.  You’ll also find a detailed examination of the numbers of cases in institutional settings (like nursing homes and jails) which account for a disproportionate share of Covid cases and deaths.
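The zip-code level correlation the report runs can be sketched in a few lines of Python. The figures below are invented for illustration, not CHPC’s actual data; the point is just the mechanics of the calculation:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical zip-code observations: share of households that are
# overcrowded, and Covid-19 cases per 1,000 residents.
overcrowding = [0.02, 0.05, 0.08, 0.11, 0.15, 0.19]
case_rate    = [ 8.0, 11.0, 15.0, 14.0, 21.0, 25.0]

r = pearson_r(overcrowding, case_rate)
print(round(r, 2))  # strongly positive for these made-up numbers
```

A real replication would regress case rates on overcrowding (and controls like poverty) across all of the city’s zip codes, but the correlation coefficient is the heart of the finding.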
While the report has a litany of statistics, and links to underlying studies, we especially appreciate the direct and graphic style it uses to communicate its conclusions:

While much of our attention is now focused–appropriately–on working to reduce the number of new cases daily, cities everywhere will want to do the kind of careful analytical work presented here to help give their citizens and government officials a clearer idea of what factors do—and do not—increase a city’s vulnerability to pandemics.

Citizens Housing and Planning Council, Density and Covid-19 in New York, May 2020,
https://chpcny.org/wp-content/uploads/2020/05/CHPC-Density-COVID19-in-NYC.pdf

In the News

Strong Towns republished our commentary, Postcards from the Edges, noting that the Navajo Nation, which includes some of the most sparsely inhabited areas of the country, has a higher rate of Covid-19 cases per capita than New York City.

The Hillsboro (Ohio) Times-Gazette quotes City Observatory Director Joe Cortright’s commentary, “Density isn’t Destiny,” in its article discussing the pandemic.

The Week Observed, May 22, 2020

What City Observatory did this week

1. Postcards from the Edges:  Looking at the relationship between density and the pandemic. There’s a widely circulating meme associating urban density with the spread of the Covid-19 virus, undoubtedly because people know that the virus has hit New York City particularly hard, and well, it is America’s densest city.  There’s plenty of data to suggest that density is at best a minor factor, but two edge cases point up the shortcomings of the “density=pandemic” theory.

The first comes from the tragic explosion of Covid infections and deaths in the Navajo Nation, which now has the unfortunate distinction of having even more cases per capita than New York, despite being one of the most sparsely populated parts of the continent. Our second postcard, in contrast, is of Vancouver, British Columbia, one of the continent’s five densest cities.  We’ve updated our earlier analysis, and it shows that metro Vancouver has a lower rate of reported Covid-19 cases than any US metro area with a million or more population.

2.  A Rosetta Stone for county-based city/suburb definitions. One of the most widely and frequently available sources of detailed geographical data for the US is gathered for the nation’s counties. Because it covers the entire nation and is a manageable and slowly changing set of boundaries, county data is handy for quickly analyzing nationwide geographic patterns of activity. Several very good scholars have used these county-level data sets to compare the growth and performance of urban and suburban areas. But as we explore, classifying any given county as urban, suburban, rural or some in-between category is anything but easy or unambiguous. This commentary lays out the differences among the urban/rural classification systems used by the Brookings Institution, the Daily Yonder and Indeed economist Jed Kolko.  Our results show that there are wide differences among these three quite reasonable approaches in counting which counties are urban, and how many people live there.  For those who use county data, we also present a side-by-side translation of the three different definitions as applied to the nation’s largest metro areas, a kind of Rosetta Stone for interpreting the varied claims that are made about trends in urban, suburban and rural America.

3.  Is the pandemic worse in cities?  Hard to tell from county data. In the coronavirus pandemic, we’ve all become much more county-centric than we realize.  Because health data, like the number of reported cases and deaths due to the pandemic, are collected by county health departments, these 3,000 or so varied government units have become the underlying geography for analyzing the virus. As we’ve noted, some analysts have tried to characterize the urban/suburban incidence of the pandemic by aggregating this county-level data, using one of the three rubrics we examined in our previous commentary.  We use our Rosetta Stone crosswalk to better illuminate the varying conclusions these different analysts reach about the geographic patterns of the pandemic.

Must read

1. Why cities are more resilient in the face of disease.  Writing at The Conversation, epidemiologist Catherine Brinkley takes the long view of cities as a technology for coping with disease.  When earlier epidemics, like yellow fever and cholera, struck cities, we didn’t turn tail and leave, but instead remade our cities in ways that made them healthier for everyone. In the late 19th century, cities invested in parks to provide clean air and freely available outdoor recreation to the masses, with the result that disease declined and health improved. It’s now the case that, in general, cities are healthier than suburban and rural areas, both because they facilitate a more active, healthier lifestyle and because they reduce car travel, which lowers the number of injuries and fatalities (something we’re seeing around the country in the pandemic). In addition, cities have been more resilient than lower density areas in coping with and recovering from past pandemics:

Yet while dense major cities are more likely entry points for disease, history shows suburbs and rural areas fare worse during airborne pandemics – and after. According to the Princeton evolutionary biologist Andrew Dobson, when there are fewer potential hosts – that is, people – the deadliest strains of a pathogen have better chances of being passed on. This “selection pressure” theory explains partly why rural villages were hardest hit during the 1918 Spanish flu pandemic. Per capita, more people died of Spanish flu in Alaska than anywhere else in the country. Lower-density areas may also suffer more during pandemics because they have fewer, smaller and less well-equipped hospitals. And because they are not as economically resilient as large cities, post-crisis economic recovery takes longer.

2. Urban Density is not the problem.  State Senator Scott Wiener and co-author physician Anthony Iton, writing in The Atlantic, take head-on the claims that density has worsened the coronavirus outbreak.  Wiener and Iton point out that San Francisco, the nation’s second densest city, has done a well-above-average job of containing the pandemic’s spread.  They argue that most critics are incorrectly conflating housing over-crowding with higher urban densities.

But density and crowding are different things. Crowding is what happens when, due to a lack of sufficient housing, families and roommates are forced into tight quarters designed for a smaller number of inhabitants. That crowding can increase spread of contagion. Density in cities—where people can live in uncrowded homes near neighbors, services, and commercial corridors—doesn’t.

And in fact, our failure to allow more housing to be built in cities in the face of palpable demand, by limiting density, is what drives up housing prices and creates over-crowding.  Far from being the cause of the problem, higher density in cities is one key to lessening the over-crowding that is a demonstrable contributor to pandemics.

New Knowledge

Millennials and cities.  A new report from the Knight Foundation asked a nationally representative sample of Americans a series of questions about how attached they are to their communities.  While the overall study is an ambitious attempt to provide insights about what generates community attachment, it also asked a number of questions about how people inhabit metropolitan space.
A question that particularly caught our eye asked: “How often do you spend time in the principal city at the heart of your metro area?” While it’s difficult to compare answers across metro areas (because principal cities are such widely varying portions of the metro areas they occupy), it is illuminating to look at the differences in the answers by the age of the respondent.  The Knight survey shows a very strong correlation between age and the amount of time spent in one’s “principal city.”  More than half of the youngest adult respondents (Millennials) reported being in their principal city on a daily basis, compared with only a little over 40 percent of GenX adults, and fewer than a third of older adults.
The survey confirms a growing body of research that cities are particularly attractive for young adults.
Molly M. Scott, Robert L. Santos, Olivia Arena, Chris Hayes, and Alphonse Simon, Community Ties: Understanding what attaches people to the place where they live.  Urban Institute and Knight Foundation, May 2020.

In the News

The New York Times quoted City Observatory director Joe Cortright in its May 20 online op-ed article, “How Will Cities Survive the Coronavirus?”

One of the industries most disrupted by the pandemic and related travel restrictions is travel and tourism.  KGW-TV interviewed City Observatory director Joe Cortright on the likely impact on local lodging taxes in Portland.


The Week Observed, May 15, 2020

What City Observatory did this week

1.  City Beat:  We push back on a New York Times story claiming that people are decamping from New York City on account of pandemic fears. You can always find an anecdote about someone leaving New York (or any city, for that matter) because people are always moving out of and into cities. In New York’s case, there’s been a net inflow over the past decade, something that isn’t apparent from figures on domestic net migration (which leave out the key role that international immigrants play in gateway cities like New York). Plus, the premise of the story is wrong:  the prevalence of Covid-19 is higher in New York’s suburbs than in the city proper; in suburban Westchester and Rockland counties, cases per capita are 50 percent higher than in the city.

2. Thank you, Google! We’re all trying to get a better handle on the pandemic, and how it is affecting our communities and the economy. In an effort to provide a better picture of how our behavior has changed, Google tapped its vast trove of location data to see where we were spending our time.  It published “Community Mobility Reports”–global and local estimates of the change in time spent at home, at work and at common destinations.  While we appreciated the information, we, and others, were disappointed that Google provided only a set of PDF files, rather than machine-readable data. To its credit, Google has fixed that, and now makes CSV files with its estimates available.  As we noted, these are aggregated to a large enough geographic level (counties) that there’s no danger that any individual’s privacy is at risk.  Get the data.
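Working with the county-level CSV is straightforward. Here’s a minimal sketch using a made-up excerpt in place of the real download; the column names follow the published schema as we recall it, so treat them as an assumption and check the header of the actual file:

```python
import csv
import io

# Stand-in excerpt mimicking Google's Community Mobility CSV layout.
# The real file has one row per county per day; values are percent
# changes from a pre-pandemic baseline.
sample = """\
country_region,sub_region_1,sub_region_2,date,workplaces_percent_change_from_baseline,residential_percent_change_from_baseline
United States,Oregon,Multnomah County,2020-04-15,-52,18
United States,Oregon,Washington County,2020-04-15,-48,15
United States,New York,New York County,2020-04-15,-63,24
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Pull the workplace-travel change for one county.
multnomah = [r for r in rows if r["sub_region_2"] == "Multnomah County"]
change = int(multnomah[0]["workplaces_percent_change_from_baseline"])
print(change)  # -52 in this made-up excerpt
```

Swapping `io.StringIO(sample)` for an `open()` call on the downloaded file is all that’s needed to run the same filter on the real data.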

Must read

1. Density and Covid:  The evidence from Chinese cities. There’s a widespread belief that density is somehow either a cause or principal contributor to the spread of the Covid virus. But a new study from the World Bank looks at the data on prevalence rates in Chinese cities and finds that density played little or no role in the pandemic.  Indeed, some of the densest cities, such as Shanghai, Shenzhen, and Beijing, have among the lowest rates of reported cases.  The authors conclude that statistically there is essentially no correlation between density and the incidence of Covid.  They also note that density provides some significant advantages in supporting anti-pandemic strategies:

Higher densities, in some cases, can even be a blessing rather than a curse in fighting epidemics.  Due to economies of scale, cities often need to meet a certain threshold of population density to offer higher-grade facilities and services to their residents.  For instance, in dense urban areas where the coverage of high-speed internet and door-to-door delivery services are conveniently available at competitive prices, it is easier for residents to stay at home and avoid unnecessary contact with others.

2.  New York Times editorial on essential cities.  The Times has a strong editorial reminding us, in the midst of this national emergency, that now is not the time to walk away from cities, but rather to rededicate ourselves to making them engines of widely shared opportunity. It’s widely understood that the effects of the pandemic have fallen most heavily on the poor and people of color. In large part, that aspect of the virus reveals some critical shortcomings in urban America.  In “The Cities We Need,” the Times’ editors make a case that will be familiar to City Observatory readers:

Our cities are broken because affluent Americans have been segregating themselves from the poor, and our best hope for building a fairer, stronger nation is to break down those barriers.

Even in good times, the economic segregation of our large cities cuts many people off from the American dream, and intensifies many of our key national problems, from crime, to traffic, to environmental degradation.  In bad times, these fault lines are magnified. The editorial calls for a strong re-dedication to building truly integrated cities, where people of all backgrounds have the ability to live in any part of a metropolitan area. Perhaps the only area where we might part company with the editorial is on this claim, that there is just one way to reduce segregation:

There can be no equality of opportunity in the United States so long as poor children are segregated in poor neighborhoods. And there is only one viable solution: building affordable housing in affluent neighborhoods.

It’s true that opening up exclusive residential enclaves to affordable housing is one step, but another is recognizing that the movement of at least some higher income people into low income areas, which is often tarred with the brush of gentrification, also mostly has the effect of promoting greater racial as well as economic integration. While people should be free to move to other places, simply abandoning low income neighborhoods is not an option that will help many cities, and especially their remaining residents. In the end, though, it’s hard not to agree with their conclusion:

Reducing segregation requires affluent Americans to share, but not necessarily to sacrifice. Building more diverse neighborhoods, and disconnecting public institutions from private wealth, will ultimately enrich the lives of all Americans — and make the cities in which they live and work a model again for the world.

3. Business closures due to Covid-19. As everyone knows, Yelp is many consumers’ indispensable guide to shopping, dining and personal services. Yelp’s business listings are updated on a regular basis to accurately reflect opening hours, and as many businesses shut down (either due to a decline in customers or, ultimately, stay-at-home orders), this showed up in Yelp’s data.  That’s all helpfully tabulated and shown in an animated map.

New Knowledge

People started staying at home well before stay-at-home orders. The Opportunity Insights team, led by Raj Chetty, has compiled an array of private sector data to track our behavior during the Covid-19 pandemic.  It has an impressive, user-friendly dashboard that lets you quickly find data on changes in sales, employment and traffic in states and metro areas, as well as key indicators of the coronavirus, such as reported cases and deaths.  And it has an important finding:  people started commuting less, eating out less, and staying at home more well before local and state governments started issuing formal stay-at-home orders.
Raj Chetty, John N. Friedman, Nathaniel Hendren, Michael Stepner, and the Opportunity Insights Team, Real-Time Economics: A New Platform to Track the Impacts of COVID-19 on People, Businesses, and Communities Using Private Sector Data, May 2020

In the News

GreenBiz republished our analysis of what we can learn about transportation demand management from the Covid pandemic.

The Week Observed, May 1, 2020

What City Observatory did this week

Our updated analysis of the prevalence of Covid-19 in US metro areas.  It continues to be the case that the pandemic is most severe in the Northeast Corridor.  The New York Metro area is the epicenter, as everyone knows, but far less noticed are the very high rates of reported cases per capita in all of the metro areas from Boston to Washington.

The Northeast corridor accounts for six of the eight hardest hit metro areas based on cases per 100,000, and alarmingly, the area continues to have some of the highest rates of increase.

We have full details on all metro areas with a million or more population.  The good news is that the curve is flattening.  The rate of growth in newly reported cases continues to decline. Averaged over the last week, the daily growth rate in reported cases has declined to about 3.5 percent.
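That trailing-week growth rate is easy to compute from a series of cumulative case counts: it’s the compound daily rate that carries the first total to the last. A minimal sketch, using invented totals rather than our actual tabulations:

```python
# Invented cumulative reported-case totals for eight consecutive days.
cumulative = [100_000, 103_600, 107_300, 111_000,
              114_900, 118_900, 123_100, 127_400]

# Average daily growth over the trailing week: the geometric mean of
# the seven day-over-day ratios, i.e. the compound rate that takes
# the first total to the last.
days = len(cumulative) - 1
avg_daily_growth = (cumulative[-1] / cumulative[0]) ** (1 / days) - 1

print(f"{avg_daily_growth:.1%}")  # prints 3.5% for these invented totals
```

Using the geometric rather than arithmetic mean matters when growth is exponential: it is the rate that, compounded daily, exactly reproduces the week’s total change.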

Must read

1. The Pandemic shows what cars have done to cities. Stay-at-home orders have dramatically reduced driving around the country. In the process, the pandemic has given us all a taste of what our communities would be like if they weren’t so completely given over to expediting car travel and privileging those who are driving through a place over those who occupy it.  As Tom Vanderbilt writes:

Moments of crisis, which disrupt habit and invite reflection, can provide heightened insight into the problems of everyday life pre-crisis. Whichever underlying conditions the pandemic has exposed in our health-care or political system, the lockdown has shown us just how much room American cities devote to cars. When relatively few drivers ply an enormous street network, while pedestrians nervously avoid one another on the sidewalks, they are showing in vivid relief the spatial mismatch that exists in urban centers from coast to coast—but especially in New York.

2. Don’t blame density for New York’s Covid-19 problems. Aaron Carr notes that it’s a politically convenient diversion for New York Governor Andrew Cuomo to blame the city’s high density for its elevated rate of reported cases. Carr marshals a variety of statistics to show that, compared to other, far denser cities around the world, there’s little relationship between density and the prevalence of Covid-19.  The same also holds within New York:  some of the least dense parts of the region have the highest number of cases per capita. Other dense places that reacted more quickly, like San Francisco, have seen a far smaller impact from the pandemic.

3. Why New York City’s density is a good thing for health.  Writing at CityLab, the Manhattan Institute’s Nicole Gelinas tackles the claim that density is somehow a bad thing for the health of a city’s residents.  She notes that on the eve of the pandemic, the city reported an increase in the life expectancy of its residents to 81.2 years, an increase of a full year over the past decade.  The city has improved health for its residents in a variety of ways:  car crashes, air pollution and crime are down. But it’s also the case that density, including the fact that New Yorkers walk a great deal more than most Americans, is a big contributor to better health.  As Gelinas notes:

Even just basic exercise — walking home from the subway — keeps the average New Yorker healthier than most suburban Americans. Just 22 percent of adult New Yorkers are obese, according to the city’s health department, compared to the 42 percent rate for the U.S. as a whole, as reported by the CDC.

The high number of Covid cases and deaths in New York is particularly alarming, and unfortunately leads many to incorrect conclusions. It’s difficult, in the moment, to weigh all of the different aspects of city life objectively: imagine trying to convince people coming out of the first screening of “Jaws” to head straight to the beach. In the long run, though, it’s clear that cities make us healthier.

New Knowledge

Stay at Home saves lives—by reducing crash deaths. A study of road crash statistics in California confirms that the big declines in traffic we’ve seen due to compliance with Stay-at-Home orders are producing a collateral benefit of reducing deaths and injuries from crashes. The Road Ecology Center at the University of California, Davis, has compiled crash data.  It finds that the number of total crashes and injury/fatal crashes declined by about half compared to levels recorded prior to the Stay-at-Home order.
Fraser Shilling and David Waetjen, Special Report(Update): Impact of COVID-19 Mitigation on California Traffic Crashes, Updated, April 13, 2020, Road Ecology Center, University of California, Davis.

In the News

Streetsblog quoted City Observatory director Joe Cortright in its article “We Shouldn’t Have To Say This: Expanding Sidewalks Does Not Spread COVID-19.”


The Week Observed, April 10, 2020

What City Observatory did this week

1. What cities are showing us about the progression of the Covid-19 pandemic.  In an important sense, each large US metro area is a separate test case of the path of the Covid-19 virus. By observing the path of the pandemic in different cities, we can get a sense of how it ultimately may be tamed. We step back to look at the differing experiences of US metro areas, based on our tabulations of the number of reported cases per 100,000 over the past month.  Seattle provides some signs for hope:  It had the first serious outbreak, but since then has dramatically reduced the rate of new cases reported (it was also one of the first to enact social distancing measures).  At the other end of the spectrum, Minneapolis-St. Paul has managed to keep its rate of reported cases well below that of other large metro areas, and has seen a very low rate of growth.  Is there something about social capital, health care or some other aspect of the Twin Cities that helps it fight the spread of the virus?

2. Where people are staying at home. The key strategy for blunting the growth of the Covid-19 pandemic is restricting travel and implementing social distancing.  But how well are these tactics working?  We have some insights from “big data” extracted from the electronic breadcrumbs left by cell phones and other smart devices.  City Observatory has extracted county level data for the principal counties in each of the nation’s 53 largest metro areas in order to gauge the relative degree to which different metro areas have cut back on their travel in recent weeks.  Data from Google and Cuebiq show significant reductions in travel.  The declines have been most pronounced in a number of well-educated tech-centers (where presumably a large fraction of the workforce can work remotely).  In addition, workplace travel and total travel seem to be down significantly in tourist-oriented metros like Las Vegas and Orlando, reflecting these regions’ dependence on the hard-hit travel, recreation, accommodation and food-service sectors.

3. A subtle and detailed picture of rental housing markets.  The DC Policy Center has a new report looking at rent control policy in the nation’s capital. It’s got a lot to say about that controversial policy, but its signal contribution to the housing debate is a much more nuanced and detailed picture of the way housing markets really work. A substantial fraction of Washington’s rental housing is privately owned single family homes, condominiums and small apartment buildings, which the report describes as the “shadow” rental market. Owners of these smaller properties have a good deal of flexibility to choose to live in their own building, or to put it on the rental market. One of the overlooked side-effects of rent control is that it tends to prompt a sharp contraction in this “shadow” inventory, as owners can choose to occupy homes themselves, or sell their investment to a new owner-occupant. Either way, this can quickly shrink the housing stock, leading to more displacement and further pressure on the un-regulated portion of the market. Few cities have developed such a detailed picture of the characteristics of the rental housing market, but more should, especially before they consider sweeping changes in rental regulations.

4.  Updated data on the number of reported Covid-19 cases in the nation’s large metropolitan areas. On a daily basis, City Observatory has been updating its estimates of the number of reported cases of Covid-19 per 100,000 population in each of the nation’s 53 largest metro areas (all those with a population of one million or more).  There’s a wide range of results, if the reported data are accurate.  New Orleans leads the nation with roughly 20 times more reported cases per capita than the typical metropolitan area.
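The normalization behind these comparisons is simple arithmetic; a minimal sketch with hypothetical metro names and invented figures (not our actual tabulations):

```python
# Hypothetical metros: (reported cases, population). Figures invented
# purely to illustrate the per-capita calculation.
metros = {
    "Metro A": (7_000, 1_300_000),
    "Metro B": (450, 1_100_000),
    "Metro C": (900, 2_400_000),
}

def per_100k(cases, population):
    """Reported cases per 100,000 residents."""
    return cases / population * 100_000

rates = {name: per_100k(c, p) for name, (c, p) in metros.items()}
for name, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {rate:.0f} per 100,000")
```

Normalizing by population is what makes a hard-hit mid-sized metro comparable to a lightly-hit giant: raw case counts alone would simply rank metros by size.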

Must read

1. Evidence that the Covid-19 curve is already flattening in New York City.  MIT economist Jeffrey Harris has a snap analysis of data on reported cases, hospitalizations and deaths in New York City which suggests that the pandemic’s exponential growth curve is already starting to flatten.  This is a hopeful sign that social distancing measures are starting to take hold.

Harris, Jeffrey, “The Coronavirus Epidemic Curve is Already Flattening in New York City” National Bureau of Economic Research Working Paper No. 26917, April 2020.

2. Miami blocks construction of freeway due to environmental concerns and limited traffic benefits. The Miami Herald reports that an administrative law judge has blocked efforts to proceed with construction of a new 14-mile, $1 billion expressway. A key part of her rationale:  the project would produce limited improvements in traffic conditions because it would induce additional demand for travel. The judge found:

“Not only does the data reveal that the improvements in West Kendall congestion would be … ’meager,’ but also they provide no support for a finding that the [expressway plan] will accomplish its second objective — improving the commute time to downtown and other employment centers,”

3. A blown opportunity to repurpose street space for humans.  Bike Portland’s Jonathan Maus takes the Portland Bureau of Transportation to task for not thinking more creatively and expansively about re-dedicating some of the public right of way that has been vacated by cars in the pandemic to make it available for people.  Sidewalks are generally so narrow throughout the city that it’s not possible for two people to pass while observing the six-foot social distancing rule.  Meanwhile, adjacent streets sit substantially unused by cars (and those that are driving seem more likely to be speeding).  Despite the growing demand for public space (and wider spacing among people), PBOT has done little.  As Maus writes,

With car use at all-time lows, we have a tremendous amount of excess road capacity. Our streets represent thousands of acres of public space that could be put to emergency use to ensure healthy mobility for all Portlanders — from the central city to the eastern city limits.  But instead of enacting simple and proven measures to seize this opportunity and improve conditions, PBOT and Commissioner Eudaly are keeping the status quo and hiding from reality.

The broader point may be this:  If we can’t find the political will to reallocate street space to active transportation, when car use is at a decades-long low, when people are walking and biking in increased numbers, and it’s literally a matter of life or death, what makes us think we’ll change anything when the world returns to “normal”–whatever, and whenever that is?

New Knowledge

How do recessions affect mortality?  It’s a virtual certainty that the US is now in an economic recession after more than a decade of growth. How will the loss of millions of jobs (hopefully, only temporarily) affect people’s health?  There are some interesting–and on the surface, contradictory–findings.  For individuals, losing a job is associated with an increased risk of death. Some estimates suggest that losing your job is the health-risk equivalent of being ten years older.  The surprising countervailing finding is that US mortality rates, calculated for the entire population, tend to go down in recessions; that is, on an age-adjusted basis, fewer people die in a given year in a recession than otherwise.
On its face that doesn’t make sense, but a recent study looking at the aftermath of the Great Recession helps clarify the paradox.  As the authors explain, there are negative health consequences for those who are unemployed, but the slower economy more than offsets those mortality losses by improving the health of those who hang on to their jobs.
Our results indicate that in comparison with employed persons, the unemployed have a significantly increased hazard of death. Since the increase in this hazard is at least 73% (Table 1, model M1) and 1 extra year of age raises the hazard of death by approximately 7%, the health-damaging effect associated with being jobless is similar to the effect of about 10 extra years of age. However, each percentage-point increase in contextual unemployment reduces the hazard of death by approximately 9% (Table 1, model M3). The magnitude of this effect is slightly greater than that of reducing age by 1 year.
José A. Tapia Granados, James S. House, Edward L. Ionides, Sarah Burgard, and Robert S. Schoeni, “Individual Joblessness, Contextual Unemployment, and Mortality Risk,” American Journal of Epidemiology, July 2014.
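The “about 10 extra years of age” comparison in the quoted passage can be checked with a back-of-the-envelope calculation. This is a minimal sketch that treats the roughly 7-percent-per-year age effect as additive (a simplification); the two percentages are taken from the quote itself.

```python
# Back-of-the-envelope check of the study's comparison: a 73% increase in
# the hazard of death from joblessness, versus roughly 7% per extra year
# of age. Under a simple additive approximation, how many years of aging
# does joblessness correspond to?
JOBLESS_HAZARD_INCREASE = 73   # percent, from the quoted passage
PER_YEAR_AGE_INCREASE = 7      # percent per extra year of age

equivalent_years = JOBLESS_HAZARD_INCREASE / PER_YEAR_AGE_INCREASE
# roughly 10.4 years, consistent with the authors' "about 10 extra years"
```

A compounding (multiplicative) treatment of the 7 percent effect would give a somewhat smaller figure; the additive version shown here matches the authors’ round number.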

In the News

CityPages reported on our theory that Minneapolis-St. Paul’s low incidence of reported Covid-19 cases may be a result of “Minnesota Nice”: the region has long practiced the art of social distancing. More seriously, the article discusses the contribution of high levels of social capital to forging an effective community response to the pandemic.

Richard Florida gave a shout out to City Observatory’s analysis of the geography of the Covid-19 pandemic at CityLab.


The Week Observed, March 20, 2020

What City Observatory this week

1. Cheap gas means more pollution and more road deaths. Russia and Saudi Arabia have engineered a big decline in oil prices in the past few weeks, and as a result, US gas prices are now expected to decline by about 50 cents a gallon in the coming months. While that sounds like a bargain, we know that cheaper gas translates directly into more driving, more pollution and more crashes. While that will likely be blunted by an economic slowdown because of the disruption caused by Covid-19, the long run effects of cheap gas are negative–the likely fall off in domestic oil production and investment will overwhelm the economic boost from cheaper gas.  The decline in oil prices is a great opportunity to do something we should have done long ago:  impose a carbon tax.

2. It’s no mystery:  Bus ridership has declined because gas is so cheap.  And it’s going to get worse. A story in last week’s New York Times presented the widespread decline in bus ridership nationally over the past five years as a mystery. While they line up some interesting suspects (demographics, the changing population of transit-served neighborhoods) they allowed the real criminal to escape scot-free. But we have the smoking gun:  the decline in gas prices from 2014 onwards matches exactly the decline in bus ridership.

It’s also worth noting that the increase in transit ridership between 2004 and 2013 was propelled by the increase in gas prices; it shouldn’t be any surprise that once gas prices declined, ridership would stop growing, and actually decline. With oil dropping into the $30 per barrel range, it seems likely that bus ridership is headed for another leg down–and no one should be surprised when that happens.

3. The Oregon Department of Transportation finally admits it’s been lying about I-5 Rose Quarter crash rates. For years, the Oregon DOT has been promoting its now $800 million I-5 Rose Quarter freeway project with the phony claim that it’s “the #1 crash location” in Oregon. We and others have been pointing out that’s a lie–based on ODOT’s own statistics–for more than a year. We even provided evidence of this lie in public testimony to the Oregon Transportation Commission–but nothing changed. Until earlier this month, when we challenged the agency through its “Ask ODOT” email address. The agency finally conceded we were right and removed the false claim from the project’s website–but it’s still using a false and deceptive claim about safety to sell this wasteful project.

Must read

1. Bill de Blasio, Climate Troll. Charles Komanoff doesn’t pull any punches when it comes to grading Bill de Blasio’s tenure as Mayor of New York City. While the Mayor gets plaudits from some national environmental leaders for promoting divestment and opposing (some) new fossil fuel infrastructure, Komanoff views this as just posturing. When it comes to actual city policies that might reduce the nation’s biggest source of greenhouse gases (driving), and capitalize on the intrinsic greenness of dense urban living, de Blasio is either clueless or missing in action.  Komanoff writes:

New Yorkers by the thousands are organizing to win better subways, safe bicycling and vital public spaces — both for their own sakes and because they enable a city with fewer cars. The fossil fuel infrastructure confronting us daily is a hellscape of cars and trucks.  Livable-streets advocates are counting down the 22 months left in de Blasio’s second and final term. We’re weary of his inane climate pronouncements and his cluelessness about what being a climate mayor really means.

2. Ed Glaeser on “Urbanization and its Discontents.” Just a few years back, Ed Glaeser wrote “The Triumph of the City.” While he’s still bullish on cities (as are we) he recognizes that urbanization, as it is occurring, is not an unalloyed good and that there are many things we need to do better to realize the benefits of cities for everyone. Glaeser acknowledges that the private sector dynamics of talent migration and agglomeration are moving faster than the public sector response to a fundamental “centripetal” shift in economic activity. In particular, the politics of cities is now less conducive to growth, especially an expansion of the housing supply in high demand cities. And this in turn has important implications for the nation, blocking the access to opportunity for those with fewer skills, and encouraging a brain drain from less successful areas:

The limits on moving into high wage urban areas also means that migration becomes more selective and that imposes costs on the community that is left behind. When less skilled people can find neither jobs nor homes in high wage areas, then only high skilled people leave depressed parts of the U.S. The selective out-migration of the skilled means that these areas suffer “brain drain” and end up with even less human capital. If local economic fortunes depend on local human capital then this leaves these areas with even less of an economic future.

(Gated working paper available at National Bureau of Economic Research).

New Knowledge

The end of job decentralization in large metro areas. A new study from the Federal Reserve Bank of Boston uses data from County Business Patterns to track the trends in growth between urban, suburban and rural areas. It finds that in the largest metro areas, the long-term pattern of decentralization essentially stopped after 2005.  This chart shows the share of total commuting-area employment located in the densest county in the commuting area.  The thick black line corresponds to the metro areas with the densest urban cores.  For many decades, they experienced a steady decline in their share of employment, but that trend essentially stops in 2005.
These data are consistent with a pattern we’ve traced at City Observatory for several years.  It’s apparent that large metro economies and urban centers have performed better in the past decade than in previous decades.  It’s strong evidence for the importance of agglomeration economies in production, and also in consumption.  If anything, the county-level data used for this study tend to understate the role of urban centers, because counties are such large and variable units and don’t correspond well with urban cores.
The study also looks at patterns of manufacturing job growth. The decline in many urban economies was tied to the decentralization of manufacturing. For a time, rural areas benefited from this trend, but particularly after 2000, the decline in manufacturing employment affected rural areas in the same way it had rust belt cities in the 1960s and 1970s.
It is easy to forget that the exodus of manufacturing jobs from big cities near the middle of the 20th century caused significant problems both for the affected workers and for the cities themselves. Smaller cities experienced less-severe manufacturing losses during this period, and in some cases substantial absolute increases in manufacturing employment. Our regression analysis indicates that this spatial pattern changed in the 1990s, with outlying manufacturing clusters of counties experiencing large losses as well. The challenges now faced by these more-rural areas closely parallel the challenges faced by New York, Detroit, and other large industrial cities in previous years.
Benjamin K. Couillard and Christopher L. Foote, “Recent Employment Growth in Cities, Suburbs, and Rural Communities,” Federal Reserve Bank of Boston, Working Paper #19-20, December 2019.

In the News

Rice University’s Kinder Institute republished our thoughts on the Coronavirus pandemic.


The Week Observed, March 27, 2020

What City Observatory this week

1. The Geography of Covid-19.  A week ago, we issued a call to get much more granular with our statistical analysis of the pandemic’s spread.  In just the past few days, a number of new localized measures have emerged.  We highlight some of the best practices from around the world.  South Korea has a government database that geocodes the location of individual cases. They’ve mapped the locations of cases in a way that helps people avoid risky locations. We also highlight excellent data analyses from France and Italy.

2. County-level growth rates of Covid-19 cases.  The big issue with the coronavirus is flattening the curve.  But the usual way we present the data (with the ominous upward-sloping exponential curve) makes it difficult to pick out whether we’re making progress.  There’s an alternative:  computing the average percentage increase in the number of diagnosed cases over the last 5 to 7 days.  Others, like Lyman Stone, have generated these charts for US states; we’ve calculated them for the US counties with the highest numbers of diagnosed cases. This measure shows which places are starting to flatten the curve, and which ones are behind the curve.
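The growth-rate measure described above might be sketched like this; the case counts below are invented for illustration, not actual county data.

```python
# Sketch of the measure described above: the average daily percentage
# increase in cumulative diagnosed cases over a trailing window of days.
# A falling value over time is a sign the curve is flattening.

def avg_daily_growth(cumulative_cases, window=5):
    """Average daily percent increase over the trailing `window` days."""
    recent = cumulative_cases[-(window + 1):]
    daily_rates = [
        (recent[i + 1] - recent[i]) / recent[i] * 100
        for i in range(len(recent) - 1)
        if recent[i] > 0  # skip days with no reported cases yet
    ]
    return sum(daily_rates) / len(daily_rates)

# Hypothetical cumulative counts for one county, most recent day last:
cases = [100, 120, 150, 180, 210, 240, 265]
rate = avg_daily_growth(cases, window=5)  # average percent growth per day
```

Averaging over several days smooths out day-to-day reporting noise, which is the point of using a 5-to-7-day window rather than a single day’s change.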

3.  Metro Covid-19 rates March 22.  Between state level data (which are widely published) and county level data (which are noisy and hard to digest) lies data for metropolitan areas.  We’ve taken county level data and aggregated it up to the metropolitan level for the 53 metro areas with a million or more population.  We’ve also calculated the incidence of the pandemic in each metro:  the number of diagnosed cases per 100,000.
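The county-to-metro aggregation described above can be sketched as follows; the metro assignments, case counts and populations here are hypothetical stand-ins, not the actual county-to-metro crosswalk or reported data.

```python
# Sketch of aggregating county-level case counts up to metro areas and
# computing incidence (diagnosed cases per 100,000 residents).

county_data = {
    # county name: (metro area, diagnosed cases, population) -- made up
    "County A": ("Metro X", 500, 1_200_000),
    "County B": ("Metro X", 300, 800_000),
    "County C": ("Metro Y", 50, 1_000_000),
}

def metro_incidence(counties):
    """Sum cases and population by metro; return cases per 100,000."""
    totals = {}
    for metro, cases, pop in counties.values():
        c, p = totals.get(metro, (0, 0))
        totals[metro] = (c + cases, p + pop)
    return {m: c / p * 100_000 for m, (c, p) in totals.items()}

rates = metro_incidence(county_data)
# Metro X: 800 cases over 2,000,000 people -> 40 per 100,000
```

Normalizing by population is what makes the per-100,000 figures comparable across metro areas of very different sizes.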

4.  We updated our metro level analysis to include data for March 25.  New York, New Orleans and Seattle still topped the list for cases per capita, but showed wildly different growth trajectories.  The rate in New York nearly tripled, from about 80 cases per 100,000 on March 22 to 250 on March 25. Seattle’s growth rate attenuated, with cases per 100,000 increasing from about xx on March 22, to about yy on March 25 (up xx percent).  The number of cases per 100,000 in the median large metropolitan area more than doubled in three days, from about 4 cases per 100,000 on March 22 to about 9 cases on March 25.


Must read

1. Policy advice on dealing with the coronavirus.  Many of our wonky urbanist friends have put their minds to thinking about what the nation ought to be doing to tackle the pandemic, and mitigate the economic damage from the necessity of social distancing.  Richard Florida and Steven Pedigo have their eyes on how cities come out of this crisis and have outlined eight steps we ought to be taking.  Xavier de Souza-Briggs, Amy Liu and Jenny Schuetz have outlined the need for a large and comprehensive program to buffer state and local governments from what promises to be a brutal fiscal shock, which if not addressed could worsen the pandemic and prolong the economic suffering as well. Upjohn Institute’s Tim Bartik and colleagues have some useful suggestions about what needs to be in a federal fiscal response. Time will tell whether the measures now emerging from Congress will measure up to the solid advice we’re seeing here.

2. No, Joel, this doesn’t mean the end of cities. Emily Badger delivers your antidote to the “end of urbanism” talk emanating from some quarters. In her article at the New York Times, she points out that cities have long encountered–and always overcome–infectious diseases. There’s a lot about our urban environment that could be strengthened in ways that would make cities less vulnerable, while also making them even better places to live during normal times.

But if the earlier history of American cities is full of public-health horror stories about substandard housing, factory pollution and poor sanitation, more recent history tells of the health and resiliency density can provide. In practical ways, density makes possible many of the things we need when something goes wrong. That is certainly true of hospital infrastructure — emergency response times are faster, and hospitals are better staffed in denser places. When one store is closed or out of toilet paper, there are more places to look. When people can’t leave home for essentials, there are alternative ways to get them, like grocery delivery services or bike couriers. When people can’t visit public spaces, there are still ways to create public life, from balconies, porches and windows.

3. Trying times:  Will they temper us? Ryan Avent, who writes for The Economist, asks some important questions about how the Covid-19 Pandemic will reshape our politics and possibilities.  His essay at “The Bellows” challenges us to think about the opportunities this crisis presents:

Not to say that the short-run economic or health consequences of the pandemic aren’t a terrible thing to face. But changes to the system which once looked impossible to achieve increasingly seem within reach, maybe even inevitable. It is just possible to imagine badly needed changes—mandated paid sick leave, the end of the employer-based health insurance system—becoming reality. Something like a universal basic income, the fantastic underpinning of a techno-utopian future, may well emerge as a means to protect people against the economic damage wrought by the pandemic.

4.  The coronavirus is reaching rural areas much more slowly.  Perhaps density has something to do with it, but rural areas are generally less connected to the rest of the country and the world than are cities.  This probably explains why rural areas have a much lower incidence of Covid-19 than the nation’s cities.  Our friend Bill Bishop, at the Daily Yonder, has mapped the incidence of the virus in the nation’s non-metro counties.  The hotspots? They turn out to be disproportionately recreational areas, like those near ski areas in Colorado, Idaho and Utah.

New Knowledge

Partisanship and Covid:  If there was any doubt that we listen to the news (and presidential statements) with a strong partisan filter, the response to the Covid virus should put that to rest. A clever analysis of Google search trends and the share of the electorate who voted for Donald Trump in 2016 shows just how much skepticism “Red” America harbored about the virus.
Throughout January and February, Trump repeatedly downplayed the significance of the disease.  His public statements insisted that the outbreak was “under control” and would affect “a very small number of people.” That message clearly resonated with some Americans–but not with others.
Political scientist Brian Schaffner gathered Google data on the incidence of searches for the term “hand sanitizer” for each of the major media markets (roughly corresponding to metro areas and their hinterlands).  Google reports the geographic intensity of searches, normalized for population, on a scale of 0 to 100, where 100 represents the market with the highest level of searches, and lower values represent the population-adjusted relative search activity for that term in other markets. He then charted that data against the fraction of a market’s population that voted for Trump in 2016.
For calendar year 2019, there was essentially no correlation between the prevalence of hand sanitizer searches and the 2016 vote.  But in early March, there was a strong negative correlation between hand sanitizer searches and partisan voting.  Markets that voted strongly for Trump were far less likely to search for “hand sanitizer” than were markets where Trump got a small fraction of the vote. It’s a good indication that Trump’s remarks blunted concern in areas where he had strong support, and either had no effect (or perhaps heightened alarm) in the areas where he had few supporters.
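The market-level relationship described above is, at bottom, a correlation between two series. As a sketch, with invented numbers standing in for the actual media-market data:

```python
# Sketch of the correlation described above: Google search intensity for
# "hand sanitizer" (0-100 scale) vs. the 2016 Trump vote share, across
# media markets. The five values per series are invented for illustration.
import statistics

trump_share = [0.30, 0.45, 0.55, 0.65, 0.75]   # fraction voting Trump
search_index = [90, 70, 55, 40, 25]            # search intensity, 0-100

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(trump_share, search_index)
# A strongly negative r: higher Trump share, fewer hand-sanitizer searches.
```

In the 2019 baseline period the analogous correlation was near zero; the early-March data showed the negative pattern this toy example mimics.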
Interestingly, Schaffner continued to chart Google search data week by week, and as the virus spread more widely, the partisan division in searches for hand sanitizer declined sharply–the line “flattened out,” so that there’s relatively less difference now in searches between red and blue markets.  That’s neatly shown by an animation available here.


The Week Observed, February 28, 2020

What City Observatory this week

1. The inequity built into Metro’s proposed homeless strategy. Portland’s Metro is rushing forward with a plan asking voters to approve $250 million per year in income taxes to fight homelessness and promote affordability in Metro Portland. It’s pitched as redressing the inequities of the past: homelessness disproportionately affects communities of color.  But the political deal that’s been crafted allocates the money to counties not based on need, but on population. The result is one suburban county gets five to six times as much funding per homeless person as does Multnomah County (which includes the central city, Portland).

2. Gentrification:  The case of the missing counterfactual. How do we know whether gentrification makes a neighborhood’s residents worse off or better off? The answer that comes from sociologists tends to be based on ethnography:  interviewing long-time neighborhood residents to get their opinions about change. That’s a good starting point, but as a rule, these ethnographic studies don’t look at a control group of otherwise similar neighborhoods that don’t gentrify. So, while residents of gentrifying neighborhoods may remember that things were better in the past, and resent change, that doesn’t necessarily mean that residents of otherwise similar low income neighborhoods that didn’t gentrify would feel better about their situation. What’s missing from these studies is the “counterfactual”–i.e., a look at the opinions of people in low income neighborhoods that didn’t gentrify.  The implicit assumption is that, in the absence of gentrification, everything stays the same. But we know that’s not true: low income neighborhoods that didn’t gentrify tended to lose about 40 percent of their population over four decades. This “displacement by decline” is far more common, and likely even more devastating to long-time residents, than gentrification. But you’d be hard pressed to know this by reading the typical sociological study of neighborhood change.

3. Why Atlanta’s building moratorium won’t stop gentrification, but will definitely accelerate “flipping.” Atlanta is investing $25 million to turn an abandoned quarry into the city’s largest park, and even before it’s done, the new park is generating huge interest in surrounding neighborhoods. So much so that, fearing gentrification, Atlanta Mayor Keisha Lance Bottoms has announced a six-month moratorium on new building permits in the area. While the Mayor is hoping to forestall gentrification, in our view, the moratorium is likely to backfire. Demand for housing in the area won’t go away; instead, it will simply be focused on the existing housing stock. If anything, the moratorium makes the attraction for house flippers even more irresistible, because when they put their revamped houses on the market, prospective buyers and renters won’t have as many choices.

Must read

1. What to do about gentrification?  Writing in Governing, guest commentator Jabari Simama reflects on his experiences with neighborhood change in Atlanta. Change creates friction and discomfort, but if we’re starting from a place that’s highly segregated by race and income, we’ve got to figure out how to make change productive, rather than simply blocking it. As Simama points out:

Gentrification is neither good nor bad, but we must manage it better. Clearly we want to encourage residents to live wherever they please. Mixed-income and multiracial neighborhoods are good for our cities. The question is what needs to be done to make them inclusive — racially and economically — and balanced with legacy and new residents.

It’s a useful message: too often in the face of rapid change and palpable pressure to do something, political leaders find themselves confronted with limited and fruitless choices.

2. Jenny Schuetz:  Corporations make a convenient scapegoat for housing, but that’s not the problem. Brookings Institution scholar Jenny Schuetz takes on one of the most popular myths in housing debates:  that the increasing–but still very small–corporate ownership of houses and apartments is responsible for housing unaffordability. At best, they’re a symptom, and it turns out that corporations are actually finely tuned to taking advantage of supply constraints to make shrewd investments.  If local regulations block the construction of new homes and apartments, then demand shifts to the existing housing stock. Instead of older homes and apartments filtering “down” as they age and depreciate, they get rehabbed and “filter up” rising in price.  As Schuetz explains:

Local regulations also play a role in the buy-and-rehab strategy employed by private-equity firms. In places where regulation limits new apartment construction, acquiring existing buildings is less risky than trying to build new rental housing. There are stronger financial incentives to maintain and upgrade old apartments in tightly regulated markets, because they face less competition from new, high-amenity buildings. This process of upward “filtering” among existing apartments is particularly harmful to housing affordability because it results in higher rents without expanding the number of homes available.

3. More road-widening madness, Los Angeles edition.  Magnolia Boulevard is a major thoroughfare in North Hollywood, and with recent investments in improved transit and new apartment construction, it’s become an increasingly walkable area. It’s baffling, then, that the LA Department of Transportation is proposing to widen the roadway, in apparent contravention of the city’s adopted 2015 Mobility Plan, and also contrary to Mayor Eric Garcetti’s recent proclamation that the city will work to reduce greenhouse gas emissions by lowering the number of vehicle miles traveled.  LA Streetsblog explains that when it comes to the manifest destiny of road-widening, highway engineers aren’t going to let newly adopted policies or contributing to planetary destruction get in their way.  Widening the roadway threatens to undo much of the improved livability and walkability of the area:

Since 1999, there’s been a subway, Bus Rapid Transit, dense housing, great walkability, thriving sidewalk cafes, bike lanes, bike paths, bike-share, theaters, galleries, and much more. In two decades NoHo has emerged as the most walkable, most transit-oriented place in the San Fernando Valley, and indeed one of the more walkable neighborhoods in the entire city. Ignoring the current neighborhood opposition and holding steadfastly to outdated plans threatens to cram more cars into the area disrupting what makes it work for the people that live there.

New Knowledge

How density and distance from downtown shape political affiliations. We’ve long known that the red/blue divide cleaves along the urban/rural axis.  Almost everywhere big cities and dense urban neighborhoods are more reliably blue, and non-metropolitan areas and small towns are predictably red. At least some of this has to do with the sorting that’s going on along demographic lines, with younger, better educated (and more generally blue) people moving to metro areas and city centers, while the population of rural areas remains older and less educated.

A new study from University of Maryland political scientist James Gimpel and co-authors, published at the Daily Yonder, looks to disentangle the relative effects of neighborhood urbanity and local demographics in explaining this red/blue divide.  It finds that even after controlling for the partisan differences imparted by different levels of age, income, education and the like, denser urban areas and neighborhoods closer to the urban center tend to have a higher fraction of Democrats.  The following chart shows their findings for distance from a large city and partisan affiliation.

The median Democrat lives within about 12 miles of the center of a city of 100,000, while the median Republican lives about 20 miles away.  Democratic affiliation peaks at about 5 miles from the center and declines the further one goes outward; Republican affiliation peaks at about 60 miles from the city.

Demography still matters, of course, but geography plays a significant part even after controlling for the effects of income, age, education and so on. As the authors explain:

The effects of place are significant. For example, two voters with otherwise similar backgrounds, one living in a city, the other living 165 miles outside the city, will differ in the probability of expressing Republican loyalty by about 9 points. Density, similarly will alter the propensity to identify with the major parties, with those in the densest settlements 15 points more likely to be Democrats than those living in the least dense settlements.

Hat tip to our friend Bill Bishop, publisher of the Yonder.

In the News

The Milwaukee Business Times quoted our ranking of the share of independent restaurants in the nation’s largest metro areas in its story: “Milwaukee among U.S. metros with most independent restaurants.”

The Portland Tribune highlighted our analysis of the mismatch between the allocation of funds and the incidence of homelessness in Metro’s proposed $250 million per year housing measure.


The Week Observed, February 21, 2020

What City Observatory this week

1. Local flavor:  Which cities have the most independent restaurants.  Local eateries are one of the most visibly distinctive elements of any city. As Jane Jacobs said, the most important asset a city can have is something that is different from every other place.  Independent restaurants are a great indicator of local distinctiveness. We use data from Yelp to rank the market share of independent and chain restaurants in the nation’s largest metro areas, and find which cities have the most and fewest independent restaurants. The cities with the smallest chain restaurant market shares included New York, San Francisco, and Providence.

We also note that there’s a strong relationship between the market share of independent restaurants and the number of restaurants per capita (a good indicator of consumer choice).  Cities with a higher independent market share have more restaurants per capita than cities with a higher fraction of chains.  Independent restaurants equate to more different choices and more total choices, adding to local flavor in a measurable way.

2. Why cars are bad for independent restaurants. We followed up on our analysis of Yelp’s data on chain vs. independent restaurant market shares by comparing it to US Department of Transportation data on the number of vehicle miles traveled in different metropolitan areas.  It turns out that there’s a pretty strong relationship between how much people drive in a metro area and the fraction of chain restaurants:  more driving is correlated with a higher market share for chain restaurants.

We’re not sure of the reason for this correlation, but we have some hunches. Selecting restaurants by looking through the windshield of a fast-moving car is going to bias one’s choices in favor of familiar, well-advertised chains. Also, cities where people drive less enjoy a green dividend:  they spend less on cars and fuel and therefore have more money to spend on food, including independent restaurants.

3. Understanding Walkable Density.  We’re pleased to publish a guest commentary from DW Rowlands, a graduate student at the University of Maryland at Baltimore.  Her research looks into the lived density of particular neighborhoods, adjusting conventional measures of density (which ignore the connectedness and walkability of local streets) to produce a new measure of walkable density that effectively captures how easily people can interact in real-world environments. Her work shows that some cities and some neighborhoods come much closer than others to achieving an “ideal” level of walkability:  chiefly older cities and core urban neighborhoods that have traditional street grids. Newer metros and outlying areas, with serpentine roadways and cul-de-sacs, tend to have even lower “walkable” densities than ordinary density measures would suggest.


4. Mapping Walkable Density.  DW Rowlands has mapped walkable density in 17 of the nation’s largest metropolitan areas.  Her maps compare the actual walkable density of census tracts with their theoretical ideal density (i.e., how many people one would live near, ignoring the nature of the street network). These maps show which neighborhoods come closest to realizing their “ideal” density, with dark colors indicating places with relatively high levels of walkable density (relative to their ideal) and lighter colors showing places where actual density falls furthest short of the ideal.  Here’s a map for Boston:

5. Climate Failure and Denial at the Oregon Department of Transportation.  As is now true nationally, transportation is the largest source of greenhouse gas emissions in Oregon.  On paper, the state has a response, in the form of a “State Transportation Strategy.” In reality, it amounts to an excuse for inaction, mostly hoping that other actors (federal regulators, car manufacturers, and car owners) will replace current cars with clean vehicles and fuels. The trouble is, it’s not working: Oregon’s plan called for a 10 percent reduction from 1990 emissions levels by this year; the state’s transportation emissions are instead up 20 percent above 1990 levels–and going in the wrong direction.

And despite its failure to make any progress, ODOT isn’t proposing any additional actions to reduce greenhouse gases–instead it’s planning to spend billions widening freeways. It shouldn’t be any surprise that the strategy isn’t working: the department’s “stakeholder” group included climate skeptics and deniers, who didn’t think the state mandate to reduce greenhouse gases made sense (or was worth incurring costs to meet).

Must read

1. Rent control works best when rent control works least.  The District of Columbia has had a rent stabilization law since the mid-1980s, and more than a third of the District’s apartments are covered by it. The usual economic concern about rent control is that it tends to stifle upkeep and maintenance, encourage condominium conversions, and discourage new construction. If DC’s ordinance hasn’t had all those effects, it’s probably because it imposes relatively modest limits on landlords: the stabilization provisions limit rent increases to the cost of living plus two percent per year. According to the DC Policy Center’s Yesim Sayin Taylor, the reason the program works so well, and why there’s been relatively less shrinkage in the rental housing stock in DC than in San Francisco (which has tougher rent control), is that it only aims to stabilize, not restrict, rents:

D.C.’s stock of rent-stabilized units has remained so steady in part because the law prioritizes rent stabilization over strict price controls. The Rental Housing Act’s goal is not to create or preserve affordable housing, but to protect tenants from rapid, unreasonable increases in their rents.

More stringent rent controls, Taylor argues, would likely prompt landlords to convert units to condominiums, and would discourage new construction, leading to a reduction in the total number of rent controlled units.

2. More right of way for transit in Portland. As Smart Growth America notes, Portland’s City Council has unanimously adopted a plan for expanded “Rose Lanes,” a network of dedicated lanes and transit signal priority on 29 streets around the city. Building on the success of some existing exclusive bus lanes, the project is avowedly experimental, with the idea of trying, and adjusting as needed, different treatments to speed bus operations.

Most of the projects are designed to be relatively inexpensive and quick to implement; they are expected to roll out city-wide this year and next. Unlike huge capital projects (light rail), these measures can provide widespread benefits to many transit users, quickly and at low cost.  As Smart Growth America concludes: “political will, more than money, can become the limiting factor in whether or not transit is truly prioritized on the street.”

3. Public opinion is shifting toward the environment and climate change. The Pew Research Center has a new survey out showing that with the waning of the Great Recession, concern about environmental issues and climate change has increased substantially.  According to Pew, a majority of Americans now say that both the environment generally and climate change in particular should be a top priority for the President and Congress.

For the first time in Pew Research Center surveys dating back nearly two decades, nearly as many Americans say protecting the environment should be a top policy priority (64%) as say this about strengthening the economy (67%).

The downside:  there’s a stark partisan divide on both issues. Nowhere are Democrats and Republicans more divided than on the importance of dealing with climate change:  some 78 percent of Democrats think it’s an important issue; only 21 percent of Republicans do.

In the news

The Stranger quoted City Observatory’s analysis of induced demand on Houston’s Katy Freeway in an article entitled: “Washington Democrats are damn right to remove congestion relief from state transportation goals.”


The Week Observed, February 7, 2020

What City Observatory this week

1. Talent drives economic development. We know the single most important factor determining metropolitan economic success:  the education level of the population. The latest data on educational attainment and per capita incomes show that two-thirds of the variation in income levels among large metro areas is explained by the fraction of the adult population with a four-year degree.

Each one percentage point increase in the adult college attainment rate is associated with a $1,500 increase in metro average per capita income.
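That rule of thumb translates directly into predicted income differences between metros. Here’s a minimal sketch in Python, treating the reported $1,500-per-point figure as a hypothetical linear coefficient (an illustration of the stated relationship, not City Observatory’s actual regression):

```python
# Reported association: each percentage point of adult four-year college
# attainment corresponds to roughly $1,500 more in metro per capita income.
SLOPE_PER_POINT = 1_500  # dollars of per capita income per attainment point


def predicted_income_gap(attainment_a: float, attainment_b: float) -> float:
    """Predicted per capita income gap between two metros, given their
    adult college attainment rates (in percentage points)."""
    return (attainment_a - attainment_b) * SLOPE_PER_POINT


# A metro where 45 percent of adults hold a BA vs. one where 30 percent do:
gap = predicted_income_gap(45, 30)
print(f"Predicted per capita income gap: ${gap:,.0f}")  # $22,500
```

The 15-point attainment difference implies a sizable per capita income gap, which is the core of the “talent drives economic development” argument.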

2. Jealous billionaires and cash prizes for bad corporate citizenship. The big economic development prize of the past several years was Amazon’s HQ2, which the e-commerce giant purposely structured as a competitive extravaganza, baldly asking for subsidies from cities across North America.  Bloomberg Business reports that Amazon’s quest was fueled by CEO Jeff Bezos’s jealousy over the generous subsidy package Nevada provided Elon Musk for a Tesla battery factory.

3. It works for bags, bottles and cans, why not try it for carbon?  A newly enacted law in Portland requires grocery stores to charge customers a nickel for each grocery bag they take. Echoing Oregon’s half-century-old bottle bill, and similar bag fees in other places, this provides a gentle economic nudge toward more ecologically sustainable behavior. And the evidence is that it works–in London, single-use bags are down 90 percent. We ought to be applying the same straightforward logic to carbon pollution; ironically, the bag fee, on a weight basis, is 20 times higher than the charge most experts recommend for carbon pricing.

4. Dodging responsibility for climate change. Oregon’s Transportation Commission, a five-member citizen body, is on the firing line in the battle on climate change because of its plans to spend $800 million to widen a Portland freeway. It has largely turned a deaf ear to testimony about the freeway’s negative effects, essentially asserting that the decision to build the freeway has already been made by the Legislature. They’re just following orders, apparently.

In our view, that’s far too narrow a view of the Commission’s role and responsibility:  they have an obligation to hear citizen concerns, and if the project is unsustainable, uneconomic, and won’t work, they have a duty to tell the Legislature.

Must read

1. Fighting housing segregation in Baltimore. CityLab has a great retrospective on the struggles in Baltimore to break down the segregation of public housing. As in many US cities, for decades public housing (which disproportionately serves people of color) has gotten built in low income neighborhoods, which has only served to perpetuate and intensify racial and income segregation. CityLab chronicles the role of Barbara Samuels, an ACLU lawyer who challenged this practice, and who won a court victory 25 years ago.  That case, Thompson v. HUD, gave some real teeth to the Fair Housing Act’s provisions requiring governments to affirmatively further fair housing. Ultimately it got housing officials to instead implement a system of housing vouchers that gives public housing recipients the opportunity to live in middle income and higher opportunity neighborhoods.

2. Little Women’s lessons for housing policy. It takes a real wonk to find deep housing policy lessons woven through an Academy Award-nominated film, and Brookings Institution scholar Jenny Schuetz is up to the task. Little Women features a combination of mixed income neighborhoods, relaxed building codes, and co-housing (boarding houses), all of which facilitate greater housing affordability–and, thanks to social mixing, romance. As Schuetz explains:

Less zoning equals more social equity and more romance. While some details of the world depicted in “Little Women” would not appeal to modern audiences—corsets and the lack of modern health care come to mind—a more laissez-faire approach to housing regulation seems worth revisiting. Can we imagine a return to communities where mansions, middle-class homes, boarding houses, and low-income housing can co-exist without legal restrictions or social prejudice? Where households with diverse incomes and family structures can interact with casual, everyday intimacy?

In spite of 21st century injunctions about lower income housing depressing home values, and the apparent desirability of ubiquitous homeownership, the lifestyles (and policies) of 19th century America promoted housing affordability and social mobility, things we could use more of today.

3. Boston’s under-occupied large homes. We often talk about a shortage of housing, but in an important sense, the problem is less a shortage than a maldistribution. A new study from the Metropolitan Area Planning Council, Boston’s regional planning agency, looks at the occupancy of large dwellings (those with three or more bedrooms). It finds that most are under-occupied–that is, these housing units have fewer occupants than bedrooms. This phenomenon is driven by history, homeownership, and demographics: a quarter of all three-bedroom units are occupied by just one or two persons aged 55 or older.  Inertia, the transaction costs associated with selling a home, and a probable lack of smaller ownership opportunities for those who want to age in place (or very nearby) likely contribute to older homeowners staying in these larger houses, rather than selling them to younger and larger households.  The mismatch between household sizes and housing units suggests that part of the solution to our perceived housing shortage would be to develop incentives for older homeowners to downsize more quickly.

New Knowledge

Integration and civic engagement. A growing body of evidence points to the importance of mixed income neighborhoods to the lifetime economic prospects of kids from low income families. A new study from Eric Chyn and Kareem Haggag shows that growing up in mixed income neighborhoods also seems to encourage greater civic participation. Chyn, who earlier looked at the economic outcomes for kids from families given vouchers enabling them to move out of low income neighborhoods in Chicago, now examines the effects on voter participation. His key finding:  kids who grew up in these more mixed income neighborhoods tended to have higher voting rates as adults than otherwise similar peers who grew up in lower income neighborhoods. On average, kids growing up in mixed income neighborhoods were about 12 percent more likely to vote as adults than their peers.

Eric Chyn & Kareem Haggag, Moved to Vote:  The Long Run Effects of Neighborhoods on Civic Participation, University of Chicago, Human Capital and Economic Opportunity Global Working Group, Working Paper #2019-079, 2019.


The Week Observed, January 31, 2020

What City Observatory this week

1. A massive regional transportation spending plan that does nothing for climate change.  Portland’s leaders are in the process of crafting a $3 billion-plus regional transportation package. One of its stated objectives is to help reduce greenhouse gas emissions. But a recently released staff analysis shows the multi-billion dollar plan will lower greenhouse gas emissions by just 5,200 tons out of a regional total of more than 9.5 million. That works out to roughly five one-hundredths of one percent.  Graphically, something like this:

Each dot represents almost 5,000 tons of greenhouse gases; the one red dot shows how much reduction the region will get for its $3 billion transportation package.

What’s particularly alarming is that Portland’s greenhouse gas emissions from transportation have been growing rapidly since the collapse of oil prices in 2014, with the region emitting 1.6 million more tons of greenhouse gases now than in 2013; the entire multi-billion dollar package will offset just a couple of days’ worth of that growth in emissions. If the region is serious about climate change, it’s going to need something very different from this package.
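The tiny percentage involved is just arithmetic on the staff analysis’s reported figures; a quick Python check (reported numbers only, nothing new):

```python
# Figures reported in the staff analysis: the package's estimated
# greenhouse gas reduction, and regional transportation emissions.
package_reduction_tons = 5_200
regional_emissions_tons = 9_500_000

# The package's reduction as a percentage of regional emissions.
share_pct = package_reduction_tons / regional_emissions_tons * 100
print(f"{share_pct:.3f}% of regional emissions")  # 0.055% of regional emissions
```

Roughly five one-hundredths of one percent: a rounding error against a regional total that is growing by hundreds of thousands of tons per year.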

2.  When it comes to climate change, it’s always Groundhog Day.  We seem to be stuck in an infinite loop when it comes to climate policy.  We adopt bold declarations that we’re going to reduce our greenhouse gas emissions . . . someday.  But when we look at the latest accounting, it turns out that we’re not making any progress, and when it comes to greenhouse gases from transportation, we’re making things worse, almost entirely because we’re driving more. We highlight the continuing–and growing–disconnect between the high-minded rhetoric (and much-celebrated long-term goals) that Oregon and other states have adopted on climate change, and the actual progress, which has been not just nil but negative when it comes to transportation. If we’re serious about climate change, we’re going to have to do something different, or next Groundhog Day is going to look pretty much the same.

Must read

1. The optimal serendipity of industry clusters. Rents near Kendall Square in Cambridge are pushing $100 per square foot, making it one of the most expensive places to locate a business. Yet biotech businesses willingly pay these high rents, rather than relocating to cheaper suburbs (or to one of the many cities questing to be a biotech center).  Why? Well, as economist Robert Lucas opined three decades ago, people pay high rents in cities, and in places like Kendall Square, to be near other people.  The Boston Globe recites first-hand accounts of the benefits that researchers, industry executives, venture capitalists, and other biotech specialists reap from being in close proximity to thousands of their peers. Some of this is the straightforward benefit of propinquity: it’s easier and quicker to connect with just the expert you need. But a big part of the benefit is serendipitous interaction, things you learn from the random accidents of who you encounter. This source of cluster competitive advantage is overwhelming and almost impossible to duplicate, which is why calls to spread innovation-based industries more evenly across the landscape are almost certainly doomed to fail.

2. Grand strategy for housing affordability. The re-legalization of fourplexes in Oregon and triplexes in Minneapolis represents an important symbolic victory in the effort to restore housing affordability in the nation’s cities. But Sightline Institute director Alan Durning argues that it’s just a first step in a much longer and more difficult effort to reshape housing policy.

Legalizing these is just the start.

Durning compares the housing debate to World War I’s continental-scale trench warfare; while hopeful, he sees the advances in “missing middle” housing as just a mile-long incursion on what amounts to a thousand-mile front. We raised much the same concern in our 2019 commentary, “You’re going to need a bigger boat”:  while triplexes and fourplexes are a step in the right direction, they’re only a beginning.  As Durning says:

And they are breakthroughs. They have few precedents in recent decades of local housing law on this continent: the sanctum of single-family zoning has been breached. But viewed from a broader perspective, they are breakthroughs in a war we are losing. Or at least, a war we’re winning so slowly that we might as well be losing.

This commentary steps back from the policy tactics of local and state legislation to consider the broader political map on which housing debates turn. Advancing from the current breakthrough is going to require new and stronger political coalitions. Durning promises to explore new political strategies that will help build the case for affordable, low carbon cities. We’ll follow this with interest.

3. There’s no shortage of housing–for cars. Streetsblog Boston has a fascinating analysis of new housing developments in Boston. The good news? The city is building more housing, and doing so in transit-served locations.  The bad news: the housing developments have a huge amount of parking. In the 27 projects permitted in the past year, there are 1,984 housing units and 2,152 parking spaces–nearly 1.1 per unit. That parking tends to be expensive, at more than $25,000 per above-ground parking space (making housing less affordable), and according to careful studies it also tends to be underutilized (even at peak occupancy, 30 percent of parking spaces remain unused).  Boston’s climate goals call for more housing and less driving, but the way real estate is getting developed, car dependence is built into new housing.
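A quick check of the parking arithmetic, using the figures Streetsblog reports (the per-space cost is the reported above-ground minimum, so the cost estimate here is a floor):

```python
# Figures reported by Streetsblog Boston for the 27 permitted projects.
housing_units = 1_984
parking_spaces = 2_152
cost_per_space = 25_000  # dollars per above-ground space (reported minimum)

# Spaces per unit, and the parking cost effectively embedded in each unit.
spaces_per_unit = parking_spaces / housing_units
embedded_parking_cost = spaces_per_unit * cost_per_space

print(f"Parking spaces per unit: {spaces_per_unit:.2f}")  # 1.08
print(f"Parking cost embedded in each unit: ${embedded_parking_cost:,.0f}")
```

At these reported figures, each new unit carries upwards of $27,000 in parking costs before a single resident moves in.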

New Knowledge

How green are electric cars? A defining feature of the electric car is the absence of a tailpipe. That leads some to conclude that electric vehicles are completely green or “zero emission,” but that misses the fact that both the car (and importantly its battery) require energy to manufacture, and the electricity the car uses has to come from some other source of energy. If your Tesla gets charged up with electricity from a coal-fired power plant, it’s decidedly less green than if it draws its electricity from a windmill or solar cells.  The same goes for its battery:  battery manufacturing requires a fair amount of electricity, so if the battery factory is in China, for example, where most electricity comes from coal, the battery will have a substantial carbon footprint.

There are plenty of dueling studies out there estimating the greenhouse gas emissions from electric cars compared to traditional internal combustion engines and hybrid vehicles.  One study compares the life-cycle greenhouse gas emissions of a battery-powered plug-in Nissan Leaf with a Mazda 3, a similarly sized vehicle powered by a conventional internal combustion engine.  The following map shows the results, with the areas shown in blue being places where the Leaf has lower emissions, and red showing places where the Mazda has lower emissions. The difference reflects the relatively high use of coal for electricity generation in the Northern Plains States.

As Prof Jeremy Michalek, director of the Vehicle Electrification Group at Carnegie Mellon University, tells Carbon Brief, “which technology comes out on top depends on a lot of things”. These include which specific vehicles are being compared, what electricity grid mix is assumed, if marginal or average electricity emissions are used, what driving patterns are assumed, and even the weather.

Electric cars have the potential to reduce greenhouse gas emissions, but as this study suggests, when and where they are deployed, and how rapidly we can de-carbonize electricity generation are important factors in determining the role they should play in any climate strategy.


The Week Observed, December 13, 2019

What City Observatory this week

1. Oregon DOT repeats its idle lie about emissions. It’s every highway builder’s go-to response to climate change:  we could reduce greenhouse gas emissions if we could just keep cars from having to idle in traffic. That turns out to be a great way to rationalize any highway-widening project, which is exactly why new Oregon Department of Transportation Director Kris Strickler invoked this claim at his recent confirmation hearing.  But it’s an utterly unfounded urban myth that making cars move faster will reduce carbon pollution. Just the opposite, in fact: measures that speed cars (whether additional lanes, better signal timing, or “improved” intersections) just prompt more people to drive. This induced demand effect–now so well demonstrated that it’s called the “fundamental law of road congestion”–more than wipes out any gains from reduced idling. More and longer car trips, not time spent idling, are what cause increased greenhouse gas emissions.

2. A long history of highway departments ignoring school kids. The latest objection to the Oregon Department of Transportation’s Rose Quarter freeway widening project comes from the Portland School Board, which has voted to ask for a full-scale environmental impact statement for the $500 million project, which would move the I-5 freeway even closer to the Tubman Middle School. The district has already had to spend millions on air filtration equipment to make the school habitable; a wider freeway with more traffic would likely increase the impact on kids, and an array of new studies is showing long-term damage to learning from chronic exposure to auto pollution. But if history is any guide, the school board seems likely to be ignored. Six decades ago, its predecessors raised very similar objections to the original siting of the freeway next to the school, and despite assurances from city and state officials that their concerns would be addressed, the freeway was built exactly as planned, slicing off part of Tubman’s school yard, bisecting attendance areas for other nearby schools, and dead-ending dozens of neighborhood streets.  Of course, in the early sixties, there was no National Environmental Policy Act, and no one knew for sure how harmful pollution was.  Will this time be different?


Must read

1. Climate-talking mayors are inexplicably building more freeways. We’ve long noted the discrepancy between the bold climate rhetoric of some mayors and their political support for more and wider freeways. Curbed’s Alissa Walker lists progressive cities around the country, including Los Angeles, Houston, Portland, Austin, Chicago and Detroit, that are planning to spend billions to widen freeways, in spite of their mayors’ ostensible pledges to reduce carbon pollution. Walker writes:

. . . climate mayors are currently allowing massive expansions of highway infrastructure in their cities. There are at least nine major highway-widening projects with costs totaling $26 billion proposed or currently underway in U.S. metropolitan areas governed by members of the Climate Mayors group.

As Brent Toderian has frequently said, if you want to understand someone’s real priorities, look at their budget.  The spending priorities of these ostensible climate leaders show that they aren’t really very concerned about the climate crisis.

2. Inequality has gotten worse in almost every city–maybe inequality isn’t a local problem. Emily Badger and Kevin Quealy of The New York Times have a nice visualization of the growing wage inequality experienced in almost every US city over the past four decades.  Their animation shows the gap between the 90th percentile wage and the 10th percentile wage; i.e., the difference between the average hourly earnings of a worker at the bottom of the top ten percent of all wage earners and those of a worker at the top of the bottom ten percent.  In most metro areas, the 90th percentile worker earned about four times as much as the 10th percentile worker in 1980; by 2015, this had increased to five to six times as much.

As we know, changes at the top and bottom of the wage distribution have accounted for this gap.  Some high paid occupations and industries have recorded much higher than average increases in wages. Conversely, at the low end of the labor market, the minimum wage has failed to keep pace with inflation. While the levels of wage inequality vary across metropolitan areas, the striking point here is that wage inequality has increased almost everywhere, suggesting that a common set of national factors, rather than principally local ones, are behind growing inequality.
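The 90/10 measure described above is straightforward to compute. Here is a minimal Python sketch with made-up hourly wages (illustrating the metric only, not the Times’ actual data or their exact percentile method):

```python
def percentile(values, p):
    """Linear-interpolation percentile (p in 0-100) of a list of numbers."""
    vals = sorted(values)
    k = (len(vals) - 1) * p / 100
    lo = int(k)
    hi = min(lo + 1, len(vals) - 1)
    return vals[lo] + (vals[hi] - vals[lo]) * (k - lo)


def p90_p10_ratio(wages):
    """Ratio of the 90th to the 10th percentile wage: the inequality
    measure used in the visualization described above."""
    return percentile(wages, 90) / percentile(wages, 10)


# Hypothetical hourly wages for a metro's workers.
wages = [8, 10, 12, 15, 18, 22, 26, 30, 36, 40]
print(round(p90_p10_ratio(wages), 2))  # 3.71
```

A rising ratio means the top of the distribution is pulling away from the bottom, which is exactly what the Times’ animation shows happening in metro after metro.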

3. Brookings calls for a fundamental re-thinking of federal infrastructure policy.  Everyone seems to agree that “infrastructure” is really important to the economy and the environment, but beyond glib generalities, there’s a world of nuance that makes all the difference. The Brookings Institution has a new report calling for revisiting the first principles guiding infrastructure policy. A big part of our current problems–especially climate change and economic segregation–is a product of the kinds of investments we’ve made in infrastructure.  More of the same–more freeways, more far-flung sewer and water systems and so on–is likely only to exacerbate sprawl and create more car dependence.  As the report’s lead author, Adie Tomer, writes:

Current infrastructure networks and land uses make our climate insecurity worse. The transportation sector is now the country’s top source of greenhouse gas emissions, representing 29% of the national total. The country continues to convert rural land into urban and suburban development faster than overall population growth, leading to greater stormwater runoff, higher fuel consumption, and accelerated loss of tree cover, wetlands, and other natural resources.

The default political trajectory for infrastructure is simply to plow more money into existing programs, like the highway trust fund, that simply repeat the mistakes of the past.  Brookings is calling for a careful look at how we change the framework and incentives around our infrastructure investment to align with the challenges we now face, particularly the climate crisis and growing inequality.  That conversation is long overdue.

New Knowledge

Food deserts?  Fuggedaboutit.  One of the most popular urban theories of the past decade or so has been the notion of food deserts, the idea that low income households have poor nutrition because they don’t have local stores that sell healthy food. The policy implication has been taken to be that if we could just get more healthy food closer to low income households, their health status would improve.

A recent study from six economists uses extremely rich and detailed data on consumer spending patterns, and traces the effects of the opening of new supermarkets on the buying patterns of nearby residents.  It finds, contrary to popular belief, that significant changes to the local array of “good foods” have almost no effect on those buying patterns. For example, the entry of a new supermarket into a food desert tends to reduce sales at other nearby supermarkets, rather than shifting the composition of consumer spending away from less healthy stores (like convenience stores).

The authors show that there are strong correlations between income and education, and the propensity of consumers to purchase healthier foods.  By their estimates, only a tiny fraction of the variation in healthy food consumption is explained by supply-side factors (i.e. proximity to stores with healthier food).

Entry of a new supermarket has economically small effects on healthy grocery purchases, and we can conclude that differential local supermarket density explains no more than about 1.5 percent of the difference in healthy eating between high- and low-income households. The data clearly show why this is the case: Americans travel a long way for shopping, so even people who live in “food deserts” with no supermarkets get most of their groceries from supermarkets. Entry of a new supermarket nearby therefore mostly diverts purchases from other supermarkets. This analysis reframes the discussion of food deserts in two ways. First, the notion of a “food desert” is misleading if it is based on a market definition that understates consumers’ willingness-to-travel. Second, any benefits of “combatting” food deserts derive less from healthy eating and more from reducing travel costs.

If we’re really concerned about nutrition, the authors argue that some strategic changes to SNAP (the Supplemental Nutrition Assistance Program, long called food stamps) might be a more powerful way to encourage consumers to make healthier choices.  Right now, SNAP provides the same subsidy for low nutrition foods as for high nutrition foods; a system that lowered or eliminated subsidies for the least nutritious foods and increased them for the most nutritious ones would generate healthier purchases.

Hunt Allcott, Rebecca Diamond, Jean-Pierre Dubé, Jessie Handbury, Ilya Rahkovsky, and Molly Schnell, Food Deserts and the Causes of Nutritional Inequality,  NBER Working Paper No. 24094

In addition to the technical working paper, the lead authors have published a short, eminently readable non-technical summary of their findings at The Conversation.

In the News

Willamette Week and BikePortland highlighted our debunking of Oregon Department of Transportation official claims that wider freeways will reduce greenhouse gas emissions.

The Week Observed, December 20, 2019

What City Observatory this week

1. Portland’s progress (or lack thereof) on climate. Portland likes to present itself as a climate leader, but the latest data on transportation-related greenhouse gas emissions shows that Portland is losing ground in a big way. Portland’s transportation greenhouse gas emissions have increased by 1,000 pounds per person since 2013, a major setback to city efforts to set an example of how to achieve globally agreed upon climate goals.

2. Lessons in the price elasticity of demand for transportation planners.  Perhaps the most fundamental notion in economics is the idea that when the price of something goes up, people buy less of it.  International comparisons, and the nation’s own experience with gas prices over the past two decades, confirm that this applies to gasoline and driving:  high gasoline prices are associated with lower levels of driving. The high and rising price of gas from 2004 to 2014 was associated with an unprecedented decline in per capita driving, a trend that was unfortunately reversed when oil prices collapsed in the second half of 2014 (triggering more driving, more SUV purchases, and, not surprisingly, more crashes and road deaths). Prices exert a powerful effect on behavior; unfortunately many transportation planners ignore or deny the role that prices play in shaping urban transportation and land use. More fully reflecting back to users the costs associated with their choices would make it much easier to achieve our environmental and safety objectives.

Must read

1. The high cost of driving. One of our maxims at City Observatory is that when it comes to driving, the price is wrong. The decision to rely heavily on private automobiles as our primary means of urban transport is shaped by the fact that car travel is dramatically under-priced and heavily subsidized.  A new study from the Massachusetts Institute of Technology compiles the total costs associated with car travel in the Bay State.  It finds that cars cost the average household $14,000 per year, with most of those costs due and payable whether or not one owns a car.  Less than half of the cost of car transportation ($27.4 billion of the $64 billion) consists of private costs paid by drivers.  Most of the cost is loaded onto the public sector, and user fees (like gas taxes and vehicle registration fees) cover less than a third of the public sector’s costs of providing roads, parking, emergency services and the like, as well as dealing with pollution and environmental consequences. Thus, most of the cost of car travel is hidden, rolled up in all sorts of other bills, concealing from drivers (and citizens) the high cost of our auto-dependent transportation system.

2. Jane Jacobs & Robert Moses: the graphic novella. Sarah Mirk and Jackie Roche have collaborated to produce a short graphic novel relating some of the lessons of Jane Jacobs’s famous “The Death and Life of Great American Cities.” Comic books, of course, need a super-villain, and real life has supplied one: Robert Moses, New York’s master builder and, indeed, “The Power Broker.”

The extraordinary illustrations bring Jacobs, and some of her key insights, to life (see the sidewalk ballet, below); if you haven’t read her books or Robert Caro’s biography of Moses, this will whet your appetite; if you have, it will remind you of the larger-than-life character of these two adversaries.

There’s a lot to like, and this is highly recommended, but we found one small annoyance: Mirk and Roche claim that Jacobs’s insights about the desirability of urban living have been “corrupted” by exploitative developers, whom they blame for rising rents and gentrification. In our view, that misses the fact that the high and rising rents in dense walkable neighborhoods are really a product of their scarcity, and of Jacobs’s views neatly capturing what more and more Americans, especially younger generations, are hungering for: more great urban places. But that’s a minor quibble: the whole thing is a visual delight, and is available on-line at the Nib; be sure to have a look.

3. You can’t spell “Hyperloop” without hype. Why bother with the difficult nitty-gritty of existing technologies like buses and trains when you can engage in the magical thinking of the hyperloop? Never mind that the technology is utterly unproven and the costs are unknown; cities around the country are treating it as a viable, if not preferred, alternative for a range of transportation challenges. While most media coverage is of the fawning variety, Aaron Gordon of Jalopnik takes a refreshingly skeptical view of the technology in a piece entitled “Hyperloop is the Midwest’s Answer to a Question that No One is Asking.” Gordon takes a close look at a proposal to build a hyperloop between Cleveland and Chicago. Backers have produced a study claiming that it will produce hundreds of billions of dollars in increased property values and a range of other benefits. Like similar claims for sports stadiums, such studies are generally grossly inflated, and fail to consider the redistributive effects.  Gordon looks for details, and finds them lacking:

Despite the study’s 156-page length, it is extremely light on methodology or the assumptions baked into the calculations. In fact, any mention of study methodology or assumptions directs inquiring minds to an appendix. However, the feasibility study does not have an appendix…

New Knowledge

Maybe Uber and Lyft are increasing transit ridership.  One of the most oft-voiced concerns about the growth of ride-hailing is the idea that it is cannibalizing ridership from transit systems.  A new study from three economists looks at the connection between the growth of ride-hailing and transit ridership trends in cities across the country.

Like other studies, they proxy for the extent of ride-hailing by looking at the date Uber and Lyft entered a particular market, and also use data on the number of Google searches for the companies’ brand names to estimate growth in ride-hailing.  They then look to see whether entry is correlated with increases or decreases in transit ridership in the area.

In the aggregate, the authors found that transit ridership increased after Uber’s entry into the market.  Their overall data, summarized below, show that transit ridership was essentially flat in the months preceding the entry of ride-hailing, and then tended to increase somewhat in the following months.

The effect varied across markets and transit providers, with large markets seeing the greatest complementary impact (i.e., ride-hailing being associated with increases in transit ridership).

One challenge, as we’ve noted with other similar studies of ride-hailing, is that the number of months since entry into a market, or the number of searches, is at best a crude proxy for ride-hailing, and misses the distinct temporal and spatial concentration of ride-hailing activity (mostly on weekend nights, in downtowns, and to and from airports). Ride-hailing is much more likely to affect transit ridership in some locations and at some times than others, and these studies don’t exploit these variations to understand impacts. As the authors note, this is a subject that deserves further exploration.

Jonathan Hall, Craig Palsson & Joseph Price, “Is Uber a substitute or complement for public transit?,” Journal of Urban Economics (108:36-50) 2019.

Hat-tip to Bloomberg’s Noah Smith for highlighting this study.

In the News

Joe Cortright’s Op-Ed, “Portland’s phony, failing climate policy,” was published in the December 14, 2020 Oregonian.

Oregon Public Broadcasting quoted City Observatory’s Joe Cortright as applauding Governor Kate Brown’s call for the Oregon Transportation Commission to closely examine using congestion pricing to address Portland’s transportation problems.

Happy Holidays!

City Observatory will be on holiday break through the end of the calendar year. We’ll return with our regular commentary and “The Week Observed” in January. To all, a joyous holiday season and a Happy New Year.

The Week Observed, December 6, 2019

What City Observatory did the past couple of weeks

1. Using seismic scare stories to sell freeways. The Pacific Northwest is living on the edge; sometime (possibly tomorrow, possibly several hundred years from now) we’ll experience a Cascadia subduction earthquake that will do significant damage to the region’s infrastructure.  Fear of that event is real, but it is also a potent talking point for selling freeway expansion. In a guest commentary, Robert Liberty looks at claims made by officials of the Oregon Department of Transportation that the I-5 bridges over the Columbia River are vulnerable to the quake. As Liberty points out, the department’s own assessments of seismic risk show that these bridges are no more at risk than others.

2. Why Cyber Monday won’t cause Gridlock Tuesday. Black Friday, the biggest brick-and-mortar shopping day, is here, soon to be followed by Cyber Monday’s on-line seasonal peak. The growth of e-commerce has fueled rampant speculation that city streets will be clogged by Fedex and UPS trucks delivering an ever larger stream of packages. That concern is misplaced: every package delivery means fewer miles traveled for shopping; MIT transportation expert William Wheaton estimates that e-commerce produces about one-thirtieth the vehicle miles of travel per dollar spent of brick-and-mortar shopping. National travel data bear this out: shopping travel has declined in the past decade, especially among young adults, who do the most on-line shopping. There’s another factor at work here as well: the more packages they deliver, the more efficient UPS and Fedex become. More packages mean a higher “delivery density” (i.e., fewer miles traveled per package), which increases their advantage over brick-and-mortar shopping as volumes increase. So when you see that Fedex truck dropping off a package, keep in mind that it means fewer cars on the road to the mall.
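The “delivery density” point can be illustrated with a toy calculation (the route length, package counts, and shopping-trip distance below are hypothetical, chosen for illustration rather than drawn from Wheaton’s estimates):

```python
def miles_per_package(route_miles: float, packages: int) -> float:
    """Delivery miles attributable to each package on a shared route."""
    return route_miles / packages

# The same 50-mile route, at two different package volumes:
low_volume = miles_per_package(50, 100)    # 0.50 miles per package
high_volume = miles_per_package(50, 200)   # 0.25 miles per package

# Compare with a hypothetical 10-mile round trip to a store for one purchase:
shopping_trip_miles = 10.0
```

Doubling the package volume on a fixed route halves the miles per package, which is why delivery’s mileage advantage over individual store trips grows as e-commerce volumes rise.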

Must read

1. New York contemplates the end of free parking. The New York Times reports that there are serious discussions about eliminating free parking in much of the city. There are an estimated 3 million on-street parking spaces in the city of New York, about one for every three New Yorkers, and more than 95 percent of them are free. Because they’re unpriced, they’re chronically in short supply. They’re also a huge subsidy from the majority of New Yorkers who don’t own cars to those residents (and non-residents) who do. Giving away valuable public street space for private car storage cripples the city’s ability to move people more efficiently and safely by transit, bikes, and walking. An interesting historical note: New York City only legalized on-street, overnight parking in the 1940s; at the time the city considered a $60 monthly fee (which works out to about $640 in today’s money); instead, it let car owners use the streets for free, subject to the city’s arcane “alternate side of the street” parking rules.  A city of 10 million people with 3 million “free” parking spaces will always have too many cars, too much traffic, and too little money to fix its transportation problems.

2. Airports and Climate Change:  As we all know, transportation is now the largest source of greenhouse gas emissions in the US. While the bulk of emissions come from our cars and trucks, a large and growing share comes from increased flying. While there’s growing consciousness of personal culpability for emissions, aka “flight shaming,” Curbed’s Alissa Walker makes a strong case that our policies for subsidizing air travel, and especially airport construction, contribute to air travel emissions.  The federal government collects billions in ticket taxes, which are funneled back into airport subsidies. Local governments also finance airport construction, and especially land-side infrastructure (parking lots and car rental facilities) that combines one source of greenhouse gas emissions (planes) with another (cars).  Metro New York will spend $28 billion upgrading and expanding its airports; what if, instead, it spent that much on inter-city rail transportation, Walker asks? We all have tough choices to make if we’re going to reduce our climate impact, and that task isn’t made any easier by these systemic biases in favor of high-carbon modes of travel.

3. End Apartment Bans to Save the Planet.  Sightline Institute calls our attention to a new report from the United Nations which definitely raises the stakes for housing policy. You may think of the YIMBY “Yes in my back yard” movement as a highly local effort primarily concerned about housing affordability. But as the UN report makes clear, legalizing apartments in urban centers is an essential strategy for fighting climate change. In the US and elsewhere, the most common land use policies have made it difficult or impossible to build additional density in the most walkable, transit served locations, with the result that new development is pushed to the urban fringe, creating car-dependent housing patterns that will be locked in place for decades. As the report says:

“In some locations, spatial planning prevents the construction of multifamily residences and locks in suburban forms at high social and environmental costs.”

Multifamily buildings have a lower carbon footprint both because they tend to be smaller (per person) than single family homes, and because common wall construction reduces energy consumption. But more importantly, multifamily buildings in dense urban settings reduce vehicle miles of travel and carbon emissions from transportation. As this report suggests, YIMBY policies to allow more housing are important both for improving housing affordability locally, and tackling climate change globally.

Put this in your backyard to lower housing costs and save the planet.

New Knowledge

How Title I discourages integration. One of the most important federal education programs is Title I, which provides supplemental funding for elementary and secondary schools with a high proportion of low income students. Schools with lots of kids from low income families often face a double challenge: because of low property values, they have limited local financial resources, and the concentration of kids with economic and educational challenges makes it more costly to provide an equivalent education. Title I helps offset this difference by providing additional funding to schools in poor cities and neighborhoods.

But there’s a rub: As a new report from the National Coalition for School Diversity points out, because the Title I funding is tied to the fraction of kids in a school or school district from families below the poverty line, when a school or district does a better job of integrating, it can face the loss of funds. Federal and local administrative policies for Title I typically set some thresholds for qualifying for Title I funding, and when a school falls below that line, it can lose funding. This creates a penalty for schools that integrate. One school in Brooklyn fell just below a local 60 percent poverty threshold and lost $120,000 in annual funding.

It’s a gnarly problem:  Title I funds are limited, and the intent of the law is to target money to schools that have the highest concentrations of need. But the data on the value of economic integration are extremely powerful: the less we concentrate kids from low income families in specific schools or districts, the better their educational outcomes. One suggested solution is to have a “hold harmless” provision that protects schools (at least for a period of time) from a reduction in Title I funding if they fall just below an eligibility threshold.
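The funding “cliff” that creates the integration penalty can be sketched in a few lines (a stylized model; the 60 percent threshold and $120,000 grant echo the Brooklyn example above, but real Title I formulas are far more complex):

```python
def title_i_grant(poverty_share: float, threshold: float = 0.60, grant: int = 120_000) -> float:
    """Stylized cliff rule: full grant at or above the threshold, nothing below."""
    return grant if poverty_share >= threshold else 0

# A school that integrates, moving from 61% to 59% poverty, loses the entire grant:
penalty = title_i_grant(0.61) - title_i_grant(0.59)  # 120000

def with_hold_harmless(share: float, prior_share: float, threshold: float = 0.60,
                       grant: int = 120_000, retained: float = 0.5) -> float:
    """Hold-harmless sketch: a school falling just below the line
    temporarily keeps a portion of its prior grant."""
    if share >= threshold:
        return grant
    if prior_share >= threshold:
        return grant * retained  # partial, time-limited protection
    return 0
```

Under the hold-harmless sketch, the same school would keep $60,000 for a transition period instead of losing everything at once, removing the sharp penalty for integrating.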

National Coalition for School Diversity, Title I Funding and School Integration: The Current Funding Formula’s Disincentives to Deconcentrate Poverty and Potential Ways Forward, Diversity Issue Brief No. 9, November 2019

In the News

In a blog post entitled “The $140 million lie,” The Vancouver Columbian praised our analysis of federal funding requirements for the proposed Columbia River Crossing as “meticulous.” (It turns out that if Oregon and Washington choose a “No-Build” alternative, they aren’t required to repay federal planning funds, contrary to a claim made by state DOT officials.)

Correction:  Our November 19 Week Observed mistakenly placed Bloomington, Indiana in a different state; we regret the error.

The Week Observed, November 22, 2019

What City Observatory did this week

1. No Deposit, No Return: Another lie to try to sell the $3 billion Columbia River Crossing. The states of Oregon and Washington spent nearly $200 million planning the failed Columbia River Crossing, a 12-lane, five-mile-long freeway project connecting Portland and Vancouver, Washington. The two states walked away from the project over irreconcilable disagreements about the project and its financing. The two states’ departments of transportation have urged a revival of the project, saying that if they don’t break ground in the next five years, they’ll have to repay the federal government $140 million. That’s panicked some political leaders into throwing an additional $44 million in new money after the $200 million already spent. But in fact, there’s no requirement that the money be repaid:  Federal Highway Administration regulations specifically waive repayment if local governments choose the “No-Build” alternative as part of the environmental review. The original CRC proposal was plagued with mendacity: false traffic projections, and designs for an unbuildable, and then a too-low, bridge. The revived project seems to be setting out on the same dishonest path.

2. To solve the housing shortage, build a landlord. There’s growing interest in policies that re-legalize so-called “missing middle housing”—duplexes, triplexes and fourplexes—and that make it easier to add accessory dwelling units (ADUs) in single family zones. All of these policies envision “gentle density,” adding a few units in built-up neighborhoods.  All well and good, but as City Observatory friend Ethan Seltzer argues in this commentary, all of these units will require new landlords. If we’re going to incentivize the development of duplexes and ADUs, we’ll want it to be relatively straightforward to be a landlord for such units. It’s likely that growing tenants’ rights efforts will make it more challenging and less desirable to be a landlord, which could have the unfortunate effect of discouraging the development of in-fill housing.

Must read

1. Why does it take so long to implement missing middle housing reform? Here’s a plaintive cry from Sightline Institute’s Dan Bertolet, who reflects on the protracted struggle to actually implement “missing middle” housing policies by re-legalizing duplexes, triplexes and fourplexes, as well as accessory dwelling units, in single family residential zones.  Policy analysts and political leaders have increasingly come to realize that this kind of “gentle density” is a logical way to ease housing shortages and improve housing affordability, and many cities and leaders have pledged to get on with such policies. But the going has been extremely slow; as Bertolet points out, it’s taken Seattle nearly five years to get close to implementing its decision to allow secondary cottages (accessory dwelling units); a forthcoming effort to legalize duplexes and row-houses, by Bertolet’s calculations, may take another five years, meaning that it may finally be implemented nearly a decade after everyone agreed it was a good idea.

The underlying problem? There’s just so much resistance built into city politics and the local land use planning process, which has historically been dominated by NIMBY factions and is designed to move slowly and change little. The solution, as Bertolet notes, is to have state governments intervene. They tend not to be as politically compromised as cities, and state mandates and overrides can also help break through the “prisoner’s dilemma” concern of individual neighborhoods that they alone will shoulder increases in density. Even with strong state prodding, it’s still going to take a lot of work to move missing middle housing forward.

2. Buff Brown on the soft bigotry of low expectations in city climate strategies. Bloomington, Indiana is in many ways a model liberal college town; it skews blue politically, and like many cities it has adopted a sustainability plan and endorsed the Paris climate accords. But like a lot of cities, its biggest source of greenhouse gas emissions is transportation, and it’s losing ground. As transportation planner Buff Brown points out, the city’s vehicle miles traveled per capita have increased more than 11 percent in the past few years. And while the city recognizes it needs to lessen driving and get more people on transit, its actions don’t come close to measuring up to its rhetoric.  As Brown notes,

. . . the Action Plan has very weak goals with no math to tie them to the emission goals, and the lists of actions are orders of magnitude below what is necessary to meet these targets.

For example, the city’s goal is to increase transit ridership by 5 percent over the next six years, a gain of less than 1 percent a year. This falls far short of what the city actually accomplished between 2002 and 2010, when transit ridership increased almost 9 percent per year.  And it makes even more alarming the fact that the city has recorded a 6 percent decline in ridership in the past year. The city’s goal is at best a feeble one, far less ambitious than what it actually managed in the past, and current performance shows a decline that should be signaling the need for much bolder action.
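The gap between the city’s goal and its past performance is easy to check with simple compounding arithmetic, using only the figures quoted above:

```python
def annualized_growth(total_growth: float, years: int) -> float:
    """Convert a cumulative percentage change into a compound annual growth rate."""
    return (1 + total_growth) ** (1 / years) - 1

goal_rate = annualized_growth(0.05, 6)   # ~0.8 percent per year
historical_rate = 0.09                   # roughly 9 percent per year, 2002-2010

# The stated goal is about an order of magnitude below past performance:
ratio = historical_rate / goal_rate
```

By this arithmetic the goal works out to roughly 0.8 percent a year, about one-eleventh of the growth rate the city actually achieved in the 2000s.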

At the same time transit ridership is declining, the city is moving forward with a plan to use local tax increment financing (TIF) monies to subsidize construction of a parking garage. The city is planning to spend $29 million in TIF funds to expand one parking garage by 200 spaces and build another with 379 spaces.  If cities are serious about climate change, this is plainly the wrong set of priorities.

3. Even down under, induced demand. A new report from Australia confirms what science is showing worldwide:  building more highway capacity simply stimulates more driving, and does nothing to ease congestion. The report, commissioned by the Australian ride-hailing company Go-Get, concludes:

City-dwelling Australians need to end their love affair with private cars and stop building new roads to beat congestion . . . building new roads or expanding existing infrastructure . . .  signals to drivers that commuting will be easier so more road users fill the newly created space, which is known as “induced demand”.

The solution for transportation problems is largely found in how we build our communities. Places that are dense, diverse and walkable enable people to meet many daily needs without car travel. And it turns out that these are exactly the kind of places for which there is robust demand.

“We find that communities that do have walkability, that have local cafes and amenities, they are the most in demand with consumers looking for a home, they have the highest house prices and demand.”

New Knowledge

It’s no accident: How we talk about crashes matters. It’s commonplace for media reports of car crashes to describe them as “accidents,” and to systematically downplay driver responsibility and instead blame victims.  If you wore dark clothing, or were outside a marked crosswalk, or it was dark, your injury or death is implicitly your fault.

A new study published by the Transportation Research Board examines the way that common descriptions of traffic crashes affect readers’ perceptions of who’s responsible.  The authors presented subjects with three alternate text descriptions of a single car crash.  One version was pedestrian focused, a second version was driver focused, and a third was driver focused and provided additional context about the frequency and location of car crashes. The key finding was that the way in which the crash was described made a strong impact on readers’ decisions about who was to blame and what policies ought to be pursued to reduce crashes and injuries.  A more complete and driver focused description of crashes led to more support, for example, for infrastructure solutions.

The authors helpfully offer some succinct advice for reporters and editors, and it’s a grammar for talking about roadway deaths that all of us should embrace.  To translate one bit of unfortunate academese: “non-agentive” language ignores humans and their choices. For example, don’t say “the pedestrian was struck by the car”; say “the driver ran over the pedestrian.”

  • Avoid non-agentive and object-based language.
  • Shift the focus away from the pedestrian and towards the driver (or if necessary, the vehicle).
  • Be conscious of the counterfactuals that you include. Specifically, if you mention that the pedestrian was outside a crosswalk, check Google Street View to quickly determine whether there are any crosswalks available and note that in many jurisdictions it is legal to cross outside marked crosswalks.
  • Include data on the number of crashes, injuries or deaths, preferably locally. Time permitting, consider contacting a local transportation, urban planning, or public health expert to provide further context.

Tara Goddard, Kelcie Ralph, Calvin G. Thigpen, Evan Iacobucci, “Does news coverage of traffic crashes affect perceived blame and preferred solutions? Evidence from an experiment,” Transportation Research Interdisciplinary Perspectives, 2019.

In the News

Willamette Week wrote about our commentary on the untrue claim that not proceeding with the $3 billion Columbia River Crossing will require Oregon and Washington to repay $140 million to the federal government, in an article entitled:  “One less reason to restart the Columbia River Crossing.”

The Oregonian featured our critique of this same claim about a federal repayment liability in its story discussing Governor Kate Brown and Governor Jay Inslee’s plan to revive the Columbia River Crossing project.


The Week Observed, November 15, 2019

What City Observatory did this week

1.  Copenhagen’s cycling success hinges on tax policy and pricing, not just bike lanes.  The New York Times offers up yet another postcard view of cycling in Copenhagen, where riding a bike to school or work is the most common mode of transportation.  The Times reports that this is because cycling is just “easier” and “more convenient” due to the city’s extensive network of bike lanes.  That’s true, but it leaves out an important part of the story.  In Copenhagen, new cars are heavily taxed (a 150 percent sales tax), and gasoline costs more than $6 per gallon.  In addition, parking is in short supply, and the city is raising the price of parking to further discourage car use.  It’s also critical to recognize that the city’s compact development pattern means more common destinations are close at hand, making cycling a viable means of transportation for daily trips. By all means, let’s build more bike lanes; but getting a more balanced transportation system also depends critically on asking drivers to pay something close to the full social, environmental, and economic costs they impose on all of us.

2. Carmaggedon is a no-show in Seattle once again. Earlier this year, when Seattle shut down its crumbling Alaskan Way viaduct, traffic experts widely predicted gridlock, expecting more than 90,000 cars to divert to city streets. That didn’t happen, and in fact traffic levels were subdued. The same predictions were made about the effects of turning on tolling on the new $3 billion SR 99 downtown tunnel (the roadway built to replace the viaduct). Traffic engineers predicted a third or more of tunnel users would divert to city streets, producing Carmaggedon. Once again, the predictions were wrong, at least as evidenced by Google Maps contemporaneous reporting of road conditions.

What Seattle’s experience shows–again–is that measures that reduce road capacity, or price roadways, actually reduce traffic levels. It’s the opposite of “induced demand”–reducing highway capacity in the city reduces the number of cars flowing onto city streets, and enables the transportation system to work better. The naive models that highway engineers peddle, implying that traffic is an inexorable, tidal force, are simply wrong. If you want less traffic and congestion, price roads and constrain highway capacity.

3. The city as a labor-saving device.  Some of the most powerfully transformative technologies have been those that save human labor; indoor plumbing, washing machines, refrigerators, vacuum cleaners and the like have all dramatically reduced the amount of domestic work, and given us more leisure time. While we don’t think of cities as a “technology,” a dense, mixed-use urban neighborhood is actually a big time saver: with more common destinations (grocery stores, coffee shops, restaurants, and friends) close at hand, daily errands take less time. A big part of the demand for urban living comes from households looking to live in places that save them the one form of wealth you can’t get more of:  time.

The latest in labor-saving technology: a walkable city.

Must read

1. The limits of micro-transit. The problem with buses, rail lines, and other “fixed route” transit is that unless you live at a station or bus stop, and all your destinations are adjacent to stops, you have to walk to and from the places the transit serves. One regularly imagined solution to this problem is micro-transit:  small vehicles that would ply varying routes, picking up people at their homes or origins in one area, and then ferrying them to a common destination (or destinations).  In theory, that ought to eliminate pesky walking, and given enough trip density (and computerized scheduling), it ought to be possible to pool multiple trips in a single vehicle. The trouble with that idea, though (as Human Transit’s Jarrett Walker has frequently pointed out), is that it is inefficient and expensive.  For those looking for a succinct statement of why microtransit is (mostly) a dead-end, Human Transit has a new policy brief that makes that case.  Here’s an excerpt:

Picking people up at their doorstep involves traveling greater distances than operating service along a fixed route, and a microtransit driver in a van or car can carry far fewer people than the operator of a bus or train. For these reasons, microtransit typically costs agencies much, much more to run than an average bus route. And while subsidies for bus and train service fall as more people ride, microtransit is locked into a high-cost format that consumes more subsidies as usage increases.

2. Why more cyclists are dying. The National Transportation Safety Board has just reported the grim death tally for 2018: some 857 cyclists were killed on the road, the highest number in decades. In the face of a roundly (and justifiably) criticized call by the NTSB to make helmets mandatory, Bicyclist has a thoughtful look at the reasons why cyclists’ deaths are increasing.  By identifying the causes of deaths, this analysis points to a more reasonable set of recommendations for reducing the toll. Chief among the causes: more people are driving more miles (we might add, due to cheaper gasoline), more drivers are distracted by technology like smart phones, and more of the cars on the road are especially lethal trucks and truck-like SUVs. It’s also true that more people are cycling than in previous years, but it appears the “safety-in-numbers” effect of more regularly encountering cyclists has been more than offset by the other negatives. As a result, cities that only a short while ago adopted great-sounding Vision Zero goals have seen this particular metric going in the wrong direction.

3. New cars are for old people. Michael Sivak has an interesting analysis of the shift in the demographics of car buyers in the past decade.  Sales data show that new car buyers are proportionately older than ever. Sivak reports that 52 percent of all car buyers are 55 or older. Only 28 percent of car buyers are under age 45, down from 45 percent of all buyers just a decade ago. Much of this trend is driven by the overall aging of the population (millions more baby boomers are turning 65 every year), but much of it is evidence of the declining interest (and perhaps purchasing power) of young buyers.

New Knowledge

More evidence of the ineffectiveness of rent control.  Around Germany, municipalities have implemented new rent control regimes in the past few years, and the roll-out of the program has provided researchers with a natural experiment for testing the actual effects of rent control against the theoretical predictions. Three researchers, Andreas Mense, Claus Michelsen & Konstantin Cholodilin, gathered data on the offering rents for apartments in German cities.

As the authors relate, much as in the United States, the growing interest in urban living has fueled youth migration to cities, which has led to a shortage of housing and an increase in rents.  This was the proximate cause of legislation authorizing local rent caps.

. . . since 2010, urban agglomerations have become more attractive. Thanks to an inflow of migrants from smaller cities and from abroad, the population of large German cities began to expand quickly. The result was a housing shortage, particularly putting pressure on rents for new contracts

Like many rent control systems, the German approach regulates the rents of some units, but not others. One of the key findings of this study is that while units under rent control are less expensive, rents go up even more in the uncontrolled sector.  In addition, rent control isn’t “means tested,” meaning that people can benefit from rent control regardless of income.  And one of the theoretical predictions of economics is that rent control will tend to artificially discourage people from moving to a new home.

They find that:

. . . regulated rents decreased immediately after the rent cap became effective, while rents in the free-market segment increased after a lag of one to two months . . . an increase in free-market rents in response to rent control is a clear sign of misallocation of households to housing units . . . 

Rent control allows some households, which otherwise would have been unwilling to rent a unit in the market, to compete for rent-controlled units, thereby replacing other households with higher willingness to pay. The latter households move to the free-market segment and bid up the price there.

Andreas Mense, Claus Michelsen & Konstantin Cholodilin, “Rent control, market segmentation, and misallocation: Causal evidence from a large-scale policy intervention,” FAU Discussion Papers in Economics, No. 06/2019, Friedrich-Alexander-Universität Erlangen-Nürnberg, Institute for Economics, Erlangen, 2019.

In the News

The New York Times quoted City Observatory’s report Less in Common in its article “Are my neighbors spying on me?”

The Week Observed, November 8, 2019

What City Observatory did this week

A two cent solution to climate change?  Around the world, plastic bags are an environmental scourge, both in the form of litter (a nuisance) and as a threat to wildlife. In response, many cities have implemented plastic bag fees, asking consumers to pay a nickel or more per bag. The result of such policies is dramatic:  In Britain, plastic bag consumption has fallen almost 90 percent. Our thought is to apply the same idea to carbon pollution, asking people to pay about 2 cents per pound of carbon that they emit.  That two-cent-a-pound fee works out to about $40 a ton, which is in the range many economists recommend as a way to fundamentally shift incentives and reduce greenhouse gas emissions.  If we’re willing to charge people a nickel for a plastic bag, why don’t we charge 2 cents a pound for carbon?
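The per-pound-to-per-ton conversion is simple arithmetic, and worth making explicit: a US short ton is 2,000 pounds, so the fee scales by that factor. A minimal check:

```python
# Convert the proposed carbon fee from cents per pound to dollars per ton.
POUNDS_PER_TON = 2000        # US short ton
fee_per_pound = 0.02         # $0.02, i.e. 2 cents per pound

fee_per_ton = fee_per_pound * POUNDS_PER_TON
print(fee_per_ton)  # 40.0 -- the ~$40/ton figure cited in the text
```
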

 

Must read

1. Why all the panic over gentrification? The premise of Matthew L. Schuerman’s new book “Newcomers” is that “gentrification is all around us.”  In his review at Washington Monthly, Will Stancil asks whether gentrification is really as pervasive as Schuerman imagines:

As a demographic researcher, I decided to check. Using U.S. Census data, I looked at the share of people in New York, San Francisco, and Chicago living in places that met Schuerman’s definition of having gentrified between 2000 and 2016. In New York, it’s 3.1 percent of residents. In San Francisco, the number is 4.4 percent. In Chicago, it’s 4.8 percent.  . . . Using Newcomers’ own definition, the story of urban America is not a tidal wave of gentrification but creeping racial and economic transition. 

As City Observatory’s research, and that of others, has regularly shown, concentrated poverty and neighborhood decline are far more widespread and far more devastating for the urban poor than the relatively few instances of gentrification.  But for authors like Schuerman, who don’t dig deeply into the data, it’s unsurprising that perceptions of gentrification hinge greatly on lived personal experience. It turns out that gentrification is all around “us,” if “us” is defined as well-educated, upwardly mobile, urban-leaning intellectuals.  As Stancil writes:

Almost by definition, it is members of the urban professional class who are the most likely to be exposed to affluent neighborhoods in the late stages of gentrifying. Among movers and shakers in media and politics, gentrification may truly seem to be everywhere they go. Often, it’s because they’re bringing it with them. 

Gentrification seems to be one of those subjects where personal anecdotes regularly trump careful data. We’d be much more likely to effectively address neighborhood change if the policy dialog were better grounded in facts.

2. How to talk about reforming housing policy.  Sightline’s Michael Andersen opens the hood on the right way to talk about changing housing policy. Earlier this year, Oregon passed pioneering legislation that legalizes duplexes, triplexes and fourplexes in single-family residential zones across the state. One key to this breakthrough legislation was a conscious decision to talk about housing in a different way than we have in the past.  The Sightline Institute prepared a guide to the rhetoric of housing reform that underscores the key messages. For starters, rather than talk about “density” and “housing units,” Sightline urges advocates to adopt more meaningful and concrete terms that allude to specific kinds of homes, i.e. “re-legalizing duplexes.”  The umbrella term “missing middle housing” helps communicate what reformers are trying to achieve, places it in historical context, and defuses some of the fears that added density can raise.  Sightline has published its full communication guide, and it’s a helpful resource for all housing advocates.

3. Uber’s car killed a pedestrian because it was programmed to ignore people outside crosswalks.  More than a year ago, Elaine Herzberg was struck and killed by an Uber autonomous vehicle in Arizona. The National Transportation Safety Board has been investigating the crash, and newly released documents show that Uber’s sensors detected Herzberg several seconds before the crash, plenty of time to brake the car before striking her.  But the car’s software was programmed not to treat people outside of crosswalks as pedestrians.  The car’s software struggled to correctly categorize Herzberg, and by the time it alerted its on-board human supervisor of its confusion, it was too late to avert the crash. As Wired relates:

. . . despite the fact that the car detected Herzberg with more than enough time to stop, it was traveling at 43.5 mph when it struck her and threw her 75 feet. When the car first detected her presence, 5.6 seconds before impact, it classified her as a vehicle. Then it changed its mind to “other,” then to vehicle again, back to “other,” then to bicycle, then to “other” again, and finally back to bicycle.

It never guessed Herzberg was on foot for a simple, galling reason: Uber didn’t tell its car to look for pedestrians outside of crosswalks. “The system design did not include a consideration for jaywalking pedestrians,” the NTSB’s Vehicle Automation Report reads.

It appears that the software for autonomous vehicles is replicating the implicit bias of current road use conventions: and for pedestrians, that can have fatal consequences.

In the News

Willamette Week has a story reporting on the pushback climate advocates are offering to Portland’s proposed $3 billion, highway-heavy transportation bond measure, slated to go before voters next year.  The article quotes City Observatory’s Joe Cortright on the climate implications of highway “improvements.”

 

The Week Observed, November 1, 2019

What City Observatory did this week

1. Tim Bartik explains business incentives. States and cities spend about $50 billion a year on tax breaks and other incentives to try to influence business location decisions.  The nation’s leading scholar on the subject, the Upjohn Institute’s Tim Bartik, has a new book explaining, succinctly and in non-technical terms, whether these incentives work and what can be done to make them work better. Bartik’s research shows that roughly three-quarters of the time, incentives make no difference to business location choices–they’re simply wasteful. But it’s almost always impossible to convincingly argue that any individual deal is one of the three-quarters that are giveaways, and it’s very much in the interest of prospective recipients (and economic development agencies) to always argue that their incentive is making a difference. Ultimately, the reason incentive deals persist (and have grown to such huge proportions) is that the political calculus rewards deal-makers, who are seen as “doing something” to benefit the economy, while the costs are generally hidden and passed on to others. Because of this dynamic, we can’t expect incentives to go away, but Bartik has four concrete suggestions for policy makers looking to minimize the costs of incentives and maximize their benefits. You can buy printed copies of the book, or download a free PDF; either way, it’s a must-read for anyone interested in economic development.

2. Portland’s regional government is push-polling a climate change denial message. Portland and Oregon have both fallen behind in their stated efforts to reduce greenhouse gas emissions, largely due to increases in driving. At the same time, the regional government is moving forward with a multi-billion dollar transportation funding measure. It commissioned some survey research, ostensibly to gauge public attitudes on the subject, but in practice, the survey is a thinly veiled effort to foist some phony choices and factually inaccurate claims on the public. The survey repeatedly claims that greenhouse gas emissions can be reduced by eliminating congestion so that cars don’t have to idle in traffic so much–a thesis that has been disproven. Actually, the opposite is the case: measures that speed traffic create induced demand, and the carbon emissions from more driving increase greenhouse gas pollution, rather than decreasing it. The purpose of the survey is plainly to develop the talking points and media campaign for selling a funding measure. It’s dishonest and reprehensible for a public body to disseminate a phony and misleading message about the causes of, and solutions to, the climate crisis. This is what climate change denial looks like.

3. More evidence for the importance of talent to metropolitan economic success.  The Brookings Institution has a new report, “Talent-Driven Economic Development: A new vision and agenda for regional and state economies.” As the title suggests, the authors build the case for focusing on building skills and talent and outline the steps cities and states can take. Their research confirms the strong relationship between the educational attainment of the population and the productivity of the local economy.

It’s clear that working to increase the talent level of the local population is key, and the report describes a series of strategies for doing so, including more formal education (helping high school students reach college, and succeed once they’re there), and also helping and encouraging employers to provide additional training to their entire workforce (and not just those with the highest levels of skill, which tends to be the pattern). The sole caveat one might add to this report is the observation that talent is mobile, and well-educated workers tend to be gravitating to great urban locations. As a result, cities should look both to investments in increasing skills and talent, and also in building the kinds of great urban neighborhoods that will attract and retain talented workers.

 

Must read

1. Forget new technology, fixing transportation is about making the right choices. Henry Grabar has a perceptive essay at Slate debunking the fascination with technological solutions to our transportation problems, and emphasizing that how we build cities, and how we employ well-proven technologies like bikes, buses and elevators is really where the future lies. Hyperloops and autonomous vehicles get all the attention it seems, but miss the point that building walkable, human-scaled cities and neighborhoods rather than sprawling automobile-dominated ones is needed.

The tools we need to change transportation are right there in front of us. It’s not the lack of bleeding-edge technology that has stopped us from building cities where a person can live without owning a two-ton, $25,000 vehicle . . .  It’s not for want of “innovation” that we aren’t turning parking into parks, or traffic-clogged arterial roads like New York’s smoggy crosstown arteries into multimodal streets. It’s not the deferred promise of automation that stops us from charging people for the full, ice cap–melting cost of driving. The future of transportation is not about inventions. It’s about choices.

A big part of our problem is that we’re obsessed with movement, rather than with destinations. We treat transportation as an end in itself, rather than as a means of accessing people, opportunities and experiences. When we build systems to maximize speed and travel, we produce sprawling, automobile-scaled environments that are inimical to enjoying life. Just as the last great technology revolution (the automobile) only made things worse, newer iterations, like the autonomous vehicle and the hyperloop, will not overcome this essential problem. As Grabar writes:

A better world is possible, and it doesn’t start in the U.S. patent office or with what is on display at the Detroit Auto Show, the Paris Air Show, or the Consumer Electronics Show in Las Vegas. It won’t require supersonic travel tubes or cars that drive themselves. The ideas are here if we want them. We’ll have to rethink the trip, but even more than that, we’ll have to rethink the places we’re trying to connect.

(Slate)

2. Devin Bunten:  Housing affordability and gentrification. Writing at CityLab, economist Devin Bunten sets out to untangle the links between housing affordability and gentrification.  This is a thoughtful, and extremely well-written essay.  It makes many important and salient points in eloquent and persuasive fashion.  For example, Bunten writes:

Housing policies are designed to ensure that new neighborhood entrants are as rich or richer than those who arrived before them. The typical resident of multifamily housing in the U.S. earns half as much as the typical resident of a detached single-family home. A ban on apartments is a ban on these families. Within single-family-home neighborhoods, minimum lot sizes are wealth sieves.

While in some senses these problems are separable (affordability problems tend to be region-wide; gentrification is much more localized), it’s difficult to argue that they are un-related. Gentrification is a more severe and widespread problem in markets where housing shortages and affordability problems are most acute. In many cases, gentrification occurs when demand for high amenity housing can’t be met by building additional units in high amenity neighborhoods, chiefly because of zoning limitations and NIMBY opposition. Gentrification and housing affordability are both symptoms of our shortage of cities:  the growing demand for urban living is running headlong into serious constraints on building more housing in great urban neighborhoods. Gentrification and housing affordability may not be exactly the same problem, but an essential part of any solution to both problems is allowing for more housing to be built in cities.

3. How Los Angeles is planning to make its housing problems worse.  It’s well-established that Southern California’s housing affordability problems stem from the difficulty of building additional housing in the most desirable parts of the region. In theory, the state’s land use laws require the region to plan for enough space to accommodate housing demand, but there’s a hitch. The estimated regional total demand for new housing is allocated across different local jurisdictions under a process known as RHNA, the Regional Housing Needs Allocation.  The allocation is prepared by the Southern California Association of Governments, a confederation of cities and local agencies. The problem is that the allocations are skewed away from high demand locations. As Leonora Camner writes in an op-ed in the Los Angeles Times, the allocations bear little relation to where housing is needed.

Beverly Hills, which has nearly twice as many jobs (57,000) as people (34,400), needs only 1,373 new units of housing. Meanwhile, the desert city of Coachella, with a population of 42,400 and 8,500 jobs, will be expected to build a whopping 15,154 units.

This matters because building more housing at the urban fringe, far from job centers, makes the region even more car-dependent, and locks in long commutes and high levels of greenhouse gas emissions. It may seem like a minor technocratic detail, but tweaking these planning estimates to allow for more new housing where it’s most needed is essential to addressing housing affordability, transportation challenges, and climate change.

 

 

The Week Observed, October 18, 2019

What City Observatory did this week

1. Our 5th Anniversary.  October 17 marked 5 years since we started publishing our research and commentary at City Observatory. We reflect back on five years of work, and thank all those who made it possible, and especially you, our readers.

2. Rites of Way. A complex set of laws and customs, some legislated and enforced, but many unwritten, govern the way we use public space. For a century, we’ve progressively warped these rules to favor car movement. But there’s a lot we can do to redefine the priorities and informal rules that govern public spaces, notably urban roads. Simply erasing the center lines on urban streets, for example, effectively forces drivers to negotiate the use of space with other road users, rather than assuming they have unfettered rights on their side of a line. Experience in other countries shows that there are many possible regimes to guide how we make use of shared spaces; putting greater responsibility on individuals to look out for others, rather than commandeer the “right of way” could make the urban realm safer and more pleasant.

3. More capacity just creates more traffic. The engineering profession proffers the naive belief that car traffic is a kind of incompressible liquid:  If you constrain it in one place, it will flow out elsewhere. That incorrect analogy is regularly used to argue against projects that reduce car capacity. But time and again, when we reduce the road space dedicated to cars, traffic mostly goes away. That’s happened again in New York, where the “Miracle on 14th Street” (eliminating through car movements) has sped buses without any impact on adjacent streets, and it’s likely to happen again when Seattle finally starts charging tolls on its new $3 billion deep-bore tunnel under downtown. It’s time we thought of urban freeways and arterials as creating congestion, rather than relieving it.

Must read

1. More on the Miracle on 14th Street:  No gridlock on 13th and 15th Streets.  We (like others) have celebrated New York City’s success in speeding bus service on 14th Street by dedicating more right of way to buses. Critics fretted that banning cars on 14th would produce gridlock on parallel streets like 13th and 15th. With the experiment in place, the verdict is now in: The critics were wrong.
Travel monitoring firm Inrix checked its records before and after implementation of the “bus only policy” on 14th Street and found trivially small changes in traffic speeds, with declines ranging from 0.1 to 0.4 miles per hour, which Inrix regards as statistically insignificant. They conclude:
. . . initial congestion fears appear misplaced. According to INRIX data, the 14th St busway had no discernible performance changes to neighboring roads.
Can we just say what a big deal this is?  One of America’s leading traffic monitoring firms (one which we’ve occasionally excoriated for exaggerating congestion costs) says that closing a major urban thoroughfare to private car traffic had no negative effect on traffic congestion.  If you want less congestion in cities, you should be restricting car capacity, not building more.
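To see why declines of a few tenths of a mile per hour read as statistically insignificant, it helps to compare them with ordinary day-to-day variation in speeds. A hypothetical sketch (the speed values below are invented for illustration; they are not Inrix’s data):

```python
from statistics import mean, stdev

# Hypothetical before/after mean speeds (mph) on a street parallel to
# 14th St, one observation per weekday. Invented numbers, chosen so the
# decline falls in the 0.1-0.4 mph range Inrix reported.
before = [17.2, 16.8, 17.5, 16.9, 17.1, 17.4, 16.7, 17.0]  # pre-busway
after = [16.9, 16.7, 17.2, 16.8, 16.8, 17.1, 16.6, 16.9]   # post-busway

decline = mean(before) - mean(after)
spread = stdev(before)

print(round(decline, 2))  # 0.2 -- a couple tenths of a mph
print(round(spread, 2))   # day-to-day spread of comparable size
```

When the observed change is no larger than the routine noise in the data, as here, there’s no basis for claiming the busway degraded traffic on neighboring streets.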
2. Speed cameras save lives. New York’s legislature has authorized a big increase in the number of school zone speed cameras, and importantly, extended their hours of operation. The effect has been fast and dramatic: The number of crashes and injuries in the first week of school in New York City is down by roughly one-quarter from the previous two years. StreetsblogNY quotes State Senator Andrew Gounardes:
“Speed cameras work. Period. They change driver behavior and cause people to slow down, protecting New Yorkers from injury and death in traffic collisions.”
3. Steve Higashide: Demand more from the transportation bill. TransitCenter’s Steve Higashide has a no-holds-barred take on proposed congressional transportation legislation. In the face of a growing climate crisis, it’s essentially business as usual and highway widening as usual. The $287 billion bill has a $10 billion sop for those concerned that transportation is now the largest (and growing) source of greenhouse gases, but as Higashide points out:
The new bill “acknowledges” climate change in the same way that sending a pallet of bottled water to Flint, Michigan, “acknowledges the lead crisis.”
While the bill nominally earmarks $10 billion to deal with climate change, a significant share of that will be used to protect existing infrastructure from ongoing climate change (i.e., flood-proofing roads) rather than to reduce vehicle miles traveled and carbon emissions.  The bulk of the bill, which is as yet unfunded, represents a continuing subsidy for more road-building, more driving, and more climate destruction. Higashide has a new book, “Better Buses, Better Cities.” It should be on the congressional required reading list.

New Knowledge

Redlining’s fading relevance.  In the 1930s, the federal government codified the existing widespread racialized attitudes about local housing markets when the federal Home Owners’ Loan Corporation published maps delineating desirable and undesirable neighborhoods.  (While the feds appropriately get the rap for this, it’s important to note that the maps were largely compiled by local real estate professionals hired under the New Deal, and reflected established local biases.) The legacy of redlining cast a long shadow over those neighborhoods, and led to systematic disinvestment for decades.  But today, eight decades later, are these redlined areas still relevant to policy?  Several presidential candidates have proposed targeting federal investment to areas redlined in the 1930s.

Brookings Institution’s Andre Perry and David Harshberger question the relevance of 1930s redlining to 2020 policy-making. They analyze the location and demographics of these redlined areas today, using the most recent available Census data. When they were drawn, redlined areas closely mirrored neighborhoods with high shares of black residents.  But that’s no longer true today, as Perry and Harshberger’s analysis shows:

More generally, the redlined areas are no longer a good match for the hardest-hit neighborhoods in US metro areas. Many formerly redlined areas have changed dramatically, having either depopulated or revitalized, and no longer hold many poor families. Conversely, areas that weren’t poor in the 1930s (think of the first-tier suburbs developed in the 1940s and 1950s) hold a significant number of low income households. As a result, using old-timey redlining maps to define today’s target areas doesn’t make much sense.  And there are other issues as well: the American population was distributed very differently across metro areas eight decades ago; while rustbelt cities have large redlined areas, Western and Sunbelt cities have very few.  While redlining was a real factor contributing to urban disinvestment then, those old maps are a poor guide to where we need to invest now.

Andre Perry and David Harshberger, America’s formerly redlined neighborhoods have changed, and so must solutions to rectify them, Brookings Institution, Metropolitan Policy Program, October 14, 2019.

In the news

Treehugger highlighted our critique of the Transportation Research Board’s report calling for even more spending on highway widening, which will lead to more driving and increased greenhouse gas emissions.

The Week Observed, October 11, 2019

What City Observatory did this week

1. Transportation for America won’t be fooled again.  After years of getting rolled by the freeway lobby, it appears that T4America has finally said “Enough.”  Transit and active transportation activists have been roped into an unholy alliance with highway advocates, pressing the federal government for more money for “multi-modal” transportation, in the vain hope that if more money goes to transportation, some additional crumbs will come their way.  In practice, that’s meant tons of money for new freeways, and mostly token amounts for “alternative” modes of transportation.  Last week, Transportation for America finally pulled the plug on this strategy. They write:

We’ve been going down the wrong road on transportation long enough:  Transportation needs a change in priorities and policies, not just more funding for the same failed approaches of the past.

To its credit, T4America has laid out three guiding principles that can set us on the right course:  Fix it first, Safety before speed, and Accessibility not mobility. These are a solid foundation, to which we might append:  No free ride–getting serious about pricing road transportation is the key to getting value for our expensive investments in infrastructure.

2. Trouble in Ecotopia. The upper left-hand corner of North America has long regarded itself as a bastion of ecological passion; Ernest Callenbach coined the moniker “Ecotopia” for the region in the 1970s. And to be sure, the region, stretching from San Francisco north to British Columbia, is home to many innovative pro-environmental policies. And the region’s leaders uniformly talk a good game about their concern for global warming and their fealty to the Paris Climate Accords.

But a close look at their greenhouse gas inventories shows that every part of Ecotopia is chalking up big increases in carbon emissions, in spite of adopted commitments to bring them down. The main reason: people in the region are driving more, not because of a growing economy but because of cheap gas. And here’s where action falls short of rhetoric: Ecotopian political leaders, including, for example, Governor Jay Inslee, are supporting major freeway construction projects and other efforts that subsidize more driving and more greenhouse gas emissions. If Ecotopia is ever going to live up to its stated values, it’s got a lot of work to do.

Must read

1. Cars are death machines, whether automated or not.  Alison Arieff challenges the idea that autonomous vehicles will do anything to address the inherent dangers of cities full of cars.  Autonomous vehicles seem to have a blind spot when it comes to some things, specifically cyclists and pedestrians.  That’s led the industry, unsurprisingly, to propose equipping people with technology so that they can be “seen” by machines.  As Arieff argues, this is perverse:

Among the safety measures proposed by car companies are encouraging pedestrians and bicyclists to use R.F.I.D. tags, which emit signals that cars can detect. This means it’s becoming the pedestrian’s responsibility to avoid getting hit. But if keeping people safe means putting the responsibility on them (or worse, criminalizing walking and biking), we need to think twice about the technology we’re developing.  This may be the worst outcome of the automobile-centered 20th century: the assumption that it’s people who need to get out of the way of these lethal machines, instead of the other way around.

If we want to reduce the growing number of road deaths, the best solution is to encourage fewer people to drive.  Traffic deaths are directly related to the number of cars on the road and the number of miles they are driven.  Reducing car travel is key to achieving many important objectives, including pedestrian safety, promoting livable urban spaces, and reducing greenhouse gas emissions.

2. Want affordable housing? End parking requirements.  Sightline Institute’s Michael Andersen has an insightful look at the economic implications of Portland’s Residential Infill Policy, part of the city’s implementation of Oregon’s new “missing middle” legalization of duplex, triplex and fourplex housing in single-family zones. A critical feature, as Andersen points out, is whether the city code requires the provision of off-street parking.  If parking is required, the economics of development suggest that the city will see less, and more expensive, housing built; with parking requirements relaxed, a similar site is more likely to be developed with more housing units at lower price points.  As Andersen says:

More broadly, there’s an issue of principle here that could apply in any city: Mandating off-street parking, even when we’re fully aware that it makes more and cheaper homes impossible, requires a judgment that housing cars is more important than housing people.

Nowhere in Portland—nowhere on the planet—is that true.

3. The racial divide between baseball and soccer in Atlanta.  Comedian George Carlin has a classic monologue contrasting the pastoral character of baseball with the militaristic vibe of football: in baseball, the object is to get home; in football, it’s to reach the “end zone.”  Sports Illustrated has a fascinating essay exploring a somewhat different schism in Atlanta.  The baseball Braves have moved out of their central city location into a taxpayer-subsidized stadium in suburban (and still, relative to Atlanta, disproportionately white) Cobb County.  Meanwhile, the city’s new Major League Soccer franchise, Atlanta United, has set up shop in the downtown Mercedes-Benz Stadium.  As Sports Illustrated points out, the differences between the locations and the teams’ respective fan bases are striking. With baseball in the suburbs and soccer in the city, there’s a huge missed opportunity for these different groups to interact:

Had the Braves not left downtown, supporters in red-and-black United jerseys might have been packed in tight on nights like this with baseball fans in navy Braves T-shirts, at bustling bars and on sweaty train rides home. They might have traded smiles and congratulations. Maybe raised glasses. Maybe shared a fleeting moment of commonality with someone who looked different or believed in something else.

While Atlanta United’s location is more urban and its supporters more diverse, the franchise is still not fully a model of inclusion. As we’ve reported at City Observatory, the team’s practice field was built on a site where the suburban city of Marietta used tens of millions of taxpayer dollars to acquire and demolish low income housing, mostly occupied by people of color.

New Knowledge

A decline in international immigration. The Trump Administration has vilified immigrants and erected barriers to people moving to the US, and now immigration to the US has fallen to its lowest level in a decade, according to Census data compiled by the Brookings Institution’s Bill Frey.  Data from the latest American Community Survey suggest that over the past year the number of foreign-born residents of the US increased by just over 200,000, down almost 75 percent from the previous year’s gain.

The irony is that the biggest declines in immigration have been in blue states, while the number of foreign-born persons living in red states continues to increase. As Frey observes:

 . . . over the last year, Clinton states, as a group, registered declines in their foreign-born populations—including substantial declines in New York and Illinois—and more modest declines in California, New Jersey, and Maryland, among other states. Meanwhile, Florida and Texas exhibited significant gains, as did other Trump states including Arizona, Pennsylvania, and Ohio.

Foreign immigration has been an important driver of the US economy; many of those who work for, and who start the nation’s most innovative companies have chosen to move to the US; a decline in immigration is a potential threat to future economic prosperity.

In the news

An editorial in the Daily Tar Heel, “Back to basics: Walkability in Chapel Hill,” quoted our research on the value premium commanded by walkable homes.

The Week Observed, October 4, 2019

What City Observatory did this week

1. We debunk the Wall Street Journal’s claim of an exodus of young adults from cities.  Last week, the Wall Street Journal trumpeted an “exodus” of 25 to 39 year old adults from cities. Upon closer inspection, the data cited by the Journal simply don’t support this conclusion.  When your survey-based estimates don’t reach a threshold level of statistical significance, when they’re contradicted by other, more straightforward ways of making a similar calculation (ones with a lower margin of error), when even by your own calculations at least eight major cities are recording increases in the young adult population, and when the trend you claim to have spotted has declined by 50 percent in the past year, it’s probably not valid evidence for claiming that there’s an exodus of young adults from the nation’s cities. We understand the journalistic temptation to want to be the first (or among the first) to make a provocative contrarian call about some well-known trend, but in this case the evidence is just too thin to be believable.

These Census data show Portland’s 20-39 year old population increasing, not decreasing as claimed by the Wall Street Journal.

2. A modest proposal:  EIS for DMV. Many states have environmental impact review requirements which apply to government regulatory decisions.  In California and Washington State, anti-housing activists have used these state EIS requirements to delay or block apartment development, ostensibly because of the negative environmental consequences associated with new buildings. That gave us a thought: if we’re worried about the negative environmental effects of government granting permission for something, why shouldn’t we apply this standard to the biggest single source of greenhouse gas emissions: cars?  Suppose that in order to get a new car registered, you had to file an environmental impact statement disclosing its effects on air pollution and traffic congestion. It’s not so far-fetched–in Japan, in order to register your car, you must show that you have a private, off-street parking spot for it. If the EIS for new housing is a good idea, we really ought to be applying it to cars as well.

Must read

1. Your definitive guide to road safety. David Levinson, aka The Transportist, is an invaluable resource for those interested in transportation policy. Levinson writes with the scholarly knowledge (and careful footnoting) of an academic, but with a clear and accessible voice.  He has a new explainer on road safety that lays out in a balanced way all of the various strategies for reducing the carnage on the world’s roadways. Levinson catalogs, explains and quickly appraises 21 different strategies for reducing road crashes and associated injuries and deaths, ranging from roundabouts and road maintenance, to vehicle automation and urban density. It’s a quick tour of the horizon, but serves as a useful and balanced framing for thinking about safety. In the end, Levinson regards the continued (and growing) death toll on roads as an indication that we lack resolve, more than we lack knowledge:

Like congestion and global warming, the road death toll can be significantly reduced, but there is little evidence that the United States, in particular, is collectively interested in solving it. While there are obviously advocates, they do not have the upper hand, otherwise deaths would not be rising in recent years off its 2014 lows.

2. America’s most vegan/vegetarian cities. A growing proportion of the American population is either reducing or eliminating meat in their diets. Going vegan is easier in some cities than others. WalletHub has compiled an array of data about the presence and cost of vegan and vegetarian options to create a ranked list of cities nationally. By their reckoning–which is based on a complex weighted formula involving 17 different variables ranging from the number of farmer’s markets to the proportion of restaurants with vegan options–Portland is the most vegan friendly city in the nation.  Here’s their list of the top 10:

Rank City Score
1 Portland, OR 66.55
2 Los Angeles, CA 62.72
3 Orlando, FL 59.74
4 Seattle, WA 57.86
5 Austin, TX 57.69
6 Atlanta, GA 57.52
7 New York, NY 57.3
8 San Francisco, CA 57.1
9 San Diego, CA 56.81
10 Tampa, FL 56.31

At the other end of the spectrum, among the 50 largest metro areas, Memphis, Tennessee has the lowest ranking for vegetarian options–apparently barbecue still dominates. There’s inherent subjectivity in choosing which weights to use for the ranking; if we ignore the affordability measures, for example, then San Francisco would edge out Portland as the #1 city for vegan and vegetarian living.
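For readers curious how rankings like this work mechanically, a weighted index can be sketched in a few lines of code. The categories, weights, and scores below are hypothetical (WalletHub’s actual 17-variable formula is not reproduced here); the point is just to show how dropping a category, such as affordability, can reorder cities.

```python
# Sketch of a weighted city ranking. Categories, weights, and scores are
# hypothetical -- WalletHub's actual 17-variable formula is not reproduced here.
def score(city_metrics, weights):
    """Weighted average of 0-100 metric values."""
    total = sum(weights.values())
    return sum(city_metrics[k] * w for k, w in weights.items()) / total

weights = {"dining_options": 0.5, "affordability": 0.3, "access": 0.2}
portland = {"dining_options": 90, "affordability": 60, "access": 80}
san_francisco = {"dining_options": 95, "affordability": 30, "access": 85}

print(score(portland, weights))       # 79.0 -- Portland leads when affordability counts
print(score(san_francisco, weights))  # 73.5

# Drop the affordability category and the ranking flips:
no_afford = {k: w for k, w in weights.items() if k != "affordability"}
print(score(portland, no_afford) < score(san_francisco, no_afford))  # True
```

Because the weights are subjective, small changes in them can flip the rankings, which is why these lists are best read as conversation starters rather than verdicts.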

3. The biggest contributor to microplastic pollution of the oceans?  Car tires. There’s been something of a fetish lately to ban the use of plastic straws as a way to reduce the amount of plastic that ends up in ocean ecosystems. While straws and six-pack rings make for gut-wrenching photos of affected wildlife, the more serious and insidious threat may be microplastics–tiny beads of plastic that are washed into rivers and streams and end up in the oceans–and their sensitive ecosystems. A new study reported in the Los Angeles Times indicates that the vast bulk of these microplastics come from car tires. Every mile you drive, a small portion of the tire is worn away in the form of tiny particles, which rain then washes into the streams and rivers. The particles get distributed throughout the food chain, and are ultimately ingested by humans. The study suggests that the volume of microplastics from tires is an amount 300 times greater than what comes from microfibers washing off polyester clothes, microbeads from beauty products and the many other plastics washing down our sinks and sewers.

4. Hacking electronic road signs.  Just this, from @sugarpond on Twitter:

Apparently, if you don’t reset the default password, it’s relatively easy for anyone to reprogram the message on these electronic road signs. This particular warning deserves a broader audience.

New Knowledge

Household size is increasing for the first time in more than a century and a half.  One of the most persistent trends in US housing, the steady decline in average household size, has reversed since 2010, according to Census data compiled by the Pew Research Center. In this decade, the average size of a household has blipped upward, from 2.58 persons per home to 2.63 persons per home.

For as long as anyone can remember, we’ve been building housing faster than population has been increasing, and the combined effect of lower fertility, longer lifespans, and social preferences for living alone has pushed down the average number of persons living in a house or apartment. One of the ways to ameliorate the housing shortage is to fit more people into each dwelling unit.  This reversal of the historic downward trend in household sizes suggests that’s starting to happen.
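A rough calculation shows why even a small shift in household size matters for housing demand. The population figure below is illustrative (roughly the US total), not drawn from the Pew report.

```python
# Back-of-envelope: dwelling units "absorbed" by the rise in average
# household size from 2.58 to 2.63 persons per home. The population
# figure is illustrative, not from the Pew report.
population = 325_000_000
units_at_258 = population / 2.58  # households implied at 2.58 persons/home
units_at_263 = population / 2.63  # households implied at 2.63 persons/home
print(round(units_at_258 - units_at_263))  # about 2.4 million fewer units needed
```

On this rough math, the shift implies something on the order of two million dwelling units of demand met by doubling up rather than by new construction.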

 

The Week Observed, August 30, 2019

What City Observatory did this week

1. 20 Reasons to ignore the Texas Transportation Institute’s Urban Mobility Report. It’s back. After a four-year hiatus, Texas A&M University’s transportation institute trotted out another iteration of its periodic urban mobility report, which predictably claims (as it does every time) that traffic congestion is a horrible problem and is getting worse. We and others have long since debunked the methodology, data and findings of the UMR:  its core measure, the travel time index, is misleading and penalizes cities with compact development and short commutes; it’s fixated on car travel and literally regards pedestrians as “inappropriate data.”  It treats the time drivers lose by not being able to exceed legal speed limits as an economic cost. It values increased travel time in congestion at a rate five times higher than what people actually pay to avoid congestion. And that’s just the tip of a larger iceberg:  We identify 20 reasons why, as in years past, you can and should ignore the Urban Mobility Report. It’s discredited propaganda designed to justify more and wider roads, a failed strategy which has never worked and has only made car dependence, sprawl and traffic worse.

2. Round 1 in Portland’s freeway fight goes to the scrappy upstarts.  For the past couple of years we’ve been reporting on the Oregon Department of Transportation’s proposal to spend half a billion dollars widening a mile-long stretch of I-5 near downtown Portland. The local press reports a major victory for project opponents:  ODOT has given up trying to pursue the short-form environmental assessment, and instead will do a full-scale Environmental Impact Statement. That will add between one and three years to the project development process, hopefully fix the manifest problems in the ODOT analysis to date, and create an opportunity to consider alternatives that actually promote safety, reduce driving, and improve the environment.

 

Must read

1. Do community preference housing policies enshrine segregation? Anti-gentrification efforts often call for neighborhood preference policies in affordable housing (for example, giving local neighborhood residents preferential access to new affordable apartments that become available). Next City explores the potential problem with these policies: they tend to lock in patterns of segregation, and deny people the opportunity to move to a different neighborhood. Because many city neighborhoods are substantially segregated, these policies tend to keep them that way. Fair housing activists have challenged neighborhood preference rules in New York and San Francisco; Seattle is wrestling with how to avoid reinforcing segregation.

2. Michigan cities need more congestion and gentrification. Never one to shy away from a controversial proposition, Michigan Future’s President Lou Glazer argues that contrary to popular belief, what Detroit and other Michigan cities need is both more congestion and more gentrification. In his view, rising home values and more traffic are closely tied to demand for a city, and it’s a lack of demand for Michigan cities that is at the root of most of their current problems. None of this is to say that gentrification and congestion aren’t problems in themselves, but for economically struggling cities, it’s a better class of problem to be wrestling with. As always, Glazer is worth a read.

3. India mandates RFID tags for all vehicles. If you think about it, the ubiquitous stamped metal license plate used to identify cars is a 19th century technology. India has mandated that starting next year, all vehicles must have a radio frequency identification (RFID) tag–a windshield-mounted sticker, similar to the fast-pass technology used in many states in the US.  The change will enable a shift to all-electronic toll collection.

Road Transport and Highways Minister Nitin Gadkari announced that FASTags will become mandatory for all vehicles from December this year. This means that any vehicle, private or commercial, which has the FASTag will be able to swiftly make contactless payments at toll plazas on national highways and be on their way.

In a related story half a world away, the San Jose Mercury News reports that a new California law requiring temporary metal plates instead of unreadable temporary paper licenses has reduced toll evasion 75 percent on Bay Area highways and bridges. Owners of new cars would delay getting (or mounting) new license plates and dodge paying tolls. One area where “smart” technology clearly makes sense is assuring that all vehicle operators obey the law.

4. YIMBY gets political.  Software engineer Steven Buss is a YIMBY activist in San Francisco. He’s got an essay laying out a clear political strategy for moving the YIMBY agenda forward in San Francisco, starting with running for key positions in the local Democratic Party, and then nominating (and electing) housing-friendly candidates to the city’s board of Supervisors. In Buss’s view, homeowners and landlords now control the city, and are effectively blocking new construction (restricting supply and thereby keeping rents and home values high). It’s as much a hard-edged financial calculation as a political manifesto: Buss has an interesting calculation showing that this rent-seeking has enabled landowners to capture a significant fraction of the value created by the city’s tech economy. Bottom line: Tech folks used to say that software was eating the world — “now landlords are eating everything.”

In the news

The Houston Chronicle’s Dug Begley quotes City Observatory’s Joe Cortright on the methodological flaws in the latest urban mobility report.

The Week Observed, September 13, 2019

What City Observatory did this week

1. Beto O’Rourke brings a strong inclusive urbanist message to the Presidential contest.  While it’s been great to see housing affordability and climate change grow in prominence on the national stage, some of what’s been proposed, especially to reduce greenhouse gas emissions, has focused too much on alternative fuels and electric cars, and too little on how we build more sustainable and just communities, where we don’t have to drive so much. Beto O’Rourke has the most provocative and explicitly urbanist take on how America needs to change. His recent remarks connect the need for economic integration, denser development, and reduced driving.

 

2. Why smart cities are about more than technology. It’s tempting to think that for every urban problem there’s an easy technological fix: between Elon Musk’s tunnels and the “there’s an app for that” view of civic problem solving, the urban future is too enamored of technology. Part of this hinges on having the wrong mental model of how cities work: thinking of them as machines to be tweaked, rather than as complex, ever-evolving, human-centered systems. We’ve been down that road–literally–before, when engineers like Robert Moses took a careless meat-ax approach to urban space, with devastating social, economic and environmental consequences we’re still wrestling with.  While new technology creates opportunities, we need to be keenly aware of the risk of unintended consequences, the biases of big data, and the need to make sure that the rules and prices that shape urban places reflect our broader values, rather than some narrow technological imperative.

Must read

1. Why Houston should scrap efforts to expand its I-45 freeway. Jeff Speck weighs in on the Texas Department of Transportation’s $7 billion (and counting) plan to widen Interstate 45 to as many as 13 lanes through Houston. While it’s pitched as an “improvement plan,” this is really a classic Bob Moses meat-ax project, which would wipe out 1,200 homes (chiefly occupied by low income people of color) and displace 5,000 people and 300 businesses.  It’s being sold, as so many are, as a solution to congestion, when freeway widening has been proven time and again to induce more traffic and generate even longer commutes–as Houston’s own experience with the 23-lane Katy Freeway amply demonstrates. Some Houstonians are trying to soften some of the sharp edges of the project, to make I-45 “better;” but Speck will have none of it. Compromise now means just more of the same disastrous consequences for people, for cities and for the environment:

. . . Houston’s fatalistic response to its TxDOT incursion has been to just “make I-45 better.” The well-resourced but cautious Make I-45 Better Coalition has proposed a collection of modifications, all good, that unfortunately do not begin to question the underlying folly of fighting congestion, car crashes, and tailpipe emissions by welcoming more driving.

Here’s how to make I-45 better: first, fix the parts that need repair, without making them any wider. At the same time, introduce congestion-based pricing on the entire roadway to maximize its capacity around the clock. Invest the proceeds in transit, biking, walking, and in those poor people who truly have no choice but to keep driving.

2. The Future is not retro.  In the era of “MAGA,” no one can doubt that allusions to recovering the imagined virtues of an earlier era have a strong emotional appeal.  Urbanists have made a good case that in many cases, we’ve simply forgotten (or outlawed) well-established recipes for building successful urban places. But while reminders of how well we did things in the past are helpful for motivating introspection, simply reverting to a status quo ante isn’t where we should be headed. In an essay entitled “The Future is not Retro,” Pedestrian Observations’ Alon Levy pointedly challenges the limits of nostalgia as a blueprint for building the future. Levy takes aim at some advocates who disparage high-rise urban redevelopment and seem to long for a restoration of small-town, midwestern main streets.

As Levy argues, communities of the future will embody many of the critical urbanist principles of the past, but they’ll be different as well:

The theme of the future is that, just as the Industrial Revolution involved urbanization and rural depopulation, urban development patterns this century involve growth in the big metro areas and decline elsewhere and in traditional small towns.  . . . Already, people lead full lives in big global cities like New York and London without any of the trappings of what passed for normality in the middle of the 20th century, like a detached house with a yard and no racial minorities or working-class people within sight. The rest will adapt to this reality, just as early 20th century urbanites adapted to the reality of suburbanization a generation later.

3. The high cost of free roads. In an editorial, Toronto’s Globe and Mail highlights the seductive appeal of apparently “free” roads in otherwise frequently sensible Canada. While several provinces have toyed with asking road users to pay a portion of major new capital projects, like British Columbia’s Port Mann Bridge and the new Champlain Bridge in Montreal, it’s been politically opportunistic to campaign against tolls. In both cases, new provincial governments have taken off the tolls–and shifted the multi-billion dollar cost of the bridges to all of their non-users.  It’s a crazy system, as the Globe and Mail editorial concludes:

Federal, provincial and municipal governments should be liberating taxpayers from billions of dollars in road construction and maintenance costs, and unlocking hundreds of billions of dollars in profitable assets. That would be a boon for both the environment and the fiscal bottom line, and would free up tax dollars for things we really need.

In the news

Writing in the Grand Rapids Business Journal, Lou Glazer cites City Observatory’s commentaries on the positive effects of gentrification for the long-time residents of low income neighborhoods.

StreetsblogUSA cited City Observatory’s critique of the recent Urban Mobility Report in an article, “Traffic Study comes under fire for being too pro-car.”

Highway to Hell: Climate denial at the TRB

The Transportation Research Board, nominally an arm of the National Academy of Sciences, is engaged in technocratic climate arson with its call for further highway expansion and more car travel.

  • The planet is in imminent peril from global warming, with much of the recent increase in emissions in the US coming from increased driving.
  • In the face of this monumental crisis, the Transportation Research Board, which should represent science, is calling for tripling spending on highway construction to as much as $70 billion annually, to accommodate another 1.25 trillion miles of driving each year.
  • They’re ignoring global warming (except as an excuse to flood-proof highways), and hoping that somebody else electrifies cars, so that carbon emissions go down.
  • No consideration is being given to how we might reduce driving to reduce pollution, and make our cities–which were devastated by the construction of the interstate highways—more livable, green and just.

At City Observatory, we think climate change is the challenge of our time. One of the biggest opportunities to meet this crisis is to dramatically rethink the way we get around and the way we build our cities. The interstate highway system and the car-centric transportation system and land use patterns it fostered have undermined our cities, torn our civic fabric, segregated our citizens and threaten our environment. Much of what is happening today in urbanism is a wave of experiments aimed at reversing the damage done by cars and highways. It’s a project of collective remembering that great urban places are walkable, bikeable and well-served by transit, bringing us closer together and freeing us from our dependence on cars and their substantial costs, pecuniary, social and environmental.

If we’re serious about tackling climate change, reversing the damage done by the Interstate Highway system should be at the top of our list. A new congressionally mandated review of the system provides, in theory, an opportunity to think hard about how we might invest for the kind of future we’re going to live in. Sadly, the report we’ve been given by the Transportation Research Board is a kind of stilted amnesia, which calls for us to repeat today just what we did 70 years ago. Now is no time to indulge nostalgia for the Eisenhower era. But that’s exactly what we’re being offered.

 

A global conflagration shouldn’t get in the way of business as usual, right? Role models for the Transportation Research Board’s thought leadership on the future of Interstate highways.

The new report looks at the future of the Interstate Highway System, and calls for spending hundreds of billions of dollars expanding interstates to facilitate more than a trillion miles of additional driving every year. The report comes from the Transportation Research Board, part of the National Academies of Sciences, Engineering, and Medicine. The report was overseen by the “Committee on the Future Interstate Highway System,” and written by TRB staff and consultants.

While much of its work is prosaic and uncontroversial (coming up with standards for paving materials and road markings, and figuring out how to optimize traffic signals), some of the Transportation Research Board’s work has a profound and subtle bias that is at the root of our urban transportation problems.

In the past we’ve written about how engineering “rules of thumb“–minimum parking requirements for buildings, minimum lane widths for roads, “level of service” traffic standards, and hierarchical, dendritic street systems–systematically lead to less livable, more car-dependent communities. While it’s ostensibly a technocratic exercise, the new report, titled “Renewing the National Commitment to the Interstate Highway System: A Foundation for the Future,” is really a strongly political statement in favor of more and more road building.  The title clearly signals the messaging: this is a backward-looking plea for funds, asking us to repeat what we did in the past.

And even if you didn’t read the title, the cover illustration pretty much gives away the game:  This is all about rationalizing building more highway capacity.

Alternate title: “NO FUTURE FOR YOU.”

 

Climate Change:  That’s someone else’s problem

There’s an overwhelming consensus in the scientific world that anthropogenic climate change is rapidly reaching irreversible and catastrophic proportions. The Intergovernmental Panel on Climate Change (IPCC) has issued a dire warning that we have just a little over a decade to save the planet.

If you wade through the first 118 pages of the TRB report, you’ll finally reach a mention of climate change.  A short sidebar (Box 4.2) concedes that by stimulating vehicle use and car-dependent travel patterns, the Interstate Highway System contributed to the problem.  And the solution is to . . . wait for somebody else to develop low-carbon and no-carbon transport technologies:

. . . a transformation to a low- and no-carbon transportation system will increasingly mean that [freeway] planning is integrated with the planning of low-carbon mobility options, from public transit to zero-emission trucks.  Many states, counties, and cities are investing in low-carbon transportation solutions, seeking to create new opportunities for both low-carbon mobility and economic development.

The report acknowledges the reality of climate change, but largely ignores and minimizes the contribution of vehicle travel to carbon emissions. Instead, the report is chiefly concerned about using climate change as yet another excuse to spend more money on rebuilding highways (to harden them against flooding, storms and other climate-related disruptions). In essence the report paints highways (and highway department’s distressed budgets) as victims of climate change, rather than one of its principal causes.

We’re told that the Interstate Highway System accounts for a mere 7 percent of US greenhouse gas emissions.  That of course misses the fact that in many states, highway travel accounts for 40 percent of greenhouse gas emissions, and unlike other sources of carbon pollution, it is increasing—for example, causing both Oregon and California to lose ground on their climate change objectives.

More importantly, the Interstate Highways make all of us more car dependent, whether we travel on the Interstate or not. The report acknowledges–grudgingly, and in passing–the toll the Interstate system took on cities.

It is generally understood that urban Interstates and other freeways contributed to suburbanization and the depopulation of many major U.S. cities, which accelerated after World War II in concert with an expanding middle class and the fast-growing personal motor vehicle fleet. Although many other factors were at work, the Interstate System facilitated greater dependence on the automobile for commuting to work and other household and social activities.
However, the effects of urban Interstates were not entirely beneficial for center cities and their neighborhoods. The postwar mass movement of people and employers to the suburbs led to the loss of center city population, a declining housing stock, and impoverished urban neighborhoods (TRB 1998).

“Not entirely beneficial” is perhaps the most banal way of conceding the fact that highways devastated cities and amplified segregation, problems that plague us today. Nathan Baum-Snow has estimated that each additional radial freeway constructed in a metropolitan area reduced the central city’s population by 18 percent.
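To get a feel for the scale of Baum-Snow’s estimate, here’s an illustrative compounding exercise; treating the 18 percent figure as multiplicative per freeway is our assumption for illustration, not necessarily the paper’s exact specification.

```python
# Illustrative compounding of Baum-Snow's estimate that each additional
# radial freeway cut central-city population by ~18 percent. Treating the
# effect as multiplicative per freeway is an assumption for illustration.
for n in (1, 2, 3):
    remaining = (1 - 0.18) ** n
    print(f"{n} radial freeway(s): {1 - remaining:.0%} population loss")
```

On that rough math, a city ringed and crossed by three radial freeways would have lost nearly half of its central-city population, which is consistent with the postwar experience of many US cities.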

One has to be very resolute to overlook all of the negative consequences of the increased car dependence wrought by the interstate highway system: sprawl, deaths and injuries (especially to vulnerable non-car road users), pollution, and the destruction of urban neighborhoods.  The supposed economic benefits of the original interstate highway system in the 1950s and 1960s (which certainly benefitted the trucking industry and suburban land developers) would not be repeated by additions to the system in the 2020s and beyond.  Careful scholarship on road investments has shown that the economic return on investment from additional highway capacity has fallen to almost zero in the past several decades. And that doesn’t allow for a full accounting of the social, environmental and health effects of car dependence.

TRB’s Vision:  One and a quarter trillion more miles of driving annually by 2040

The case for “renewing” our commitment to expanding the Interstate Highway System is predicated on forecasts of increased driving activity.  While the report is careful to couch its findings in technical terms and talk about the smallest possible numeric increments (1.5 percent per year), the implications of its traffic projections are for a huge increase in the volume of driving.  At its mid-range figure of 1.5 percent annual growth, vehicle miles traveled in the US will increase by about one and a quarter trillion miles annually by 2040, rising from about 3.2 trillion today to nearly 4.5 trillion in 2040.
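The arithmetic behind that projection is simple compound growth, and it’s easy to check. The 22-year horizon (2018 to 2040) is our assumption about the report’s base year.

```python
# Compound-growth check on the TRB traffic forecasts: starting from roughly
# 3.2 trillion vehicle miles traveled, compounded over 22 years (2018-2040;
# the exact base year is our assumption).
base_vmt = 3.2e12
for rate in (0.0075, 0.015, 0.02):  # TRB's low, mid-range, and high growth rates
    vmt_2040 = base_vmt * (1 + rate) ** 22
    print(f"{rate:.2%} annual growth: {vmt_2040 / 1e12:.2f} trillion miles in 2040")
# The mid-range case lands near 4.4 trillion miles -- roughly 1.25 trillion
# more miles of driving per year than today.
```

Even the “low” 0.75 percent scenario adds more than half a trillion miles of annual driving by 2040; the forecast range never contemplates driving holding flat or falling.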

The climate implications of all that driving?  Not TRB’s problem. Automakers or somebody might electrify vehicles, and reduce carbon emissions, but that’s not something that highway engineers are going to worry about–at all.

It’s worth questioning whether that 1.5 percent annual increase makes any sense.  The report and its technical appendix are careful to hedge their forecasts with all kinds of qualifications about the economic and technological uncertainty of predicting future travel patterns, but when it comes down to it, they conclude that come hell or high water (and what with global warming, it’s likely to be both), traffic will only go up, somewhere between 0.75 percent and 2 percent annually.

A quick review of recent trends in driving is in order.  The key metric here is “vehicle miles traveled.” Last year Americans drove about 3.2 trillion miles, according to the US Department of Transportation. The following chart shows the trend in VMT since 1970, with actual values in blue, the TRB’s baseline forecast in red, and an alternative lower forecast (which we’ll explain in a moment) in green. The long-term trend, particularly from the 1970s until the turn of the millennium, was a steady increase in driving year over year, but after 2000 things began to change.

At first the rate of increase slowed, and then driving actually fell: from 2004 through 2014, per capita vehicle miles traveled declined. For economists, this was hardly a surprise:  real fuel prices increased dramatically after 2004, and remained high through 2014; it was only after the oil market bust that year that driving started going up.  Interestingly, the growth rate for the four years 2014 to 2018 was slightly more than 1.5 percent, the exact figure that TRB forecasts to continue for the next two decades.  Implicitly, TRB is assuming that driving will grow for the next 20 years or so at the same pace that it managed (a) after a decade of stagnation, and (b) over a period in which real fuel prices fell by more than 40 percent.

The experience of 2004 to 2014 suggests that a very different future is entirely possible.  Somewhat higher gas prices—high enough to reflect the social and environmental cost of carbon—are likely to depress if not entirely eliminate the growth in VMT.  If gas prices resembled those of the 2004-14 period, we would expect driving to increase only very slowly, as shown in the green line. The experience of the past decade shows that a much slower rate of increase in driving is eminently achievable for the US.  All it takes are the right policies.

And increasingly, policies that discourage driving will be essential to saving the planet.  As economists of every political stripe have stressed, some sort of carbon tax is essential to lowering carbon emissions—plus they’ll have the side benefit of lowering the need to build extremely expensive highway infrastructure.  An intelligent report would consider how the US could, building on the experience of leading cities and other countries, work to build communities that simply require less driving, thereby reducing carbon pollution, lowering road costs, and not incidentally, mitigating some of the historic damage done to cities by the construction of the interstate highway system.

The upward slope of that red line and the added trillion miles we’re assumed to drive each year is the oldest trick in the highway engineer’s book. “More cars are coming!  It’s an inexorable, immutable force that we must respond to.  We must expand capacity.” But these forecasts have been repeatedly shown to be wrong.  For example, Clark Williams-Derry exposed the serial misrepresentation by the Washington Department of Transportation:

Short answer: “No.” (Chart: Sightline Institute)

The engineers routinely ignore or grossly downplay the effects of induced demand—the principal reason that driving is increasing is that we’re building vastly more un-priced road capacity.  More road capacity generates more sprawl, longer trips and more VMT, in a never-ending, self-reinforcing cycle, one which is now so well established as to be called the “fundamental law of road congestion.”

If we build highways for another trillion miles of vehicle travel we’re actively making the carbon pollution problem worse, not, as engineers would have it, passively responding to some unchanging natural trend.  If Americans drive a trillion more miles each year by 2040, it’s going to make it vastly more difficult to reduce greenhouse gases. Failing to acknowledge that fundamental fact, and suggesting a casual mention of vehicle electrification absolves them of any responsibility for thinking further about this question is simply irresponsible.

Same old punchline: Give us billions more

As Chuck Marohn of StrongTowns has pointed out, reports by engineering groups are almost invariably self-serving demands for more money thinly disguised as impartial technical advice. Not surprisingly, the TRB report’s primary recommendation is that we give more money to highway builders—lots more money. The report calls for doubling to tripling the amount of money spent on Interstate highways:

Recent combined state and federal capital spending on the Interstates has been about $20–$25 billion per year. The estimates in this study suggest this level of spending is too low and that $45–$70 billion annually over the next 20 years will be needed

And a big chunk of this funding would be for explicitly earmarked for capacity expansion.  The report is vague, but its “blueprint” says $22 billion of a $57 billion pot would be set for capacity expansion, with the remainder for pavement, operations and bridges (categories that are often used to fund wider roadways—bridges that are “rebuilt” are almost invariably widened).

While the report talks a good game about aging bridges and multiplying potholes, there’s abundant evidence that when additional revenue is available, maintenance is still deferred in favor of marquee capacity expansions. State Departments of Transportation (who are allocated the bulk of money for the interstate system) have repeatedly engaged in bait and switch tactics; marketing potholes and spending on wider roads.

This is a report that looks backward, learns nothing, and is destined to repeat the mistakes of the past, while remaining obstinately blind to the manifest threat that climate change poses to our collective global future. Now is not the time to be squandering precious billions on more and wider roads, and stimulating trillions of additional miles of vehicle travel. This report tragically evades the most serious problem we face and avoids shouldering any responsibility for meaningful action. It’s difficult to imagine that anything so willfully narrow-minded and self-serving could be allowed to masquerade as the product of scientific endeavor.

National Academies of Sciences, Engineering, and Medicine 2019. Renewing the National Commitment to the Interstate Highway System: A Foundation for the Future. Washington, DC: The National Academies Press. https://doi.org/10.17226/25334.

Highway to Hell: Climate denial at the TRB

The Transportation Research Board, nominally an arm of the National Academy of Sciences, is engaged in technocratic climate arson with its call for further highway expansion and more car travel.

  • The planet is in imminent peril from global warming, with much of the recent increase in emissions in the US coming from increased driving.
  • In the face of this monumental crisis, the Transportation Research Board, which should represent science, is calling for tripling spending on highway construction to as much as $70 billion annually, to accommodate another 1.25 trillion miles of driving each year.
  • They’re ignoring global warming (except as an excuse to flood-proof highways), and hoping that somebody else electrifies cars, so that carbon emissions go down.
  • No consideration is being given to how we might reduce driving to reduce pollution, and make our cities–which were devastated by the construction of the interstate highways–more livable, green and just.

At City Observatory, we think climate change is the challenge of our time. One of the biggest opportunities to meet this crisis is to dramatically rethink the way we get around and the way we build our cities. The interstate highway system and the car-centric transportation system and land use patterns it fostered have undermined our cities, torn our civic fabric, segregated our citizens and threaten our environment. Much of what is happening today in urbanism is a wave of experiments aimed at reversing the damage done by cars and highways. It’s a project of collective remembering that great urban places are walkable, bikeable and well-served by transit, bringing us closer together and freeing us from our dependence on cars and their substantial costs, pecuniary, social and environmental.

If we’re serious about tackling climate change, reversing the damage done by the Interstate Highway system should be at the top of our list. A new congressionally mandated review of the system provides, in theory, an opportunity to think hard about how we might invest for the kind of future we’re going to live in. Sadly, the report we’ve been given by the Transportation Research Board is a kind of stilted amnesia, which calls for us to repeat today just what we did 70 years ago. Now is no time for indulging nostalgia for the Eisenhower era. But that’s exactly what we’re being offered.

 

A global conflagration shouldn’t get in the way of business as usual, right? Role models for the Transportation Research Board’s thought leadership on the future of Interstate highways.

The new report looks at the future of the Interstate Highway System, and calls for spending hundreds of billions of dollars expanding interstates to facilitate more than a trillion miles of additional driving every year. The report comes from the Transportation Research Board, which is part of the National Academies of Sciences, Engineering, and Medicine. The report was overseen by the “Committee on the Future Interstate Highway System,” and written by TRB staff and consultants.

While much of its work is prosaic and uncontroversial (coming up with standards for paving materials and road markings, and figuring out how to optimize traffic signals), some of the Transportation Research Board’s  work has a profound and subtle bias that is at the root of our urban transportation problems.

In the past we’ve written about how engineering “rules of thumb“–minimum parking requirements for buildings, minimum lane widths for roads, “level of service” traffic standards, and hierarchical, dendritic street systems–systematically lead to less livable, more car-dependent communities. While it’s ostensibly a technocratic exercise, the new report, titled “Renewing the National Commitment to the Interstate Highway System: A Foundation for the Future,” is really a strongly political statement in favor of more and more road building.  The title clearly signals the messaging: this is a backward-looking plea for funds, asking us to repeat what we did in the past.

And even if you didn’t read the title, the cover illustration pretty much gives away the game:  This is all about rationalizing building more highway capacity.

Alternate title: “NO FUTURE FOR YOU.”

 

Climate Change:  That’s someone else’s problem

There’s an overwhelming consensus in the scientific world that anthropogenic climate change is rapidly reaching irreversible and catastrophic proportions. The Intergovernmental Panel on Climate Change (IPCC) has issued a dire warning that we have just a little over a decade to save the planet.

If you wade through the first 118 pages of the TRB report, you’ll finally reach a mention of climate change.  A short sidebar (Box 4.2) concedes that, by stimulating vehicle use and car-dependent travel patterns, the Interstate Highway System contributed to the problem.  And the solution is to . . . wait for somebody else to develop low-carbon and no-carbon transport technologies:

. . . a transformation to a low- and no-carbon transportation system will increasingly mean that [freeway] planning is integrated with the planning of low-carbon mobility options, from public transit to zero-emission trucks.  Many states, counties, and cities are investing in low-carbon transportation solutions, seeking to create new opportunities for both low-carbon mobility and economic development.

The report acknowledges the reality of climate change, but largely ignores and minimizes the contribution of vehicle travel to carbon emissions. Instead, the report is chiefly concerned with using climate change as yet another excuse to spend more money on rebuilding highways (to harden them against flooding, storms and other climate-related disruptions). In essence the report paints highways (and highway departments’ distressed budgets) as victims of climate change, rather than one of its principal causes.

We’re told that the Interstate Highway System accounts for a mere 7 percent of US greenhouse gas emissions.  That of course misses the fact that in many states, highway travel accounts for 40 percent of greenhouse gas emissions, and unlike other sources of carbon pollution, it is increasing–for example, causing both Oregon and California to lose ground on their climate change objectives.

More importantly, the Interstate Highways make all of us more car dependent, whether we travel on the Interstate or not. The report acknowledges–grudgingly, and in passing–the toll the Interstate system took on cities:

It is generally understood that urban Interstates and other freeways contributed to suburbanization and the depopulation of many major U.S. cities, which accelerated after World War II in concert with an expanding middle class and the fast-growing personal motor vehicle fleet. Although many other factors were at work, the Interstate System facilitated greater dependence on the automobile for commuting to work and other household and social activities.
However, the effects of urban Interstates were not entirely beneficial for center cities and their neighborhoods. The postwar mass movement of people and employers to the suburbs led to the loss of center city population, a declining housing stock, and impoverished urban neighborhoods (TRB 1998).

“Not entirely beneficial” is perhaps the most banal way of conceding the fact that freeways devastated cities and amplified segregation, problems that plague us today. Nathan Baum-Snow has estimated that each additional radial freeway constructed in a metropolitan area reduced the central city’s population by 18 percent.

One has to be very resolute to overlook all of the negative consequences of increased car dependence wrought by the interstate highway system, in the form of sprawl, deaths and injuries (especially to vulnerable non-car road users), pollution, and the destruction of urban neighborhoods.  The supposed economic benefits of the original interstate highway system in the 1950s and 1960s (which benefitted the trucking industry and suburban land developers, to be sure) would not be repeated by additions to the system in the 2020s and beyond. Careful scholarship on road investments has shown that the economic return on investment from additional highway capacity has fallen to almost zero in the past several decades. And that doesn’t allow for a full accounting of the social, environmental and health effects of car dependence.

TRB’s Vision:  One and a quarter trillion more miles of driving annually by 2040

The case for “renewing” our commitment to expanding the Interstate Highway System is predicated on forecasts of increased driving activity.  While the report is careful to couch its findings in technical terms and talk about the smallest possible numeric increments (1.5 percent per year), the implications of its traffic projections are for a huge increase in the volume of driving.  At their mid-range figure of 1.5 percent annual growth, vehicle miles traveled in the US will increase by about one and a quarter trillion miles annually by 2040, up from about 3.2 trillion today to nearly 4.5 trillion in 2040.
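The compound-growth arithmetic behind those figures is easy to check. Here’s a minimal back-of-the-envelope sketch, using the round numbers from this post (the 3.2 trillion mile base and a 2019–2040 horizon are our assumptions, not TRB’s exact inputs):

```python
# Rough check of the mid-range forecast: 1.5% annual VMT growth from
# roughly 3.2 trillion miles per year (assumed base year 2019) to 2040.
# Figures are round approximations from the post, not TRB's exact inputs.

base_vmt = 3.2e12   # annual vehicle miles traveled, roughly today
rate = 0.015        # TRB mid-range annual growth rate
years = 2040 - 2019

vmt_2040 = base_vmt * (1 + rate) ** years
added = vmt_2040 - base_vmt

print(f"Projected 2040 VMT: {vmt_2040 / 1e12:.2f} trillion miles")
print(f"Added annual driving: {added / 1e12:.2f} trillion miles")
```

Compounding 1.5 percent over roughly two decades yields about 4.4 trillion miles a year, an increase on the order of 1.2 trillion miles, consistent with the “one and a quarter trillion” figure above.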

The climate implications of all that driving?  Not TRB’s problem. Automakers or somebody might electrify vehicles, and reduce carbon emissions, but that’s not something that highway engineers are going to worry about–at all.

It’s worth questioning whether that 1.5 percent annual increase makes any sense.  The report and its technical appendix are careful to hedge the forecasts with all kinds of qualifications about the economic and technological uncertainty of predicting future travel patterns, but when it comes down to it, they conclude that come hell or high water (and what with global warming, it’s likely to be both), traffic will only go up, somewhere between 0.75 percent and 2 percent annually.
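Running the same compound-growth arithmetic across that low-to-high range shows how much rides on the assumed rate; this sketch again assumes the post’s rounded 3.2 trillion mile base and a 21-year horizon:

```python
# Cumulative VMT growth by 2040 under the report's low (0.75%),
# mid (1.5%), and high (2%) annual growth rates.
# The 3.2 trillion mile base and 21-year horizon are assumptions.

base_vmt = 3.2  # trillion vehicle miles traveled per year
years = 21

for rate in (0.0075, 0.015, 0.02):
    vmt = base_vmt * (1 + rate) ** years
    print(f"{rate:.2%}/yr -> {vmt:.2f} trillion miles ({vmt - base_vmt:+.2f})")
```

Even the low end of the range adds roughly half a trillion miles of annual driving; the high end adds more than one and a half trillion. Every scenario assumes driving only goes up.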

A quick review of recent trends in driving is in order.  The key metric here is “vehicle miles traveled.” Last year Americans drove about 3.2 trillion miles, according to the US Department of Transportation. The following chart shows the trend in VMT since 1970, with actual values in blue, the TRB’s baseline forecast in red, and an alternative lower forecast (which we’ll explain in a moment) in green. The long term trend, particularly from the 1970s until the turn of the millennium, was a steady increase in driving year over year, but after 2000 things began to change.

At first the rate of increase slowed and then declined; from 2004 through 2014 per capita vehicle miles traveled actually declined. For economists, this was hardly a surprise:  real fuel prices increased dramatically after 2004, and remained high through 2014; it was only after the oil market bust that year that driving started going up.  Interestingly the growth rate for the four years 2014 to 2018 was slightly more than 1.5 percent, the exact figure that TRB forecasts to continue for the next two decades.  Implicitly, TRB is assuming that driving will grow for the next 20 years or so at the same pace that it managed (a) after a decade of stagnation, and (b) over a period of time in which real fuel prices fell by more than 40 percent.

The experience of 2004 to 2014 suggests that a very different future is entirely possible.  Somewhat higher gas prices–high enough to reflect the social and environmental cost of carbon–are likely to depress, if not entirely eliminate, the growth in VMT.  If gas prices came to resemble those of the 2004-14 period, we would expect driving to increase only very slowly, as shown in the green line. The experience of the past decade shows that it’s entirely possible for the US to have a much slower rate of increase in driving.  All it takes are the right policies.
And increasingly, policies that discourage driving will be essential to saving the planet.  As economists of every political stripe have stressed, some sort of carbon tax is essential to lowering carbon emissions–plus they’ll have the side benefit of lowering the need to build extremely expensive highway infrastructure.  An intelligent report would consider how the US could, building on the experience of leading cities and other countries, work to build communities that simply require less driving, thereby reducing carbon pollution, lowering road costs, and not incidentally, mitigating some of the historic damage done to cities by the construction of the interstate highway system.

The upward slope of that red line and the added trillion miles we’re assumed to drive each year is the oldest trick in the highway engineer’s book. “More cars are coming!  It’s an inexorable, immutable force that we must respond to.  We must expand capacity.” But these forecasts have been repeatedly shown to be wrong.  For example, Clark Williams-Derry exposed the serial misrepresentation by the Washington Department of Transportation:

Short answer: “No.” (Chart: Sightline Institute)

The engineers routinely ignore or grossly downplay the effects of induced demand–that the principal reason that driving is increasing is that we’re building vastly more un-priced road capacity.  More road capacity, generates more sprawl, longer trips and more VMT, in a never-ending, self-reinforcing cycle, one which is now so well established as to be called the “fundamental law of road congestion.”

If we build highways for another trillion miles of vehicle travel we’re actively making the carbon pollution problem worse, not, as engineers would have it, passively responding to some unchanging natural trend.  If Americans drive a trillion more miles each year by 2040, it’s going to make it vastly more difficult to reduce greenhouse gases. Failing to acknowledge that fundamental fact, and suggesting a casual mention of vehicle electrification absolves them of any responsibility for thinking further about this question is simply irresponsible.

Same old punchline: Give us billions more

As Chuck Marohn of StrongTowns has pointed out, reports by engineering groups are almost invariably self-serving demands for more money thinly disguised as impartial technical advice. Not surprisingly, the TRB report’s primary recommendation is that we give more money to highway builders–lots more money. The report calls for doubling to tripling the amount of money spent on Interstate highways:

Recent combined state and federal capital spending on the Interstates has been about $20–$25 billion per year. The estimates in this study suggest this level of spending is too low and that $45–$70 billion annually over the next 20 years will be needed

And a big chunk of this funding would be explicitly earmarked for capacity expansion.  The report is vague, but its “blueprint” says $22 billion of a $57 billion pot would be set aside for capacity expansion, with the remainder for pavement, operations and bridges (categories that are often used to fund wider roadways–bridges that are “rebuilt” are almost invariably widened).

While the report talks a good game about aging bridges and multiplying potholes, there’s abundant evidence that when additional revenue is available, maintenance is still deferred in favor of marquee capacity expansions. State Departments of Transportation (which are allocated the bulk of money for the interstate system) have repeatedly engaged in bait-and-switch tactics: marketing potholes and spending on wider roads.

This is a report that looks backward, learns nothing, and is destined to repeat the mistakes of the past, while remaining obstinately blind to the manifest threat that climate change poses to our collective global future. Now is not the time to be squandering precious billions on more and wider roads, and stimulating trillions of additional miles of vehicle travel. This report tragically evades the most serious problem we face and avoids shouldering any responsibility for meaningful action. It’s difficult to imagine that anything so willfully narrow-minded and self-serving could be allowed to masquerade as the product of scientific endeavor.

National Academies of Sciences, Engineering, and Medicine 2019. Renewing the National Commitment to the Interstate Highway System: A Foundation for the Future. Washington, DC: The National Academies Press. https://doi.org/10.17226/25334.

It’s back, and it’s still wrong: the Urban Mobility Report

After a four-year hiatus, the Texas Transportation Institute has once again generated its misleading Urban Mobility Report–and it’s still wrong.

The UMR has been comprehensively debunked–it has never been peer-reviewed, nor have its authors responded to authoritative critiques; it relies on a series of false premises, penalizes cities with compact development patterns and short commutes, ignores non-automobile travelers, and exaggerates all of its key claims.  We’ll have an updated look at the latest iteration of the UMR (which relies on the same discredited methodology).  In the meantime, here’s what we’ve written at City Observatory about this and other similar congestion cost “studies.”

The top ten reasons to ignore the Urban Mobility Report.

Is congestion worse now? The Urban Mobility Report can’t tell us.

Another tall tale from the Texas Transportation Institute.

Boo! The annual carmageddon scare is upon us.

Our essay, the Cappuccino Congestion Index, explains why, fundamentally, the methods used by TTI and others are utterly meaningless as measures of consumer costs.

Unsurprisingly, the newest iteration of the report is sponsored by the Texas Department of Transportation. And as always, the message is “build, baby, build.”  The UMR has always been thinly veiled propaganda for building more and wider roads. It’s not designed either to help understand the root causes of traffic congestion (i.e. underpriced roadways), nor to fashion meaningful solutions.  No one should take it seriously.

Todd Litman of the Victoria Transportation Policy Institute has written a comprehensive critique of the Urban Mobility Report.

Our own detailed critique of the Urban Mobility Report–“Measuring Urban Transportation Performance”– is published here.

Devaluation of housing in black neighborhoods, Part 2: Appreciation

Are home prices appreciating more or less in black neighborhoods? Is that a good thing?

Today, in part 2 of our analysis of the home price gap between majority black and predominantly white neighborhoods we look at the pattern of home price appreciation for black and white home buyers. Yesterday, in part 1 of our series, we looked at a Brookings Institution report showing that homes in majority black neighborhoods sold at a 20 percent discount to otherwise similar homes in otherwise comparable neighborhoods. This systematic undervaluation of housing in majority black neighborhoods works out to a national value disparity of more than $150 billion. We took a close look at how this plays out in Memphis, a metro area with a large African-American population.

Housing appreciation and cyclical variability

A key question the Brookings report leaves unanswered is whether the black/white housing differential is larger or smaller than it was 10 or 20 years ago.  If it was larger in the past and is smaller today, that implies that homes in majority black neighborhoods, although still undervalued relative to homes in predominantly white neighborhoods, have enjoyed greater relative appreciation. From the standpoint of wealth creation, the amount of appreciation since you bought your home is likely to matter more than whether the current price of your house is more or less than otherwise similar properties. Another way of expressing this is that homeowners in black neighborhoods had a lower purchase price (or basis) in their home, and even though it is still undervalued, it may have gained more value in percentage terms than homes in non-majority black neighborhoods.
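The basis point can be made concrete with a small, purely hypothetical example: two identical homes enjoy the same dollar gain, but one was bought at a 20 percent discount. All prices below are invented for illustration; they are not figures from the Brookings data:

```python
# Hypothetical illustration of the "basis" point: a home bought at a
# discount can still deliver a higher percentage return.
# All dollar figures are invented for illustration.

def pct_return(purchase_price, current_value):
    """Appreciation as a share of the original purchase price."""
    return (current_value - purchase_price) / purchase_price

# Identical homes, identical $36,000 dollar gain; the first was
# purchased at a 20 percent discount to the second.
discounted = pct_return(120_000, 156_000)   # lower basis
full_price = pct_return(150_000, 186_000)   # higher basis

print(f"Bought at a discount: {discounted:.0%}")   # 30%
print(f"Bought at full price: {full_price:.0%}")   # 24%
```

The same dollar gain translates into a larger percentage return on the lower basis, which is why a home can be undervalued in level terms yet outperform in appreciation terms.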

Indeed, Dan Immergluck and his colleagues at Georgia State University found that, for those who bought homes in 2012, price appreciation for black homebuyers from 2012 through 2017 was higher than for white homebuyers.  Immergluck’s data show that in most markets, homes bought by black buyers appreciated more than homes bought by white homebuyers.

Immergluck’s study looks at appreciation rates for black homebuyers compared to white homebuyers, rather than in majority black neighborhoods compared to predominantly white neighborhoods.  As Immergluck reports, nationally, a majority of black homebuyers bought homes in neighborhoods that were not majority black. Home prices appreciated more for black buyers in strong market cities (Boston, California) and least in weaker market cities (St. Louis, Birmingham).

There are noticeable variations in home appreciation by race over the course of the business cycle:  Zillow’s Skylar Olsen, in her report “A House Divided,” plotted the change in home values by the predominant racial/ethnic group in each zip code.  Her results show greater volatility for home prices in black plurality neighborhoods.  Her data show home values in plurality black neighborhoods (the green line) and plurality non-Hispanic white neighborhoods (the blue line).  Houses in black neighborhoods declined more in value from 2007 to 2012 (the housing bust) than did homes in white neighborhoods.

Another recent report, written by Michela Zonta for the Center for American Progress, finds that home values in neighborhoods where black homebuyers purchased homes had lower appreciation rates than neighborhoods where white homebuyers purchased homes. But Zonta’s study uses appreciation rates from 2006 through 2017, and looks at home purchases in 2013 through 2017, so it doesn’t reflect the actual return enjoyed by current buyers.  It’s likely that the opposite results reported by Immergluck and Zonta reflect the greater volatility of home prices in predominantly black neighborhoods (with bigger percentage declines in the bust, and bigger percentage gains in the recovery).  Immergluck looks just at the recovery and finds faster appreciation in black neighborhoods; Zonta looks at the whole cycle and finds lower overall appreciation in black neighborhoods.
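That window-dependence can be illustrated with invented numbers: a more volatile market falls harder in the bust and rebounds faster in the recovery, yet can still finish the full cycle with less total appreciation. The percentages below are hypothetical, not figures from either study:

```python
# Why the measurement window can flip the conclusion. A more volatile
# market (bigger bust, bigger rebound) looks better over the recovery
# alone, but worse over the full cycle. Percentages are hypothetical.

volatile = {"bust": -0.35, "recovery": 0.50}   # e.g., a higher-volatility zip
stable   = {"bust": -0.15, "recovery": 0.25}   # e.g., a lower-volatility zip

for name, path in (("volatile", volatile), ("stable", stable)):
    recovery_only = path["recovery"]
    full_cycle = (1 + path["bust"]) * (1 + path["recovery"]) - 1
    print(f"{name}: recovery-only {recovery_only:+.0%}, "
          f"full cycle {full_cycle:+.0%}")
```

With these numbers the volatile market wins on a recovery-only window but loses over the whole cycle, mirroring how Immergluck and Zonta can both be right.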

There’s other evidence of this cyclicality: Zillow’s data show a similar pattern for foreclosed homes, with greater volatility in neighborhoods of color. In the past five years, home values for foreclosed homes in black neighborhoods have rebounded more sharply (doubling in value) than home values for foreclosed homes in white neighborhoods.  Foreclosed homes in black neighborhoods rose in price 110 percent, compared to a 70 percent increase in the value of foreclosed homes in white neighborhoods.

There’s also evidence of variations in cyclicality by price tier of the market.  Lower priced housing tends to have greater price volatility than higher priced housing. Some of the faster growth of lower priced housing in the past few years may simply reflect that cyclical pattern, and especially the greater price volatility of homes in lower price tiers. According to data from the Federal Housing Finance Agency, since 2012, housing prices for the lowest tier of the housing market have increased more rapidly than high tier housing. Nationally, between 2012 and 2017, homes in the low priced tier increased by about 40 percent while homes in the highest priced tier increased by about 20 percent.
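For a rough sense of scale, those cumulative tier gains can be converted to average annual rates; the five-year window and the round 40 and 20 percent figures are approximations taken from the paragraph above:

```python
# Converting cumulative tier gains into average annual growth rates.
# The five-year window and round figures approximate the FHFA data
# described in the text.

def annualized(total_gain, years):
    """Geometric average annual rate implied by a cumulative gain."""
    return (1 + total_gain) ** (1 / years) - 1

low_tier = annualized(0.40, 5)    # roughly 7% per year
high_tier = annualized(0.20, 5)   # roughly 3.7% per year

print(f"Low tier:  {low_tier:.1%} per year")
print(f"High tier: {high_tier:.1%} per year")
```

On an annualized basis, low-tier homes appreciated nearly twice as fast as high-tier homes over this window.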

If Black and Latino households are buying primarily in the lower priced tiers and white households are buying primarily in the higher priced tiers, then this may account for the difference in appreciation observed in Immergluck’s statistics.

Unanswered questions and what’s next

There’s little question that homes in black neighborhoods are systematically devalued compared to otherwise similar homes in white neighborhoods. The resulting devaluation means that homeowners in black neighborhoods have accumulated less wealth than they might have otherwise, but also somewhat paradoxically means that housing in these neighborhoods is more affordable, especially for renters. The causes of this devaluation are deep-seated and slowly changing. The fact that homebuyers, both black and white, have apparently recognized the size and persistence of this value (and appreciation) difference likely prompts investment decisions that only reinforce its existence.

In theory, we should expect the low price of equivalent quality homes in predominantly black neighborhoods to attract buyers, whether they be value-seeking white buyers who are indifferent to or actively seek out neighborhoods of color, or black households. But as Jason Segedy has pointed out, low prices may be a marker of stigma, decline or low expectations, and produce a dynamic that makes it hard to attract new buyers.

What to do about this problem is equally complicated.  Rising home values in predominantly black neighborhoods would represent a gain in wealth for incumbent homeowners, but would also imply at least a nominal decline in affordability. This reflects the deep-seated contradiction between our two goals for housing policy: we want housing both to be a great investment, and to be affordable. Debates over housing value disparities will always confront this obstacle.

Portland’s food cart pod is dead, long live Portland’s food cart pods!

How food carts illustrate the importance of dynamic change in cities.

There’s a tension in the city between the permanent (or seemingly permanent) and the fleeting, between the immutability of the built environment and the minute-by-minute change in human behavior. Great cities not only change, but they excel at incorporating and encouraging change.

There are deep lamentations and wailing about the death, three weeks ago, of one of Portland’s largest and longest-running food cart pods. For more than a decade, a cacophony of cuisine sprang up on the edges of a surface parking lot between 9th and 10th Avenues and Alder and Washington Streets.  You had your choice of everything from tacos to tempeh, pad thai to potato knishes. The site was a lively gathering place, especially at lunchtime and in the evenings.

But it’s all gone now. The carts have been cleared away.

 

The block is being leveled for a new 35-story development that will include offices, residences and a Ritz-Carlton Hotel (the first in the Pacific Northwest).  On one level it’s the story of the most diabolical capitalism imaginable: hardy family-run immigrant entrepreneurs tossed out on their ear to make way for a chic hotel that caters to the one percent. (For good measure, the block is part of an Opportunity Zone, so investors are likely to get a cushy capital gains tax break on the $600 million project.)

A tower rises where food carts once ruled: The Block 216 development, (Next Portland)

The foodies at Eater plaintively asked “Can Portland’s food carts survive the city’s development boom?” Their article somewhat misleadingly appeared to blame city government for the decision to close the carts, saying “When the city shut down a downtown cart pod to make way for a hotel, dozens of small-business owners — many of whom are immigrants — felt left in the lurch.”  While the parking lot that hosted the carts was called “City Center Parking,” it was a private company, operating on private land, and the plans for developing the site have been widely publicized for a year now. Still, the carts that occupied the site are scrambling to find new locations.

In true Portland fashion, however, steps are already underway to buffer the impact of the change. Ultimately, the new development will include a street-facing “food hall” space for small-scale food proprietors.  (We doubt seriously that it will approach the gritty edge-of-the-parking-lot charm of what it replaced, but it does at least provide space for some businesses to continue.) That’s at least two to three years off, and won’t have enough space for everyone to return, so the city is setting aside space in one of its North Park Blocks, just a couple of blocks away, for the food carts to settle.

But the bigger picture is that food carts are, and ought to be, a decidedly mercurial and always-evolving aspect of urban space.  They’re great for quickly activating under-used spaces at low cost; the number and composition of carts in a pod is steadily changing as different carts go out of business, move to different locations, or, in a handful of cases, make the jump to brick and mortar.  And pods themselves are cheap market research: a pod in the burgeoning East Side Industrial District (on Stark and Martin Luther King) flourished for a while, but has reverted to parking (and staging for nearby construction).  Southeast Portland’s popular Tidbit food cart pod, hailed as one of the best in the city, lasted just three years, and has given way to an apartment building–but other pods have started or grown nearby.

Tidbit: A great food cart pod that lasted just three years.

And fear not, gentle (or hungry) reader:  Portland continues to have a robust street-food scene. At last count there are more than 500 licensed mobile food vendors in the city–so many, and changing so frequently, that there’s a full-time website dedicated to tracking their comings and goings.  There are food cart pods in neighborhoods throughout the city, each with its own distinct atmosphere and assortment of cuisines. Plus, they represent the adaptive reuse of urban space. One food cart pod on Killingsworth Street is a re-purposed gas station, which has seamlessly woven the garage, pump island and surrounding lot into a series of interconnected dining spaces and play areas. And food carts are a great entrepreneurial opportunity for immigrants: helpfully, Multnomah County, which licenses carts, provides application materials and guidance in Spanish, Chinese, Russian, Thai, Korean, Vietnamese and Arabic.

The ephemeral quality of food carts also needs to be viewed in the context of the restaurant business, which itself is rife with turnover. The restaurant business is a fickle and fashion-oriented one, and the struggle to develop the next new thing is only slightly less pressing than in the tech world.

The real lesson about food carts and food cart pods may not be so much about protecting the existing locations, but instead making sure that it’s always possible to up-cycle, even temporarily, under-used bits of the urban landscape. Cities, at their best, are living and dynamic, and open to new ideas, new businesses and new arrangements.  The openness to new things necessarily means that at least some of the old things will change. The opportunities to innovate, improve, and occasionally fail will move the city forward.  So, to answer Eater’s question, food carts will continue, even with Portland’s development boom.  They’ll find new niches and adapt as the city changes, which is a lesson for all of us. Bon appétit!

How gentrification benefits long-time residents of low income neighborhoods

The new Philadelphia Fed study of gentrification is the best evidence yet that gentrification creates opportunity and promotes integration

To many, “gentrification” is intrinsically negative. When wealthier, whiter people move into a neighborhood, it must necessarily mean that lower income people of color are either driven away (to even worse neighborhoods) or suffer from higher rents and loss of community if they stay.

A new study from Quentin Brummet of the University of Chicago and Davin Reed of the Philadelphia Federal Reserve Bank is the best evidence yet that this view of gentrification is fundamentally wrong.  Gentrification creates substantial benefits for long-time residents of low income neighborhoods, and causes little displacement. The study shows:

  • There are very small differences in out-migration rates between gentrifying neighborhoods and otherwise similar neighborhoods that don’t gentrify. Over ten years, we would expect about 60 percent of less educated renters to move out of non-gentrifying low income neighborhoods, compared to about 66 percent of less educated renters in gentrifying neighborhoods.
  • Those who leave gentrifying neighborhoods do not end up in destination neighborhoods that are measurably worse than the destination neighborhoods of households moving out of poor but non-gentrifying neighborhoods.
  • The demographic composition of gentrifying neighborhoods, post-gentrification, remains highly mixed, with many less educated renters and homeowners.
  • Less educated renters who remain in gentrifying neighborhoods don’t see significant increases in rents:  there’s no appreciable difference in rent increases between less educated renters living in gentrifying and non-gentrifying neighborhoods.
  • Demographic changes in gentrifying neighborhoods are those generally associated with better outcomes for low income children growing up in these neighborhoods.
  • Poor neighborhoods that don’t gentrify don’t stay the same, they decline:  non-gentrifying neighborhoods lose population and experience declining incomes.
  • Many existing studies of gentrification fail to account for high levels of migration even in non-gentrifying neighborhoods, don’t disaggregate effects by socioeconomic status, and rely primarily on median rent statistics, producing misleading pictures of neighborhood change.

The Brummet and Reed study focuses on the 100 largest US metro areas, and measures change in key neighborhood characteristics between 2000 and 2010-14, using Census data. An important innovation of their work is linking data for individuals across the two time periods, to measure in detail what happened to a neighborhood’s original residents. The study uses a definition based on changes in the relative educational attainment of adults in Census tracts; gentrifying neighborhoods are the low income census tracts in central cities of the nation’s 100 largest metro areas that recorded the highest rates of increase in adult educational attainment over the past decade or so.  They use educational attainment as their key metric for sorting households by socioeconomic status, comparing results for more-educated (college and higher) and less-educated households.

Displacement is negligible

The most common critique of gentrification is the notion that it forces long-term residents, especially low income renters, out of the neighborhood. Brummet and Reed stratify households by education level (which is a good proxy for income).  They find that gentrification has a very small impact on the tendency of less educated renter households to move away from the neighborhood.  Over a ten-year period, about 60 percent of less educated renters moved out of their neighborhood, regardless of whether it gentrified.  The authors estimate that for less educated renter households in gentrifying neighborhoods, the rate is about 6 percentage points higher (66 percent vs. 60 percent) over the period of a decade.
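The displacement arithmetic here is simple enough to check directly. This quick sketch uses the rounded figures reported in the paper (60 and 66 percent over a decade) to back out how much of the observed churn in gentrifying tracts is attributable to gentrification itself, rather than ordinary turnover:

```python
# Illustrative arithmetic using the rounded out-migration figures Brummet
# and Reed report for less educated renters over roughly ten years.
baseline_out_rate = 0.60      # out-migration from non-gentrifying low income tracts
gentrifying_out_rate = 0.66   # out-migration from gentrifying tracts

# The marginal effect of gentrification on out-migration: 6 percentage points.
marginal = gentrifying_out_rate - baseline_out_rate

# Of all moves out of a gentrifying tract, the share attributable to
# gentrification itself (rather than background churn): about 9 percent.
attributable_share = marginal / gentrifying_out_rate

print(f"marginal effect: {marginal:.0%}")
print(f"share of moves attributable to gentrification: {attributable_share:.0%}")
```

In other words, even in gentrifying tracts, roughly nine of every ten moves would have happened anyway.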

Another common concern about those who migrate out of gentrifying neighborhoods is that somehow they are forced to live in locations that are measurably worse than the gentrifying neighborhood they left. Brummet and Reed compare the characteristics of the destination neighborhoods of those who left gentrifying and non-gentrifying tracts, and found no significant differences:

. . . for all types of individuals, movers from gentrifying neighborhoods do not experience worse changes in observable outcomes than movers from non-gentrifying neighborhoods. That is, they are not more likely to end up in a higher poverty neighborhood, to become unemployed, or to commute farther than individuals moving from non-gentrifying neighborhoods

To the extent that there is demographic change in gentrifying neighborhoods, it is a product of the different characteristics of people who move in to a neighborhood, and not due to an increase in the number of people who move out.

Exposure to poverty declines

The benefit of gentrification is that these more mixed income neighborhoods lower the exposure to poverty for all residents.  Brummet and Reed estimate the average poverty rate of neighborhoods that gentrified declined by 3 percentage points.  As a wide body of research has shown, the poverty rate of one’s neighborhoods tends to aggravate all of the negative effects of poverty, so reductions in poverty improve living conditions and opportunity for residents who remain. The key question is whether many long-term residents of these neighborhoods are able to remain and enjoy these benefits.

The archetypal description of gentrification, from Ruth Glass, who coined the term in the 1960s, is that persons from an upper class totally replace the original lower class residents of a neighborhood. Brummet and Reed show that even in the most gentrified neighborhoods, the amount of demographic change over a decade and a half is relatively small, and gentrified neighborhoods have a high degree of income diversity, with 30 to 60 percent of long term residents remaining, post-gentrification (about the same as in non-gentrifying neighborhoods).  They conclude:

. . . less-educated renters and less-educated homeowners each make up close to 25 percent of the population in gentrifiable neighborhoods, and 30 percent and 60 percent, respectively, stay even in gentrifying neighborhoods. Thus, the benefits experienced by these groups are quantitatively large . . .

Rents for less educated are unaffected by gentrification

After displacement, perhaps the second most commonly cited harm of gentrification is the notion that long term residents are forced to pay higher rents. Brummet and Reed look at the actual levels of rents paid by less educated residents both pre- and post-gentrification.  Contrary to the received wisdom on the topic, they find that rents did not increase for less-educated long-term residents:

. . . somewhat surprisingly, gentrification has no effect on reported monthly rents paid by original resident less-educated renters.

The reason likely has to do with the variation in rental unit size and quality within gentrified neighborhoods.  If many housing units remain smaller, with fewer amenities, or less favorably located, they may rent for less.  It’s entirely possible for median rents to increase in a neighborhood while rents for a less desirable segment of housing don’t increase, or don’t increase as much. As we’ve noted at City Observatory, it may be more appropriate to look at the 25th percentile of rents to judge affordability for lower income households.  As Brummet and Reed conclude:

These results caution against using simple neighborhood median rents when studying gentrification, as is almost always done. Changes in median rents can miss important segmentation and heterogeneity, leading to incorrect conclusions about how the housing costs paid by different types of households are actually affected.

The dynamics of neighborhood change

Implicitly, much of the discussion of gentrification takes a highly static view of neighborhoods.  It assumes that, in the absence of gentrification, a neighborhood will somehow stay just as it is.  As Brummet and Reed show, all neighborhoods, including both gentrifying and non-gentrifying lower income neighborhoods, experience substantial turnover in population over the course of a decade or so.

Neighborhoods are always changing, and the low income neighborhoods that don’t gentrify don’t somehow stay the same:  they lose population, and experience declines in income.  The average low income neighborhood that didn’t gentrify between 2000 and 2012 lost 8 percent of its population, and saw a decline in average incomes of 21 percent (Tables 4 and A5). Also:  change in gentrifying neighborhoods is not a zero sum game, with each new resident replacing, one-for-one, a previous resident:  Gentrifying neighborhoods saw their total population increase 21 percent.  These results parallel closely the findings from City Observatory’s own Lost in Place study of neighborhood change from 1970 to 2010 that showed that high poverty neighborhoods that didn’t gentrify lost 40 percent of their population over four decades, while gentrifying neighborhoods recorded increases in population.  This underscores why adding new housing in gentrifying neighborhoods can be an important ingredient in assuring they remain diverse and equitable.

Are these results really surprising?

While the Detroit Free Press labeled these results “surprising”, — “Study reveals surprising results about Detroit gentrification,” — they actually fit the pattern of a growing body of careful studies of neighborhood change.  Martin and Beck found that there was almost no difference in out-migration between gentrifying and non-gentrifying poor neighborhoods.  Ingrid Gould Ellen and her colleagues, using data on kids born to families receiving Medicaid benefits, showed virtually no displacement due to gentrification, and also showed that residents who moved generally relocated to similar neighborhoods. Lance Freeman‘s work has shown that displacement is rare, and Jacob Vigdor has shown that long-time residents are somewhat more likely to stay in gentrifying neighborhoods than non-gentrifying ones.

There’s a decided “man-bites-dog” character to media reporting on this paper.  The Detroit Free Press summarizes the findings as “challenging the notion that gentrification is as harmful as many believe.” The Philadelphia Inquirer headlined its story, “Effects of gentrification on long time residents are not as negative as typically perceived, Philly Fed study says.” CityLab talks about the “Hidden Winners” from gentrification; apparently, gentrification is so widely known to be A BAD THING, that it’s counter-intuitive to think that existing residents might benefit. For example, Kriston Capps of CityLab writes:

Often it goes without saying that the drawbacks of neighborhood change—above all the displacement of existing lower-income residents, but also increases in rents and upticks in cultural conflicts—greatly outweigh any benefits.

It is probably too much to ask, but what the data show is that for many residents and neighborhoods, gentrification is a good thing.  It raises property values for long-time homeowners, increasing their wealth. It doesn’t appear to be associated with rent increases for less educated renters who remain. Poverty rates decline, and objective changes in neighborhood characteristics–notably greater income mixing–are associated with higher levels of inter-generational mobility for kids growing up in such neighborhoods.  In addition, the data show that poor neighborhoods that don’t gentrify steadily deteriorate on these measures. Implicit in much of the popular discussion and press coverage of gentrification is the assumption that neighborhoods that don’t gentrify will stay the same–but they don’t.  Things get worse.  That’s the relevant story for many places, and it’s simply not reported.

Methodology and definitions

What makes this study special is that it tracks residents over time as they move in and out of different neighborhoods. The bugbear of most displacement studies is that they rely on comparing one-time snapshots of neighborhood composition, and don’t track individuals.  There’s also an implicit assumption in much work in the field that any time someone moves out of a neighborhood, that’s evidence of displacement.  But as Brummet and Reed painstakingly illustrate, population dynamics in urban neighborhoods are far more dynamic than usually thought. As we’ve noted at City Observatory, the average tenure for renters in the US is less than two years. With their longitudinal tracking of individual households, Brummet and Reed estimate that about three-fifths to three-quarters of all renters and a third to forty percent of all homeowners will move away from their low income neighborhood over a ten-year period (Table 1). Failing to control for this background of steady change biases many gentrification studies.

Brummet and Reed use confidential and anonymized census data from 2000 and the 2014 American Community Survey to track the residential location of individuals over time. This connected, longitudinal data series is a gold standard for measuring neighborhood change. Snapshot comparison studies can’t tell whether a decline in the number of households with incomes below the poverty line is due to such households enjoying higher incomes, or is due to lower income households moving out of the neighborhood.

One critical ingredient in any study of neighborhood change is the definition of gentrification. Brummet and Reed develop two measures:  one is a continuous measure of demographic change that examines the degree to which a neighborhood gentrifies; the second is a more commonly used binary measure that divides all lower income, central city neighborhoods into two categories, gentrifying and non-gentrifying, based on whether they are in the top 10 percent of such neighborhoods nationally on this index.

Policy Implications: Leverage gentrification for good

Change of all kinds is hard, and for residents of low income communities, there’s a long and sad history of most changes, many of them promised to make things better, only making things worse.  So it’s little surprise that there should be skepticism about gentrification.  But unfortunately, the knee-jerk reaction to the perception of negative effects from gentrification has generally been a series of policies that promise only to make the problem worse:  rent control, NIMBY development restrictions, and inclusionary housing requirements. As Brummet and Reed write:

. . . concern that gentrification displaces or otherwise harms original neighborhood residents has featured prominently in the rise of urban NIMBYism and the return of rent control as a major policy option.

The fact that gentrification is mostly benign, and in many respects beneficial to long-term residents, should change the policy calculus.  Rather than blocking growth and development in gentrifying neighborhoods, cities should recognize that gentrification creates benefits, and also leverage this investment to provide more opportunities for less well-off households to live in great urban neighborhoods.

This could provide new options for policies designed to increase children’s exposure to high-opportunity neighborhoods, for example by targeting subsidies to help them stay in neighborhoods that are improving around them.  . . . Accommodating rising demand for central urban neighborhoods, such as through building more housing, could maximize the integrative benefits we find, minimize the out-migration effects we find, and minimize gentrification pressures in nearby neighborhoods . . .

As we’ve argued at City Observatory, cities should use tax increment financing to subsidize affordable housing in neighborhoods experiencing rapid change.  Tapping the property appreciation in gentrifying neighborhoods leverages increased wealth to assure that a neighborhood is inclusive.

This study is perhaps the best evidence we have yet of the nature and extent of gentrification in US cities.  It should dispel many of the widely repeated myths and misunderstandings about neighborhood change.  If we better understand the dynamics of urban neighborhoods, we’ll view gentrification not as a scourge, but as an opportunity to use the growing demand for urban living to build cities that are more diverse and inclusive.

Quentin Brummet and Davin Reed, The Effects of Gentrification on the Well-Being and Opportunity of Original Resident Adults and Children, Philadelphia Federal Reserve Bank Working Paper 19-30, July 2019.

 

Why homeownership is frequently a bad bet

Home buying is a risky bet: There’s a 30% chance your house will be worth less in five years

It’s widely agreed that promoting homeownership is a key means to help American households build wealth.  But as we and others have argued, homeownership can be a risky and problematic investment for many households–and is fundamentally at odds with the other pillar of housing policy–that housing ought to be affordable.

The latest reminder of the risky business of home ownership comes from financial blogger Felix Salmon, writing at Axios:

You wouldn’t make a 5x leveraged bet on the S&P 500 — not unless you were an extremely sophisticated financial arbitrageur, or a reckless gambler. Even then you wouldn’t put substantially all of your net worth into such a bet. Stocks are just too volatile. But millions of Americans make 5x leveraged bets on their homes — that’s what it means to borrow 80% of the value of the house and put just 20% down.

While homes are much less volatile than individual stocks, they’re just as volatile as the kind of diversified stock indices most people invest in.

The bottom line: Any given home has roughly a 30% chance of ending up being worth less in five years’ time than it is today. If you can’t afford that to happen, you probably shouldn’t buy.
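Salmon’s leverage point can be made concrete with a little arithmetic. The sketch below is a deliberately simplified model (the function name and figures are ours, not Salmon’s): it ignores amortization, transaction costs, taxes, and imputed rent, and just shows how a down payment share translates price swings into equity swings.

```python
# A minimal sketch of why an 80% mortgage is a 5x leveraged bet.
def equity_return(down_payment_share: float, price_change: float) -> float:
    """Return on the buyer's equity, ignoring amortization,
    transaction costs, taxes, and imputed rent."""
    return price_change / down_payment_share

# 20% down means 5x leverage: a 10% price decline wipes out half
# the buyer's equity, and a 20% decline wipes out all of it.
print(equity_return(0.20, -0.10))  # -0.5
print(equity_return(0.20, -0.20))  # -1.0
```

The same arithmetic works in the other direction, of course, which is exactly why home price bets feel so attractive in rising markets.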

Salmon links to data from housing investment firm Unison that plots the volatility of home prices in markets around the country. It turns out that home prices are nearly as volatile as major stock indices:

You can observe this volatility in markets around the country.  Almost on cue, the latest data from several markets, ones you wouldn’t expect, show how risky housing purchases can be. For example, prices in Portland and Seattle have both declined in the past year, showing that even “hot” markets with strong economies can experience home price volatility. Portland prices are down 1.6 percent over the past 12 months, and Seattle’s are down 4.5 percent in the same time, according to Zillow. Zillow also predicts Seattle prices will decline into next year.

Whether homeownership turns out to be a wealth-building or a wealth-destroying endeavor depends a lot on timing and luck–you have to buy the right house, at the right time, in the right neighborhood, for the right price. And the way housing and credit markets generally work, the most financially vulnerable households tend to make their housing investments at the wrong time, in the wrong neighborhoods and for the wrong price, and end up paying more both for their homes and their mortgages, putting them in the position of maximum exposure to market volatility.

There’s a wealth of biased information out there about the merits and safety of homeownership.  While media reports frequently draw attention to hot markets, and emphasize places where prices are increasing, there is considerable variability. It’s still the case, more than a decade after the housing bust, that inflation-adjusted home prices have yet to recover to their pre-collapse highs.  Even in the worst of times, for example, realtors are telling people “it’s a great time to buy a home.” In fact, when it comes to the real estate market, it’s always a good time for the buyer to beware, or at least be aware, of the risks of homeownership.

Note:  The commentary has been revised to more accurately characterize the business specialization of Unison.

Why are US drivers killing so many pedestrians?

US drivers are killing 50 percent more pedestrians, European drivers are killing a third fewer

If anything else–a disease, terrorists, gun-wielding crazies–killed as many Americans as cars do, we’d regard it as a national emergency. Especially if the death rate had grown by 50 percent in less than a decade.  But as new data from the Governors Highway Safety Association (via Streetsblog) show, that’s exactly what’s happened with the pedestrian death toll in the US.  In the nine years from 2009 to 2018, pedestrian deaths increased 51 percent, from 4,109 to 6,227:

There are lots of reasons given for the increase:  distracted driving due to smart phone use, a decline in gas prices that has prompted even more driving, poor road design, a culture that privileges car travel and denigrates walking, and the increasing prevalence of more lethal sport utility vehicles.  Undoubtedly, all of these factors contribute.

While some may regard a pedestrian death toll as somehow unavoidable, the recent experience of European countries as a group suggests that there’s nothing about modern life (Europeans have high rates of car ownership and as many smart phones as Americans) that means the pedestrian death toll must be high and rising. In fact, at the same time pedestrian deaths have been soaring in the US, they’ve been dropping steadily in Europe.  In the latest nine-year period for which European data are available, pedestrian deaths decreased from 8,342 to 5,320, a decline of 36 percent. Here are the data from the European Road Safety Observatory:

In the past decade, Europe and the US have reversed positions in pedestrian death rates.  It used to be that the number of pedestrian deaths per million population was higher in Europe; now the US pedestrian death rate per million population is 75 percent higher than Europe’s. The following chart compares the change in pedestrian death rates over the last nine years for which data are available for both Europe and the US.
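The raw counts cited above imply the percentage changes directly; here’s the arithmetic, using the figures quoted in this post from the GHSA and the European Road Safety Observatory:

```python
# Percent change in pedestrian deaths, from the counts quoted above.
def pct_change(start: float, end: float) -> float:
    return (end - start) / start

us = pct_change(4109, 6227)      # US, 2009 to 2018
europe = pct_change(8342, 5320)  # Europe, latest nine-year period available

print(f"US: {us:+.1%}")          # about +51.5%
print(f"Europe: {europe:+.1%}")  # about -36.2%
```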

It’s worth noting that walking is far more common in Europe, streets are generally narrower, and in older cities there often aren’t sidewalks; pedestrians share the roadway with cars.  Despite these factors, Europe now has a lower pedestrian death toll per capita than the US.

We walk less, but we die more.

These data should be at once heartening and discouraging to advocates of Vision Zero.  On the one hand, they show that it is entirely possible to have a modern economy, with technology and with lots of cars, that doesn’t kill so many pedestrians.  On the other hand, it shows that the US is very much on the wrong track.

The devaluation of black neighborhoods: Part 1.

Lingering racism holds down property values in majority black neighborhoods

For most American households, their home is their largest financial asset; how valuable that asset is, and whether it appreciates has a profound impact on a household’s financial well-being. Unsurprisingly, a big component of the racial wealth gap in the United States has to do with differences in rates of homeownership, and also in the value of homes owned by black and white households.

There’s a profound gap between the value of homes in majority black and predominantly white neighborhoods in the nation’s metropolitan areas, according to estimates from the Brookings Institution. Even after adjusting for home characteristics and neighborhood attributes, homes in majority black neighborhoods sell at a 20 percent or greater discount to otherwise similar homes in predominantly white neighborhoods.

The reasons for and implications of the value gap between white and black neighborhoods are complex. Lower value housing means incumbent homeowners have less wealth–but also implies that rents are lower and housing is more affordable in black neighborhoods than predominantly white ones. The devaluation of housing in black neighborhoods is primarily a result of the continued resistance many white homebuyers have to considering black neighborhoods, but also likely reflects the growing tendency of higher income black households to move to more suburban and integrated neighborhoods.

The Black/White Home Value Gap in Memphis

To get an idea of what this means in just one city, we look at the data for Memphis.

  • Homes in majority black neighborhoods in Memphis are devalued by $2.3 billion compared to homes in predominantly white neighborhoods, according to estimates from the Brookings Institution.
  • The typical home in a majority black neighborhood in Memphis sells for about $88,500, about $25,000 less than a similar house in a predominantly white neighborhood.
  • Depressed values in black neighborhoods are correlated with lower wealth, but paradoxically may contribute to affordability.

These estimates come from a recent report from the Brookings Institution that undertakes a detailed comparison of home prices in majority black neighborhoods compared to other neighborhoods in the United States. The report–The Devaluation of Assets in Black Neighborhoods–by Brookings scholars Andre Perry and David Harshbarger and Gallup’s Jonathan Rothwell uses home price data from 113 cities around the country and finds a consistent pattern of undervalued homes in majority black neighborhoods.

The Brookings Institution compared the sales value of owner-occupied homes in majority black neighborhoods with otherwise similar homes in exclusively white neighborhoods and found that the houses in black neighborhoods sold at a more than 20 percent discount to the homes in white neighborhoods. Here is a summary of the report’s findings for Memphis.

 

What this table means: This table shows that there are nearly 92,000 owner-occupied homes in majority black neighborhoods in the Memphis metropolitan area; of these, nearly 70,000 are owned by black homeowners. In the aggregate, these homes are worth about $8 billion, but would be worth more than $10 billion, if they were valued the same way as otherwise similar homes in similar, but predominantly white, neighborhoods.  The typical home in a majority black neighborhood has a market value of about $88,000, which is about $25,000 less than a similar home in a majority white neighborhood.

The figures in this table reflect the difference in home values after controlling for house and neighborhood characteristics. The gross difference (without these controls) between home values in majority black neighborhoods and exclusively white neighborhoods is even larger.  Majority black neighborhoods tend to be older, have smaller houses, and have fewer amenities and higher rates of poverty, compared to exclusively white neighborhoods.  Perry and his colleagues used a regression analysis to statistically control for the effects of structural characteristics (home size, age, etc.), and neighborhood characteristics (crime rates, schools, commuting distances).  Even after adjusting for these effects–which explain some of the variation in home prices among neighborhoods–homes in majority black neighborhoods were still undervalued.
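To illustrate the kind of regression adjustment described here, below is a toy hedonic model on synthetic data. Everything in it is invented for illustration: the variable names, the coefficients, and the built-in 20 log-point discount are ours, not the Brookings estimates. The point is simply that regressing (log) price on structural controls plus a neighborhood indicator lets the indicator’s coefficient capture the residual discount.

```python
# A toy hedonic regression on synthetic data: regress log sale price on
# structural controls plus a majority-black neighborhood indicator.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
sqft = rng.normal(1600, 400, n)          # home size (invented)
age = rng.uniform(0, 80, n)              # home age (invented)
majority_black = rng.integers(0, 2, n)   # neighborhood indicator

# Synthetic prices with a built-in 0.20 log-point discount for
# majority-black tracts, plus noise.
log_price = (11.0 + 0.0004 * sqft - 0.002 * age
             - 0.20 * majority_black + rng.normal(0, 0.1, n))

# Ordinary least squares: intercept, sqft, age, indicator.
X = np.column_stack([np.ones(n), sqft, age, majority_black])
coef, *_ = np.linalg.lstsq(X, log_price, rcond=None)

# The indicator's coefficient recovers the discount (close to -0.20)
# even though prices also vary with size and age.
print(f"estimated majority-black discount: {coef[3]:.2f}")
```

A real hedonic study like Brookings’ adds many more controls (schools, crime, commutes) and uses actual transaction data, but the mechanics are the same.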

Economic Effects of Undervaluation

The data make a strong case that homes in majority black neighborhoods are systematically undervalued in comparison to otherwise similar homes in predominantly white neighborhoods. The magnitude of the disparity–which the authors estimate at $156 billion nationally–is also sizable.

The economic effects of this undervaluation are complicated and manifold.  First, it is obvious that homeowners in these neighborhoods, most of whom are black, have less wealth than they otherwise would have if their homes commanded the same values as those in predominantly white neighborhoods. If it weren’t for this devaluation, homeowners in these majority black neighborhoods would have significantly greater home equity.

Beyond that, the picture is more complex, and the effects are mixed. For renters, lower home and property values probably mean they pay lower rents. And it could also be the case that these same homes have long been undervalued, meaning that while current owners have less equity than if they were fairly valued, they may also have paid a lower purchase price to acquire the property.  The Brookings report is candid about these mixed effects:

. . . the devaluation of rental properties is advantageous to renters, in so far as it results in a lower rental payment for similar quality housing. The devaluation of owner-occupied housing makes it easier to acquire the home, but once purchased, it is unambiguously disadvantageous to the owner and occupier, who would otherwise benefit from being able to refinance, borrow, or sell at a higher valuation.

As we’ve often pointed out, there’s an inherent tension in US housing policy between affordability and wealth creation. Houses in majority black neighborhoods may be more affordable (i.e. sell at a discount to otherwise similar houses in predominantly white neighborhoods), but may therefore represent a loss of wealth for their owners.

What causes values to be depressed?

Why would otherwise similar houses in majority black neighborhoods sell for so much less than comparable houses in predominantly white neighborhoods? The first, and most obvious answer is the lingering effects of racism. Many white homebuyers may avoid searching in predominantly black neighborhoods, and the lower demand for housing in these areas causes prices to be lower.  It is also likely that “steering” by real estate agents — directing white buyers primarily to white neighborhoods — also has this effect.

The decline in values is also likely compounded by the decisions of black homebuyers.  Upper- and middle-income black households may rationally choose to buy homes in more integrated neighborhoods, not just for amenities like schools (which were arguably controlled for in the Brookings study), but also because they may believe that such neighborhoods will appreciate more. Over the past four decades, middle-income and higher-income black households have tended both to suburbanize and to integrate.  According to Patrick Sharkey of New York University, a majority of black middle class households now live in the suburbs and almost as many live in neighborhoods that are not majority black. In 1970, fewer than a third of middle- and upper-income African-American households lived in majority non-black neighborhoods; today it is more than half.  In 1970, only about 20 percent of middle- and upper-income households lived in suburbs; today it is nearly half.

The exit of upwardly mobile black households from majority black neighborhoods has increased economic polarization according to David Rusk of the D.C. Policy Center.  Similarly, the relatively low prices of homes in majority black neighborhoods may mean that these are the only locations that black households with more modest incomes can afford.  The net result may be that lower-income homeowners become more concentrated in majority black neighborhoods. The movement of many black households out of majority black neighborhoods–particularly those households with means–coupled with the continued tendency of white households not to purchase in such areas likely both contribute to the housing price disparities observed in the Brookings report. Finally, as the report points out, the low value of homes in majority black neighborhoods means that homeowners there have less equity in their homes and therefore may find it difficult or impossible to sell and move to a different neighborhood.

Housing prices, like those of other investments, reflect not just the current utility or value of assets, but also the expected return.  Buyers will pay more for housing in areas they expect will appreciate.  Robert Shiller has shown that variations in buyer expectations of future home price appreciation play a key role in the formation of housing bubbles. The same tendency is likely to play out across neighborhoods in metropolitan areas:  Neighborhoods that are perceived as up and coming are likely to have a different price trajectory than neighborhoods that are seen as stagnant or declining.

One finding that these data suggest is that whites are paying a premium to live in all-white neighborhoods.  One implication of the substantial (20 percent plus) differential in home values between comparable houses in majority black and nearly all white neighborhoods is that white households are foregoing the opportunity to get a much less expensive home by buying in a black neighborhood.  Because some white households may be much less averse to having black neighbors, some of what we see as gentrification may be propelled by the substantially lower cost of housing in majority black neighborhoods.

Tomorrow, in part 2 of this series, we’ll take a further look at this issue of wealth disparities, and focus on how housing values in black and white neighborhoods have changed over time, and what challenges that closing this gap poses for policymakers.

 

It’s official: I-5 Rose Quarter freeway widening is a boondoggle

Frontier Group and the OSPIRG Foundation’s annual report on highway boondoggles calls out the Oregon DOT’s wasteful, ineffective I-5 Rose Quarter freeway widening project as a national-level boondoggle.

Portland is famous for making top ten lists when it comes to urban transportation policy, what with its number one in the nation share of big city bike commuters, its pioneering efforts to revive the streetcar and similar measures. But today it’s on a list that clearly signals the city is headed in the wrong direction.

Today, Frontier Group and the OSPIRG Foundation released their annual report identifying the worst highway projects in the nation.  Among the nation’s top nine highway boondoggles for 2019:  the Oregon Department of Transportation’s proposed half-billion-dollar widening of the I-5 freeway near downtown Portland.  The report says:

In Portland, a city that has taken great strides toward more sustainable transportation, an expensive highway project would constitute a step backward to the car-dependent policies of the past. It would also likely fail to meaningfully improve safety compared with other investment strategies.

The award comes on top of a long list of critical comments from national transportation policy experts including Robin Chase, Janette Sadik-Khan, Jeff Speck, and others.

For the past five years, the Boondoggles report has cataloged the worst proposed highway projects in the nation. In several cases, the projects have stalled or been cancelled, with big savings for state and local taxpayers and positive results for transportation and the local economy.  For example, local groups stopped the widening of I-275 in Tampa Heights, and the community is thriving, with new restaurants and businesses, and even efforts to reduce traffic capacity on local roads to improve walking and biking. Wisconsin scaled back its freeway widening plans, shifting more resources to a “fix it first” maintenance strategy designed to take better care of existing roads. Freeway removals, like the closure of Seattle’s Alaskan Way viaduct, showed that reducing road capacity causes traffic levels to decline.

It will come as no surprise to regular readers of City Observatory that the Rose Quarter freeway widening project has all the hallmarks of a boondoggle.  We’ve chronicled dozens of reasons that the project shouldn’t go forward:

All-in-all, the Rose Quarter freeway widening project is a classic boondoggle.  One can only hope that this critical national attention will help state, regional and local leaders recognize that moving forward with the project is a tragic blunder, one at odds with our stated values.

Full disclosure:  City Observatory was provided with a draft copy of the report prior to its publication and was offered the opportunity to comment on the report. It was not compensated for providing its comments.

Note: This post has been revised to reflect the report’s sponsorship by the OSPIRG Foundation.

Fruit and economics: Local goods

Perishable, special, and local: The economics of unique and fleeting experiences

 

I pity you, dear reader.  You likely have no idea what a real strawberry tastes like. Unless you spend the three weeks around  the Summer Solstice in the shadow of this mountain, chances are you have never tasted a Hood strawberry.

Mt. Hood & Hood strawberries: The peak of Oregon & the peak of flavor (in peak season).

The Hood is a variety grown exclusively in the Northern Willamette Valley of Oregon, on family-owned farms scattered around the edges of Portland’s urban growth boundary.

The Hood is as different from an industrial strawberry as an heirloom tomato or a piccolo San Marzano is from its rubbery factory-farm cousin.

You may not know, for example, that strawberries have juice.  They were meant to be a juicy fruit.  The industrial strawberry has been bred to be fibrous, indestructible, and infinitely shelf-stable.

You may also not know that a real strawberry is monochromatic:  It is red, without a trace of white.  When you cut a Hood strawberry open, it is red through and through (and bleeding, in consequence of its wound).

The Hood is a fragile vessel for carrying strawberry juice.  It’s both delicate and perishable, taking about three days from being picked to dissolving. It can’t be shipped. You either get it at a farmer’s market, go to the farm and “U-pick” the berries yourself, or find one of a relative handful of markets that’ll stock the tender things. It’s one of the things, along with the end of the rainy season, that marks the beginning of summer in Oregon.  And it’s just here for a few weeks: gone before the end of June. We’re not alone in our obsession: actual scientists say the same thing about the Hood.

The point here is not to brag on the Hood strawberry (well, not entirely). The point is that in an increasingly globalized world, where everything is the same everywhere thanks to a combination of the World Wide Web, Starbucks, and Fedex, there are still some things that are distinct and different about every single place. These local goods (and services), things that you can’t get unless you’re there, and that you are simply unlikely to know anything about absent local knowledge, are what make a place special. Ubiquity is over-rated. What matters isn’t the ubiquitous, the interchangeable, the digital. What makes things interesting and desirable is that they are special, and different, and even transitory. If you’re not in the right place at the right time, you’ll not discover or enjoy them.

A couple of years ago, on Twitter, Paul Krugman waxed poetic about fruit and economic theory.  Krugman was back from Europe, and thirsting for summer fruits coming into season. That led him to reflect on a fundamental flaw in economic logic: the notion that more choice is always better. The short, uncertain season for his mangoes and figs makes them all the more valuable, not less so. He observes:

 . . . seasonal fruits — things that aren’t available all year round, at least in version you’d want to eat – have arrived. Mangoes! Fresh figs!  What makes them so great now is precisely the fact that you can’t get them most of the year. . . .The textbooks (mine included) tell you that more choice is always better. But a lot of things gain value precisely because they aren’t an option most of the time. I’d probably get tired of fresh figs and mangoes if I could get them all year round. But still, if you imagine that being rich enough to have anything you want, any time you want it, would make you happy, you’re almost surely wrong.

Krugman’s observation rings true.

Every city and every place has some special, idiosyncratic feature; it could be food, or music, or plants, or the smell of the forest after a rain. As Jane Jacobs observed:

“The greatest asset that a city can have is something that’s different from every other place.”

Maybe the thing we need to pay attention to in thinking about the global economy is not “the death of distance” but instead “the dearth of difference.” The more things and places and experiences become standardized, homogenized and universal, the less joy and stimulation we’re likely to get from them. I’m going to grab a handful of Hoods; I hope you’ll enjoy something fresh and local, too.

We are, however, somewhat less obsessed about strawberries than Humphrey Bogart.

Here’s Krugman’s full ode to seasonal fruit, from Twitter:

Look, the planet and the Republic are both in grave danger, possibly doomed. But it’s Friday night, I’ve just had a couple of glasses of wine, so I’m going to talk briefly about … fruit and economic theory.

OK, two pieces of background. I recently got back from almost a month in Europe, cycling and vacationing, and while it’s nice to not be living out of a suitcase, the adjustment back to reality is proving a bit harder than in the past, for a variety of reasons

The other piece of background is that I’m really into breakfast. I start almost every day with fairly brutal exercise – I’m 66 and fighting it; today that meant an hour-long run in the park. Breakfast, usually starting with yoghurt and fruit, is the reward

So one of the best things about coming home is that some seasonal fruits — things that aren’t available all year round, at least in version you’d want to eat – have arrived. Mangoes! Fresh figs!

Are these fruits better than other fruits? Objectively, no. What makes them so great now is precisely the fact that you can’t get them most of the year. And that, of course, tells you that standard consumer choice theory is all wrong

Does this have any policy implications? Probably not. What really really matters is being able to afford health care, decent housing, and good education; the things I’m talking about are trivial.

But still, if you imagine that being rich enough to have anything you want, any time you want it, would make you happy, you’re almost surely wrong. Limits are part of what makes life worth living. And the big question is, will those peaches be ripe by morning?


A solution for displacement: TIF for affordable housing

The case for using tax increment financing for affordable housing in gentrifying neighborhoods

The problem with gentrification is that rising property values may make it expensive or impossible for lower- and moderate-income residents to live in an area. But what if we could tap some of that increase in land values to subsidize affordable housing in affected neighborhoods? We can, and that’s exactly what Portland has done for more than a decade:  It now sets aside 40 percent of the tax increment financing (TIF) revenues raised in urban renewal districts for affordable housing.  Since 2006, the program has generated nearly a quarter of a billion dollars to support affordable housing.

And there’s little question that it has supported income diversity in redeveloping neighborhoods. Portland’s tony Pearl District, adjacent to downtown, has blossomed in the past decade, adding more than 7,000 new residential units, plus offices and stores. The city’s policy has plowed tens of millions of dollars in the tax increment from new construction and rising property values into affordable housing in the neighborhood.  That funding, coupled with other resources, has supported the construction of 2,200 units of affordable housing, interspersed with market rate units. Elsewhere in the city, urban renewal funds are being used to support the construction of affordable apartments and subsidize a homeownership program, with funding targeted to helping residents displaced in previous decades.

Portland’s Pearl District

Using TIF to support affordable housing is a key policy for minimizing the displacement effects of neighborhood revitalization. It taps the increase in property values associated with gentrification and spends that money on building additional affordable housing; thanks to the geographic nature of TIF districts, it puts the funding in exactly the places where development pressure is greatest. In addition, TIF funding is automatically proportional to need: the more property values rise and the more development occurs, the more money TIF generates for housing.

The Ramona: Affordable housing in the Pearl, subsidized with TIF.

And unlike inclusionary housing requirements, which drive up the cost of development and load costs solely on new residential units (restricting supply and likely worsening affordability problems), TIF doesn’t change the development cost calculus for new housing construction.

Why use TIF for affordable housing in gentrifying neighborhoods?

Urban revitalization projects tend to drive up property values, and with them market rents, which creates justifiable concerns about housing affordability and potential displacement. If cities can harness some of the increase in market value, they may be able to generate resources to subsidize the preservation, rehabilitation or construction of affordable housing, and ameliorate these effects.

Lance Freeman of Columbia University, one of the nation’s leading scholars on gentrification, has suggested using tax increment financing to subsidize affordable housing in neighborhoods experiencing rapid change.  He wrote:

“Financing could come from the increase in property values and consequently property taxes in the zone that by definition accompanies gentrification of the type that might cause displacement. The increase in property taxes in the gentrifying neighborhood could be set aside specifically to fund affordable housing in the very neighborhood undergoing gentrification.”

One of the biggest problems associated with affordable housing is achieving scale.  While community land trusts and inclusionary zoning requirements are often touted as solutions to gentrification, they are seldom able to produce many units of housing in a timely fashion. For example, New York’s inclusionary housing program produced fewer than 200 units per year in all five boroughs combined; Portland’s TIF program supported more than that number of units in the Pearl District alone over the same period.

A key advantage of harnessing TIF is that it supports mixed-income neighborhoods. TIF financing is geographically limited to redeveloping neighborhoods, assuring that low- and moderate-income housing gets built in the same areas that are experiencing new market-rate housing construction, automatically precluding the establishment of purely high-income enclaves.

Portland’s TIF for affordable housing program

Portland, Oregon, dedicates approximately 40 percent of its tax increment financing (TIF) revenues to subsidizing affordable housing in urban renewal areas. In the past eight years, the program has generated nearly a quarter of a billion dollars for affordable housing, and helped support the construction and rehabilitation of thousands of units of affordable housing. The program provides affordable housing in neighborhoods undergoing revitalization, and does so without creating disincentives to private-market housing.

For the past decade, Portland has dedicated a portion of the revenue from its tax increment financing to help underwrite the cost of constructing new affordable housing. This program applies in the city’s designated urban renewal areas, which by statute are restricted to include not more than 15 percent of the city’s land area.

This program has evolved over time. Originally, it was applied to a single urban renewal district (the River District, which encompasses the Pearl District).  Since 2010, it has been extended to most urban renewal districts city-wide. Typically, TIF funding is pooled with other sources of funds, such as low income housing tax credits (LIHTC) and state funds. (Additional history of the program is described below). In general, city policy now provides that 40 percent of revenues from tax increment funds in urban renewal zones are to be set-aside for affordable housing. (The policy is waived for some purely industrial urban renewal areas).

Since its inception the program has helped subsidize the construction of several thousand affordable housing units in urban renewal areas citywide.  In the city’s largest urban renewal area, the River District (which includes the rapidly growing Pearl District), the program has generated $83 million in TIF funds which have helped support the construction of about 2,200 affordable housing units. Pearl District affordable housing projects are interspersed with market rate housing in the area. (Red buildings were affordable housing).

Source: (Schmidt, 2014)

The number of housing units financed through the TIF program is significant, especially when compared to the output of other programs local governments have enacted to try to create more affordable housing, such as inclusionary zoning requirements. The total number of units of housing constructed with assistance from the TIF set-aside in Portland’s river district exceeds the number of inclusionary housing units built in all but a handful of cities.

Policy advantages of TIF

Using TIF funds for affordable housing has two especially desirable properties:

  1. TIF funding doesn’t add costs for developers or impose additional development restrictions. Once a TIF district is established, the provisions are essentially invisible to developers. Moreover, developers pay the same property taxes whether they build inside the TIF district or not. The TIF diverts the taxes that would have otherwise been collected by the city (and other tax levying entities) for the life of the TIF district. Thus, unlike other measures (impact fees, inclusionary zoning requirements), TIF financing doesn’t disincentivize housing construction or development.
  2. TIF captures tax revenues both from the value of new investment in the area and the appreciation of existing properties (developed and undeveloped) in the district. Revitalization tends to increase the property values of existing residential, commercial and industrial land and improvements, and result in higher property tax revenues for the city. TIF then allocates a portion of these property tax revenue increases to affordable housing.
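The mechanics behind these two points can be sketched with a toy calculation. Everything here is a hypothetical illustration: the function name, the assessed values, and the 1.5 percent tax rate are invented round numbers; only the 40 percent set-aside share comes from Portland's policy.

```python
# Hypothetical sketch of how a TIF housing set-aside accrues in one year.
# Illustrative numbers only; actual assessments, rates, and rules differ.

def tif_set_aside(base_value, current_value, tax_rate, set_aside_share=0.40):
    """Return (increment_revenue, housing_set_aside) for one tax year."""
    # The "increment" is assessed value above the base frozen when the
    # district was formed -- it captures both new construction and
    # appreciation of existing properties.
    increment = max(current_value - base_value, 0)
    increment_revenue = increment * tax_rate  # taxes diverted to the district
    return increment_revenue, increment_revenue * set_aside_share

# Example: a district frozen at a $500M base that has grown to $900M,
# taxed at a hypothetical 1.5 percent rate.
revenue, housing = tif_set_aside(500e6, 900e6, 0.015)
print(revenue, housing)  # 6,000,000 diverted; 2,400,000 for affordable housing
```

Note that the developer's own tax bill never changes; only the destination of the revenue does, which is why the mechanism is invisible to the development cost calculus.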

Portland’s program was subject to media criticism for falling short of an announced goal that a certain percentage of all the housing in the River District would be affordable–it produced about 2,200 units, short of a goal of about 2,400 units. Public officials should be cautious in setting expectations for TIF financed housing.

One important limitation of the TIF program in Oregon (which is typical of most states), is that the revenues raised in the district must be expended in the district. Portland is precluded, for example, from building housing in one district with funds from another district.  In the case of the River District, this requires the city to build housing in a high cost area.  This has positive and negative effects: It assures that new affordable housing gets built in high cost and high income areas, promoting socioeconomic integration; on the other hand, it likely increases the cost per unit of affordable housing, and reduces the total number of affordable units that could be built with public funds.

Tax increment financing has rarely been used as part of an inclusionary housing program. A recent survey by the Lincoln Institute of Land Policy of 250 inclusionary housing programs nationwide reports that only 20 programs (8 percent of those surveyed) provided a direct public subsidy (including tax increment financing). (Thaden, E., & Wang, R. (2017). Inclusionary Housing in the United States: Prevalence, Impact, and Practices. Working Paper WP17ET1. Washington, D.C.: Lincoln Institute of Land Policy.)

History of Portland’s tax increment financing for affordable housing program

In 2006, the Portland City Council  adopted the original Set Aside Policy, requiring that “30% of Tax Increment Financing (TIF) over the life of an Urban Renewal District shall be dedicated to the development, preservation and rehabilitation of housing affordable to households with incomes below 80% median family income.”

The Council modified that policy in 2011, changing the applicability of the set-aside from each individual urban renewal district to all urban renewal districts, collectively, city-wide.

In 2015, the City Council voted to further revise the program by adopting set-aside amounts for each of the city’s urban renewal areas, with amounts set aside for affordable housing being tied to the opportunity for housing development in the area. Several renewal areas are industrial districts with less land suitable for housing development.

Appendix

Tax Increment Funds Dedicated to Affordable Housing in Portland, 2006 to 2016

 

Urban Renewal Area  8-Year Set-Aside
Central Eastside URA  $5,236,964
Downtown Waterfront URA  $17,457,873
Gateway URA  $9,462,159
Interstate URA  $29,780,785
Lents URA  $23,742,202
North Macadam URA  $28,238,027
Oregon Convention Center URA  $10,903,583
River District URA  $83,090,101
South Park Blocks URA  $35,550,080
Total, All URAs  $243,461,774

Fiscal Year  Annual Set-Aside
FY06-07 Actuals  $19,698,857
FY07-08 Actuals  $20,412,794
FY08-09 Actuals  $26,908,476
FY09-10 Actuals  $41,891,776
FY10-11 Actuals  $27,221,625
FY11-12 Actuals  $43,096,200
FY12-13 Actuals  $28,004,694
FY13-14 Actuals  $7,384,641
FY14-15 Actuals  $18,292,019
FY15-16 Actuals  $10,550,691
Total, All Fiscal Years  $243,461,774

Source: Portland Development Commission/Prosper Portland.

 

 

Electric vehicle subsidies: Inefficient & Inequitable

Subsidizing electric vehicle purchases is an expensive way to reduce carbon emissions, and mostly subsidizes rich households who would have bought electric vehicles anyhow

There’s a new study from the National Bureau of Economic Research that looks at the effectiveness and distribution of the electric vehicle (EV) purchase credits. The study, by economists from Peking University, Resources for the Future and Cornell concludes:

  • Most people who got the subsidies would have bought an electric vehicle even without the subsidy.
  • Electric vehicles mainly substitute for highly efficient internal combustion engine or hybrid vehicles.
  • The net cost of reducing a ton of carbon with electric vehicle subsidies is $552 — at the high end of such strategies.
  • Most of the benefits of EV subsidies go to high income households.

A largely wasted subsidy

A key finding of the study is that most of the buyers of electric vehicles who got the subsidies would have bought them even without the subsidy.  Some inefficiency is to be expected with any subsidy, but the authors estimate that 70 percent of all buyers would have bought their EV even without a subsidy.
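A quick back-of-the-envelope calculation shows why that 70 percent figure matters so much. The sketch below assumes the standard $7,500 federal credit; dividing by the marginal share is simple arithmetic, not the study's formal method.

```python
# Back-of-the-envelope: if most subsidized buyers are "inframarginal"
# (they'd have bought an EV anyway), the cost per *additional* EV balloons.
# The 70% inframarginal share is the study's estimate; the $7,500 credit is
# the standard federal amount (pre-phase-out).

subsidy = 7_500          # dollars of credit per subsidized EV
inframarginal = 0.70     # share who would have bought without the credit

# Credits paid to all buyers, divided by the share actually induced to buy:
cost_per_marginal_ev = subsidy / (1 - inframarginal)
print(cost_per_marginal_ev)  # 25000.0 -- each induced EV costs ~$25k in credits
```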

Tesla P100D, 0 to 60 in 2.5 seconds via “ludicrous mode.” Also ludicrous: EV subsidies.

Chiefly benefiting high income households

New car buyers have, on average, higher incomes than the average household, and given the price premium for most electric vehicles, incomes of EV buyers are higher still.  The authors report that the average household income of buyers of electric vehicles was $140,000.  As we pointed out in our article “Ten things more inequitable than road pricing,” subsidizing new car purchases tends automatically to be a pretty regressive policy.

In addition, most of the “wasted” subsidy goes to higher income households. The subsidy makes the biggest difference in buying decisions for price-sensitive moderate income households considering the more modest EVs (like a Nissan Leaf or Chevrolet Bolt). As the authors point out:

Since higher-income households are less sensitive to prices and have a stronger preference for newest technologies, they are more likely to adopt EVs without the subsidy.

Substituting for relatively cleaner cars, not dirty ones

In theory, subsidies to EVs make sense because they substitute cleaner cars for dirty ones. Many climate adaptation strategies now assume, for example, that vehicle electrification will spur rapid reductions in CO2 emissions. But most models of the effects of electrification rest on the naive assumption that each electric vehicle substitutes one-for-one for the “average” internal combustion engine (ICE) vehicle, so that the reduction in carbon and other emissions can simply be calculated as the difference between carbon pollution from the average car and carbon pollution from the electric vehicle (which is determined by the source of its electricity).  But the market data examined in this study show that those who bought electric cars were not typical buyers: had they not bought an EV, the recipients of tax credits would likely have purchased a highly efficient ICE vehicle or hybrid, with an estimated fuel economy about 4.2 miles per gallon better than the fleet average. Electric cars turn out not to attract folks who were planning to buy a gas-guzzling SUV or pickup; instead, they seem mostly to cannibalize prospective Prius purchasers.

That has a critical implication for transportation models that assume that electric car adoption will quickly reduce carbon emissions.  If EVs substitute for highly efficient gas powered vehicles rather than the average vehicle (or better yet, the least efficient), the gains in carbon reduction will be much smaller and come more slowly than the “naive” model suggests.
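The difference between the naive and observed counterfactuals can be made concrete with a rough calculation. In this sketch only the 4.2 mpg differential comes from the study; the fleet-average fuel economy, annual mileage, per-gallon emissions, and grid intensity are assumed round numbers for illustration.

```python
# Illustrative comparison of "naive" vs. observed counterfactuals for the
# carbon savings of one EV. Only the 4.2 mpg differential is from the study;
# all other figures are assumptions.

KG_CO2_PER_GALLON = 8.9   # assumed: combustion of one gallon of gasoline
MILES_PER_YEAR = 12_000   # assumed annual mileage
FLEET_AVG_MPG = 25.0      # assumed fleet-average fuel economy
EV_KG_PER_MILE = 0.12     # assumed: ~0.3 kWh/mile at ~0.4 kg CO2/kWh of grid power

def annual_ice_co2(mpg):
    """Annual kg of CO2 for a gasoline car at the given fuel economy."""
    return MILES_PER_YEAR / mpg * KG_CO2_PER_GALLON

ev_co2 = MILES_PER_YEAR * EV_KG_PER_MILE

# Naive model: the EV replaces the average car.
naive_saving = annual_ice_co2(FLEET_AVG_MPG) - ev_co2
# Observed: the EV replaces a car 4.2 mpg better than average.
actual_saving = annual_ice_co2(FLEET_AVG_MPG + 4.2) - ev_co2

print(naive_saving, actual_saving)  # observed counterfactual shrinks the savings
```

Under these assumed numbers the observed substitution pattern trims the per-vehicle savings by several hundred kilograms of CO2 a year, which is why models built on the naive assumption overstate how fast electrification cuts emissions.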

In his famous essay Pollution, Property and Prices, the Canadian economist J.H. Dales pointed out the folly and expense of trying to subsidize our way to lower pollution levels. Handing out subsidies–as opposed to a straightforward allocation of costs–inevitably turns out to be highly inefficient, prompting overconsumption of the things we subsidize. It’s why a carbon tax–with the proceeds spent on some combination of equal per person rebates (the carbon dividend) and widely shared services (like transit)–is a superior alternative.

Jianwei Xing, Benjamin Leard, Shanjun Li, “What does an electric vehicle replace?,” NBER Working Paper 25771 , April 2019.

Hat tip to Aaron Gordon of Jalopnik, who wrote “EV credits go mainly to rich people who would buy EVs anyway.”

 

Another housing myth debunked: Neighborhood price effects of new apartments

New research shows new apartments drive down rents in their immediate neighborhood, disproving the myth of “induced demand” for housing

If you’re a housing supply skeptic, there’s one pet theory that you’ve been able to hang your hat on, in the face of a barrage of evidence that increasing the supply of housing helps hold down, or even drive down rents. It’s the theory of “induced demand”–that building nice new apartments (or houses) in a neighborhood, so changes a neighborhood’s attractiveness to potential buyers that it drives up prices. It’s a plausible sounding argument, but in our view a wrong one, meaning it’s time for another episode of City Observatory’s own “Myth Busters.”

Like Adam and Jamie, but for housing policy.

Regular readers of City Observatory will immediately recognize the term “induced demand” because we talk about it frequently in the context of transportation.  When a highway department widens a road (invariably, in a vain attempt to reduce traffic congestion), it tends to quickly attract new traffic and becomes just as congested as before (a phenomenon so common and well-documented that it is now termed “the fundamental law of road congestion”).

The induced demand theory, applied to housing, is the claim that building new housing signals a big change in the neighborhood’s amenities and livability, and that the new supply triggers an even bigger increase in demand, such that any beneficial effects of added supply that would occur in the textbook model are more than offset.

A slightly more nuanced version is the claim that while new building may help with supply regionally, it may trigger a change in a neighborhood’s relative perception as a desirable place to live; so while the supply effect may help lower rents regionally, it drives up rents locally.  Rick Jacobus makes this argument:

In other words, the demand for housing in any neighborhood is highly variable and can switch from very low to very high quickly. But the supply is almost entirely fixed. In established neighborhoods, no matter how much building is going on, the new supply will be small relative to the overall market so increased supply will have almost no impact on rents. It might theoretically drive rent down some tiny amount but, in practice, the impact of new development in a neighborhood is usually the opposite because it increases demand (for that neighborhood) by more than it increases supply. Partly this is true because any new development is visible, new and exciting. Developers push this process along with marketing campaigns that invariably promote not just one building but the surrounding neighborhood (even if they have to coin a new name for the neighborhood). The result is that—on the neighborhood level—adding supply may not lower rents. It may raise them.

While it’s a semantically appealing analogy, it makes little if any economic sense.  A key difference between roads (nearly all roads, anyhow) and houses, is that we charge positive prices for housing in a way we don’t for freeway capacity. The reason expanding capacity induces demand in the case of roadways is that we charge road users a zero price. Thus capacity (and willingness to tolerate delay) are the only things regulating demand, and when capacity is expanded, demand responds quickly.  As we’ve shown time and again, as in the case of the Louisville I-65 Ohio River Crossing, when you actually price new capacity at even a fraction of its cost, demand evaporates.

Crying “induced demand” seems to be an increasingly popular gambit from housing supply skeptics.  The Southern California Association of Governments deployed it as part of its “Regional Housing Needs Assessment,” a state mandated document to calculate how many housing units the region needed to add. SCAG argued that adding more housing to improve affordability would be futile because it would induce additional demand, as on freeways.  UCLA Professor Paavo Monkkonen challenged that analogy:

The package compares housing supply and affordability to induced demand on freeways (page 23), which they properly note is unlikely to alleviate congestion in the long run. This comparison is not apt, because freeway access is free and housing is not. Congestion occurs when the absence of prices causes a shortage. A housing crisis occurs when a shortage of housing causes high prices. This crucial difference means that new supply is almost useless in the former and incredibly important in the latter.

Still, in the abstract, it’s possible to imagine that the construction of a new apartment building is some sort of watershed event that triggers a mass re-appraisal of the attitudes of potential renters.  There is evidence of at least some externalities and positive feedbacks in development.  The empirical question is whether these effects are large enough to swamp the downward effect on rents from expanding the supply of housing in the neighborhood.

Until now, that’s been a factual void–one which lets supply side skeptics assert the induced demand hypothesis for housing.  But it is a void no more:  A forthcoming paper–“Does Luxury Housing Construction Increase Rents in Gentrifying Areas?” from Brian Asquith, Evan Mast, and Davin Reed–explores this question in detail.  Asquith and Mast are economists with the Upjohn Institute; Reed is an economist with the Philadelphia Federal Reserve Bank. Their paper, available in preliminary draft form here, uses very geographically detailed data on apartment rents and new apartment construction in gentrifying neighborhoods to see whether a new building drives up prices nearby (the induced demand theory) or whether it depresses them (the supply side theory).  Earlier, we reviewed a paper from Mast on the chain reaction of migration triggered by the construction of new buildings; it showed that constructing new market-rate buildings triggered a series of moves that produced significant numbers of vacancies in lower income neighborhoods.

Asquith, Mast, and Reed gathered data for 100 new apartment buildings in each of eleven cities around the country. They identified geographically isolated buildings, and then gathered data on rental listings for apartments in the area surrounding the new building. They analyzed the data to see whether rents went up or down closer to the new building in the year after it was constructed.  If the “induced demand” theory were correct, one would expect the rental prices of existing apartments close to the new building to rise more or faster than other apartments located further away.  This chart summarizes rents by distance (in meters) from the newly constructed building.  The dashed line is the “before” showing the level of rents prior to construction, while the solid line is the “after” showing rents after construction.  (And to be clear:  the rents shown are for apartments in the area excluding the apartments in the newly built building.)
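The research design is essentially a near/far, before/after comparison. The toy calculation below, with invented rent figures, shows the form of the test: a negative relative change near the new building supports the supply-side view, while a positive one would support induced demand.

```python
# Stylized near/far, before/after comparison of rents for EXISTING apartments
# around a newly built one. All rent levels here are invented for illustration.

rents = {
    # (distance_band, period): mean listed rent for existing units
    ("near", "before"): 1500, ("near", "after"): 1530,
    ("far",  "before"): 1400, ("far",  "after"): 1470,
}

def pct_change(band):
    """Percent rent growth for a distance band across the construction date."""
    before, after = rents[(band, "before")], rents[(band, "after")]
    return 100.0 * (after - before) / before

# Difference-in-differences: rent growth near the building minus growth far away.
relative_effect = pct_change("near") - pct_change("far")
print(f"near: {pct_change('near'):.1f}%, far: {pct_change('far'):.1f}%, "
      f"relative effect: {relative_effect:+.1f} points")
```

With these invented numbers, rents rise everywhere but rise less near the new building, which is the pattern the paper reports.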

This chart shows that rental prices for apartments close to the new building fell relative to the prices for apartments located further away. The dashed “before” line has a negative slope, suggesting that prices declined the further you got from the site of the new building.  The solid “after” line has a positive slope (prices increase the further you get from the new building).  Overall, prices are higher (the solid line is above the dashed line), but prices actually went down next to the new building, and increased far less than in the area further away from the new building.

These data are a strong challenge to the induced demand theory.  If a new building made an area more attractive, one would expect the largest effect in the area very near the building. But, consistent with the traditional “more supply reduces rents” view, the addition of more units in an area seems to have depressed rents (or at least rent increases) compared to buildings in the surrounding area.

While this is still a preliminary paper, and has yet to be published, it does offer the best evidence yet presented on the theory of induced demand. We’ll reserve a final judgment until after review and publication, but based on the data presented here, this myth is busted.

Addendum: A hat tip to Trulia economist Issi Romem for flagging this study.

 

Who bikes?

Workers in low income households rely more on bikes for commuting, but the data show people of all income levels cycle to work

There’s a lot of hand-wringing and harrumphing about the demographics of cycling. Some worry that bike lanes cater to higher income, spandex clad commuters, and are yet another signal of gentrification.

Workers in every income category bike, but bike commuting skews toward lower income households.

In response, experts hasten to point out that workers from low income households are the ones who are more reliant on cycling to get to work.  (That’s correct, by the way–as we’ll see in a second.)  But in the process of trying to make a point, they’ve exaggerated the case.  Here, for example, is a recent tweet repeating a claim made in a CityLab article earlier this year:

That’s a pretty strong claim.  Anne Lusk’s article says data for that claim comes from a 2015 CityLab article, written by then-staffer Eric Jaffe.  Jaffe published some Census data on bike commuting by income group, and offered two charts.  The first chart shows the fraction of persons biking and walking to work by household income (lower income households are more likely to walk or bike to work than higher income households).  A second chart shows the number of persons commuting by bike, with data aggregated by income groups in multiples of $50,000 (i.e. 0 to $50,000, $50,000 to $100,000, etc.).  While Jaffe’s original article shows that lower income households are more likely to cycle to work than their higher income counterparts, it actually doesn’t support Lusk’s Twitter claim that the largest number of bike commuters comes from households with incomes under $10,000.

Let’s take a look at the most recent census data, from the five-year American Community Survey for 2013-2017.  We’ve used the data from the IPUMS website, because that lets us tabulate data by our own custom income ranges.  We’ve narrowed our look to persons aged 25 to 64 who reported commuting to work. For this study, we look at household income bins of $10,000 each, ranging from zero to $250,000.  Our first chart looks at the share of commuters traveling by bike in each income group.  (As is common in such data, we’ve excluded persons who work at home from these tabulations.)
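A tabulation along those lines can be sketched as follows. The column names and the three sample rows are hypothetical stand-ins, not real data; an actual IPUMS extract carries coded variables such as AGE, HHINCOME and TRANWORK that would need recoding first.

```python
import pandas as pd

# Sketch of the tabulation described above. The column names (age, hh_income,
# commute_mode) and the three sample rows are hypothetical; a real IPUMS ACS
# extract uses coded variables like AGE, HHINCOME and TRANWORK.
df = pd.DataFrame({
    "age": [30, 45, 58],
    "hh_income": [18_000, 95_000, 240_000],
    "commute_mode": ["bike", "car", "bus"],
})

# Restrict to workers aged 25-64 (work-at-home cases would also be dropped here).
workers = df[df["age"].between(25, 64)]

# $10,000 bins from 0 to $250,000, with everything above lumped together.
edges = list(range(0, 260_000, 10_000)) + [float("inf")]
workers = workers.assign(income_bin=pd.cut(workers["hh_income"], bins=edges, right=False))

# Share of commuters in each income bin who bike to work.
bike_share = (workers.groupby("income_bin", observed=True)["commute_mode"]
                     .apply(lambda m: (m == "bike").mean()))
print(bike_share)
```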

Data in this table are arranged from low incomes (at the bottom) to highest incomes (at the top).  Data show each range of $10,000 in income; and we’ve truncated incomes for those with more than $260,000.  The data show that those with the lowest levels of income are the most likely to rely on bikes for commuting. Almost 1.6 percent of commuters with household incomes of less than $10,000 commute by bike, nearly three times the national average of about 0.5 percent.  Those with incomes of $10,000 to $20,000 are more than twice as likely as other American workers to commute by bicycle.  In general, the share of workers who cycle to work declines with increasing income, up to about $100,000 in household income, and then increases modestly as income rises.  (That increase in biking is definitely modest: those with incomes of $150,000 or more are universally less likely to commute by bike than those with incomes of $20,000 to $30,000.)

Our second chart shows the number of bike commuters in each income bin. As with the US income distribution, the number of persons in higher income categories gets smaller and smaller as household income rises above $50,000. (The high number of persons in the above-$260,000 income category reflects the fact that we’ve consolidated all the bins above that dollar amount.)  When we divide bike commuters into $10,000 household income bins, the modal number (most frequent category) is $20,000 to $30,000.

So the claim in the CityLab article, that “the single biggest group of Americans who bike to work live in households that earn less than $10,000,” simply isn’t true. About 30,000 of the more than 800,000 regular bike commuters in the US live in households with incomes of less than $10,000.  Every income group up to $100,000 or more has more cycle commuters than the under-$10,000 group.  Based on the statistics presented in Eric Jaffe’s 2015 CityLab article, a claim about “the most households” holds for households with incomes under $50,000, but not for the category under $10,000.  More importantly, how you define income groups is pretty arbitrary:  should the increments be $10,000, $50,000, or some other width?  Which of these arbitrarily defined groups is “largest” tells us more about where someone has chosen to draw the lines than it does about who is cycling.
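The bin-width point is easy to demonstrate with synthetic data. The income distribution below is an assumption (a right-skewed lognormal with roughly a $72,000 median), not the actual ACS data; the point is only that the “largest group” moves when you change the width of the bins.

```python
import numpy as np

# Synthetic illustration (NOT actual ACS data): which income group of bike
# commuters is "largest" depends entirely on the chosen bin width.
rng = np.random.default_rng(0)
# Assume a right-skewed income distribution with roughly a $72,000 median --
# an assumption for illustration only.
incomes = rng.lognormal(mean=np.log(72_000), sigma=0.9, size=100_000)

def modal_bin(incomes, width):
    """Return the (low, high) bounds of the most populated income bin."""
    bins = np.arange(0, incomes.max() + width, width)
    counts, edges = np.histogram(incomes, bins=bins)
    i = counts.argmax()
    return edges[i], edges[i + 1]

for width in (10_000, 50_000):
    lo, hi = modal_bin(incomes, width)
    print(f"${width:,} bins -> modal group: ${lo:,.0f}-${hi:,.0f}")
```

With $10,000 bins the modal group sits in the $30,000s; with $50,000 bins the under-$50,000 group dominates simply because it pools five of the narrower bins.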

A better way to look at the distribution is to consider the median income of bike commuters, relative to other commuters.  The median bike commuter had a household income of about $72,000 according to this data series, meaning about half of all bike commuters have household incomes less than that amount and half have more.  To put that in context, the median car commuter had a household income of about $82,000.  So, on average, bike commuters live in lower income households than car commuters.  For reference, the median bus commuter lived in a household with an income of about $62,000.

On average, bike-riding skews more toward lower income households, but it turns out that workers in every household income category are bike commuters.  Lusk’s article–and others–make the excellent point that we need to build cycling infrastructure in a way that is inclusive for a range of income and social groups. Accurate data can help make that case.

Will upzoning ease housing affordability problems?

More housing supply denialism–debunked

It appears that we have been a bit premature in calling the housing supply debate over. Last week’s urbanist Internet was all aflutter with the latest claim of an academic study purporting to show that allowing more density in cities wouldn’t do anything to ameliorate the housing affordability problem. The latest installment is a new paper from Andrés Rodríguez-Pose and Michael Storper claiming that upzoning will worsen, rather than ameliorate, inequality and gentrification. The paper got wide attention thanks to Richard Florida’s CityLab article entitled, “Building More Housing is No Match for Rising Inequality.” Florida summarizes the study as follows:

“A new analysis finds that liberalizing zoning rules and building more won’t solve the urban affordability crisis, and could exacerbate it.”

There was little surprise that the Rodríguez-Pose/Storper paper, and Florida’s endorsement, would be seized upon by NIMBY groups.  In San Francisco, 48Hills, a generally anti-development neighborhood publication, wasted no time citing the study as yet another piece of evidence that “challenges the notion that allowing the private market to build more housing will bring down prices.”

Making higher density housing easier reduces rent inflation and displacement.

Florida quotes Rodríguez-Pose as saying: “Upzoning is far from the progressive policy tool it has been sold to be. It mainly leads to building high-end housing in desirable locations.”

The evidence for that in the paper seems to come from citations of two recent sources: an article in the Washington Post examining Zillow data on changes in home prices by various price tiers (page 30), and a study of the effects of upzoning on land prices in Chicago by Yonah Freemark (page 31).

As City Observatory readers will recall, we took a close look at the Washington Post article when it came out, and challenged all of its principal claims. We found that the Post’s analysis, which purported to address the effects of new apartment construction, relied on a Zillow rental price series that didn’t include multi-family buildings. We also found that increases in supply were actually highly correlated with declines in rental price inflation.  While building more housing in the highest price tiers produces declines in inflation in the highest tier first, these declines are rapidly echoed throughout the market.

Alex Baca and Hannah Lebovitz have also pointed out the limits of Yonah Freemark’s study.  Almost everywhere, but particularly in Chicago, there are many discretionary and contingent elements in the land use planning process that mean that changes in zoning are a necessary, but by themselves not sufficient condition for realizing greater density:

 No one who is intimately engaged with the complexities of affordable housing in America would suggest that zoning is the sole knob to twiddle to increase affordability—and Freemark doesn’t, either. Zoning is targeted because its origins are inherently racist, bigoted, and exclusionary. But, again, it is not the sole input to making housing more affordable. It’s just the one that, by changing it, allows for many other things that make housing more affordable. . . . But, for now, these findings are inconclusive and in many ways detached from the day-to-day reality of how local-level zoning and planning work. We hope they are not used to validate a continuation of exclusionary practices, or misguided power moves by elected officials in American cities and their suburbs.

Rodríguez-Pose and Storper also cite Rick Jacobus (page 32) as their source that housing markets are deeply segmented–i.e. that adding supply in one part of the market has little or no effect on prices in other parts of the market.  But Jacobus doesn’t offer data for that assertion, and as the recent Upjohn Institute study by Evan Mast shows, the succession of household moves set off by the construction of new higher income housing quickly reverberates through all tiers of a city’s housing market, with the effect that 60 percent of the increase in vacancies is felt in lower income neighborhoods.

As Bloomberg’s Noah Smith has suggested, the easiest way to poke a hole in the argument that more housing, even at the high end of the market, won’t help address housing affordability is to consider the counter-factual of somehow demolishing 10,000 or 50,000 high income housing units.  Does anyone suppose that if there were that much less high end housing in, say, San Francisco, that housing prices would be lower, or the plight of the poor would be lessened? Exclusionary suburbs, like Marin County, have tightly constricted the housing supply:  would Rodríguez-Pose and Storper consider them policy models of how to build a more equitable community?

From a policy standpoint, the more useful question is:  Will the poor in a city be better or worse off if it is easier to build new multi-family housing, especially, as California Senator Scott Wiener’s SB 50 would allow, in areas well served by transit?  In an important sense, Rodríguez-Pose and Storper simply decline to engage seriously in policy discussions.  The closest they come is to imply that YIMBY advocates are calling for more widespread building on environmentally sensitive greenspaces.  This ignores that there are many ways to accommodate greater density in built-up areas that are likely to advantage those with modest means, say by allowing the construction of apartments near transit.

The paper is remarkably silent as to how our land use planning regime contributes to exclusion and segregation, or how this might constructively be altered.  Aside from a gratuitous swipe at “the supposed social justice aspects of reducing housing regulation, assuming it would help the less skilled and reverse a long association of zoning with racial exclusion,” there’s no acknowledgement that the balkanized suburban regimes of US metro areas have deployed zoning as a tool to exclude the poor and provide opportunity and access to amenities only to those wealthy enough to afford large single family homes. Among the words you will not find in their article are “exclusionary zoning,” “single family housing,” “apartments,” or “parking.” It’s well-documented that our current system of deeply decentralized land-use decision-making, coupled with the strong incentives of “homevoters” to minimize tax burdens and shift costs and congestion to other jurisdictions, produces the expensive, sprawling and segregated land use patterns in US cities.

Rodríguez-Pose and Storper sidestep these nuts and bolts issues of how to fix zoning so that it isn’t exclusionary, in favor of knocking down a straw man claim that upzoning is somehow a cure for inequality (an argument that no one seems to be making).  In the process, they (and by extension, Florida) lend credence to NIMBY denialism about the central need to build more housing in our nation’s cities if we’re to do anything to meaningfully address affordability.

Let’s have an honest discussion about the Rose Quarter freeway widening project

Good decisions result only if state officials are transparent and honest

City Observatory has been closely following the proposal to spend $500 million widening the I-5 freeway at the Rose Quarter in Portland. In the process, we and others have repeatedly uncovered instances of state agency officials misrepresenting facts, suppressing key data, denying the existence of plans, concealing important assumptions and misleading the public about safety.

City Observatory director Joe Cortright testified to the Oregon Transportation Commission on April 18, 2019, about these issues.  A video of his testimony is shown below.

 

While there’s much to be debated, pro and con, about the merits of freeway widening, there’s a more fundamental point that must be resolved before we can have that necessary discussion. If our democracy, if our system of government, is going to work, it depends critically on the honesty, transparency and good faith of those who work for the government, in this case the Oregon Department of Transportation.  Objectively, the conduct of the Oregon Department of Transportation has failed to conform to the most minimal expectations of professional conduct.

This agency produced an environmental assessment with no data on average daily traffic (ADT), the most fundamental and widely used measure of traffic volumes; essentially the equivalent of presenting a financial report with no dollar figures. The agency concealed that its traffic projections assumed the region would build the $3 billion Columbia River Crossing (in 2015). The agency denied it was widening the freeway, but engineered a 126-foot wide right of way, sufficient for an eight-lane freeway. The agency denied it had any engineering plans for the project, and was subsequently forced to release 33 gigabytes of such plans.  The agency falsely claimed that this freeway was the number one crash location in Oregon, when other ODOT roadways in Portland have higher crash rates and fatalities.

These are not random or isolated acts; they’re part of a pattern and practice of concealing, obscuring and distorting essential facts. If Oregon is to make a reasoned decision on a half-billion dollar investment, it needs a more honest, transparent state Department of Transportation. While the citizen commissioners of the Oregon Transportation Commission aren’t expected to be experts second-guessing arcane engineering details, they can and should insist on basic standards of openness and truthfulness from their staff.

Cortright’s written testimony submitted to the commission is available here.


Our updated list from A to Z of everything that causes gentrification

Gentrification:  Here’s your all-purpose list, from artists to zoning, of who and what’s to blame

We first published this list in 2019, but the search for scapegoats has expanded, and now includes little libraries and microbreweries.

When bad things happen, we look around for someone to blame.  And when it comes to gentrification, which is loosely defined as somebody not like you moving into your neighborhood, there’s no shortage of things to blame.  We’ve compiled a long–but far from exhaustive–list of the things that people have blamed for causing gentrification. (This task has been made easier by the seemingly inexhaustible editorial/journalistic appetite for stories pitched as exploring the gentrification of “X,” although an essay at Jacobin branding graphic novels as the “gentrification of comic books” seems to represent the moment that this meme jumped the shark.)

It may be cathartic to point the finger of blame at someone or something else, but as this list shows, the blame game sheds precious little light on what’s really causing gentrification, and none at all on what we might do to minimize its negative effects.  Any discernible symptom of change in a neighborhood is likely to change the way it is perceived by residents and others.

Cities, and their constituent neighborhoods are in effect living social structures–they’re always in a state of change. Sometimes the change is imperceptibly slow, and other times, when new buildings are built, it can be rapid and noticeable. But no neighborhood remains the same.  Even places with no new construction see a constant inflow and outflow of residents, driven mostly by the natural course of people’s lives. It’s an illusion to suggest that any neighborhood will remain unchanged, and especially so for low income neighborhoods. As we’ve shown at City Observatory, the three-quarters of urban high poverty neighborhoods that recorded no decline in poverty rates from 1970 to 2010 didn’t remain unchanged, they lost 40 percent of their population over those four decades–and concentrated poverty and all its ills increased and spread.

The challenge with that portion of urban change that people call gentrification is not to stop it, but to figure out ways to make sure it produces benefits, if not for everyone, then for a wide range of current and future neighborhood residents. To do that, we have to do more than complain about the symptoms of change; we have to look deeper to understand its causes, and fashion policies that will minimize its negative effects.

The real problem: A shortage of cities

The real underlying cause of gentrification, affordability challenges, and displacement is our shortage of cities.  Now that we’ve rediscovered the long-established virtues of urban living, we don’t have enough great urban neighborhoods, or enough housing in the few great urban neighborhoods that we have, to accommodate all those who would like to live there.  This shortage, coupled with growing demand, is running head on into land use planning systems that make it impossible to build more of the kind of neighborhoods more and more people value.  The reason housing prices are rising in great (and improving) urban neighborhoods is that we have so few of them, there’s so much demand for them, and we’ve made it too hard to build more of them, and more housing in the ones we have.  If we’re looking to reduce displacement from neighborhoods that are becoming nicer, along a whole series of dimensions, the answer isn’t to block change, it’s to build more housing and more such neighborhoods, so that they’re not in short supply and everyone has a chance to live in one.

As you read through our alphabetic list of the things people blame for causing gentrification, spend a minute to think about what’s really behind urban change, and what we might do to build more inclusive, more equitable cities.

A – B – C

Artists. In this Bushwick TED talk, Brooklyn artist Ethan Petit argues that art causes gentrification, based on his personal experience. Petit says that art and gentrification are two heads of the same hydra, a conclusion long litigated in academia.

Banks. Finance and speculation figure prominently in many accounts of what causes gentrification. Forbes’ recent article, “What do hipsters and banks have in common? Gentrification,” is an example of how the narrative that greedy developers buy up low-cost housing and raise the rents, with money provided by shadowy banks, has made it even to mainstream financial publications.

Climate Change. Rising sea levels are plainly a threat to low-lying places like South Florida. Already, several studies are claiming that Miami is experiencing “climate gentrification,” as real estate developers buy property in higher elevation locations in the city, like Liberty City, which have traditionally been regarded as less desirable, precisely because of their distance from water.

D – E – F

Declining Crime Rates. One of the biggest and most persistent changes in US cities in the past twenty years has been a steady reduction in crime rates. When crime rates decline in a neighborhood, property values tend to rise, and research shows a correlation between declining crime and an increase in the number of higher income and better educated residents.

Environmental improvement. Poor neighborhoods often have worse pollution and less green space than other places in cities, but when these environmental problems are addressed, property values rise. This has led some scholars to argue that we shouldn’t make living conditions in these poor neighborhoods too nice, for fear of increasing rents. Instead, we’re told, we should settle for improvements that make the neighborhoods “just green enough.”

Florida, Richard. Florida’s 2002 book Rise of the Creative Class led cities around the country to pursue strategies of improving urban amenities to attract creative workers. To many that was a recipe for gentrification, a charge that Florida wrestled with in his most recent book, The New Urban Crisis. And last year, a Washington DC lawyer sued the city government for following Florida’s ideas, which he claimed led to gentrification and displacement. The suit alleges the city “catered to what urban theorist Richard Florida famously identified as the “creative class” and ignored the needs of poor and working-class families,” which “lead to widespread gentrification and displacement.”

G – H – I

Galleries. In Los Angeles’ Boyle Heights neighborhood, newly opened art galleries have been ground zero for a sustained battle between neighborhood activists and gallery owners, replete with graffiti, assaults and performance art-like demonstrations. (See also: artists.) Publicly vowing to “stop at nothing to fight gentrification and capitalism in its boring art-washing manifestations,” the group has staged protests, called for boycotts and used social media in savvy and withering ways — for example, describing one gallery owner as bearing the “stench of entitlement and white privilege.”

The Highline.  In 2009, a decrepit elevated industrial railroad on the city’s West Side was saved from demolition and turned into a landscaped, mid-air linear park. The city also upzoned the area, and development took off. Metropolis magazine termed it the “Highline effect” and fretted that “our new parks are trojan horses for gentrification.”

Independent shops. A 1992 dissertation looked at change in Southeast Portland’s Hawthorne Boulevard (a street that then and now has almost no national chain businesses) and concluded an influx of independent stores and boutiques had triggered “commercial gentrification.”

Internet.  A close look at Chattanooga’s plan to roll out a municipal broadband system uncovers an insidious plan to gentrify city neighborhoods, leading Fast Company to ask: “Municipal broadband: Urban savior or gentrification’s wrecking ball.”

J – K – L

Java. The opening of a new coffee shop is often taken as a harbinger of gentrification. Whether it’s Starbucks or an independent local shop, espresso is often equated with upscaling from coast to coast, from Los Angeles, to Washington, D.C.—and even in Berkeley, Calif., where the San Francisco Chronicle implied that a “fourth-wave” shop opening across the street from Chez Panisse (in a former Philz coffee space) might gentrify a neighborhood where the average household income is over $98,000 per year, and the average home is worth $1,127,100. (Sometimes, though, coffee shops bring it upon themselves: Denver’s Ink coffee shop, rightfully,was a subject of protests for a sign saying “Happily gentrifying the neighborhood since 2014.”)

 

Kids. Gentrifiers are usually stereotyped as single, young hipsters. But demographically, these people are in (or approaching) their prime child-bearing years, and many are staying in cities and raising their children.  The New York Times wrote of a burgeoning number of “strollervilles” popping up in neighborhoods and apartment buildings that previously had few children. “

The El: Improved transit is often blamed for driving up rents and property values and bringing in gentrifiers. As developers race to erect fancy apartment buildings and condominiums that cater mainly to young professionals, longtime residents of neighborhoods adjacent to established or newly planned transit hubs. it is claimed, are increasingly finding themselves priced out of their own communities.

Little libraries.  You wouldn’t think to look at these tiny-free-standing book houses as harbingers of displacement, yet according to some, the undermine the traditional public library, by enabling people to get (and share) free reading material without patronizing the public building. The assumption here is that there’s a fixed demand for reading, and that little libraries rob big ones of needed customers; this ignores the high likelihood that little libraries will help kindle a lifelong interest in reading, and democratize the spread of knowledge. There are over 90,000 little free libraries, so it looks like we’re pretty much doomed.

M – N – O – P

Moms riding cargo bikes. In June, the Atlantic published an article entitled: “‘Cargo-Bike Moms’ are gentrifying the Netherlands: In Rotterdam, the bakfiets utility bike has become a symbol—and a tool—of urban displacement.”

Image by Amsterdamized/CCBY2.0

Microbreweries. Yes, it seems,  even drinking locally brewed beer–instead of the Fordist, mass-produced swill of some global mega-corporation–contributes to gentrification. University of Toledo geographer Neil Reid, summarizes the academic research on this one, and concludes that microbreweries mostly tend to follow, rather than precede neighborhood change. Still he finds that in Cleveland and Denver, microbreweries showed up first, before property values escalated, so even though they represent local entrepreneurs creating sociable, community accessible “third places,” they’re going to get fingered for gentrification.

Neo-Liberalism:  No two words are more conjoined in leftist academic urbanism than “neoliberal” and “gentrification.” There are too many fish in this barrel to justify choosing just one, but, for a flavor consider these titles: “Engagement, Gentrification, and the Neoliberal Hijacking of History,” “Gentrification in the Neoliberal World Order,” “Fighting gentrification in the neoliberal university,” and “Race, Space and Neoliberal Urbanism: Gentrification and Neighbourhood Change in Nashville.”  

Opportunity Zones. The Jobs and Tax Cuts Act Congress passed in 2017 contains a new provision sheltering taxes on capital gains made in designated distressed areas.  These opportunity zones are supposed to lead to additional investment that will help poor neighborhoods, but there are widespread concerns that the tax break will just fuel gentrification, by subsidizing the construction of market rate housing in distressed neighborhoods. Houston’s Kinder Institute says the opportunity zone program threatens to be “gentrification on steroids.”

Parks:  Poor neighborhoods often suffer from a lack of local parks, but efforts to improve local parks often raise concerns about possibly raising property values. In Los Angeles, “a proposal to improve bike safety and pedestrian access to parks along the Los Angeles River was recently denounced as ‘a gentrification scam.'”

Q – R – S

LGBTQ:  By moving in to low and moderate income communities, LGBTQ populations are sometimes labeled as the advance guard of gentrification, and paradoxically, some gayborhoods have themselves been gentrified by others.  As Peter Moskowitz wrote at Vice, “When It Comes to Gentrification, LGBTQ People are Both Victim and Perpetrator; The role queer people—and especially white queers—play in the history of urban inequality is thorny, to say the least.”

Restaurants. In Chicago, according to local real estate website DNAInfo, protesters are treating a new restaurant as life-threatening: “Anti-Gentrifiers Say New Pilsen Restaurant Puts ‘Our Lives … In Danger.'”

Smart Phones. Governing magazine reports on a study that easy access to Yelp and other place-based reviews leads swarms of hipsters to quickly colonize and gentrify new spaces. They write: “That smartphone in your pocket just might be speeding up the gentrification of urban neighborhoods.”

Soccer. The Guardian looks at Orlando’s Major League Soccer franchise, which has built a new stadium in a city’s neighborhood, it worries, “the specter of gentrification only grows.” Soccer is turning its back on greenfield, suburban stadiums, because so much of the fan base consists of urban-dwelling young professionals.

T – U – V

Tech firms. In her now-classic 2014 essay, “How burrowing owls lead to vomiting anarchists,” Kim-Mai Cutler described how the demand for housing stimulated by the growth of the Bay Area tech industry ran head on to the highly regulated California housing market.

Urban Renewal: Many urban renewal efforts that consciously targeted “blighted” lower income neighborhoods, did a pretty good job bringing in higher income households, but often fell short in replacing the low and middle income housing that they demolished.

Vouchers. A couple of academic studies have come to the conclusion that school choice, including policies like No Child Left Behind’s option to leave failing schools, boosts housing prices and triggers gentrification. Planetizen reports that one review found, “The ability to opt out of the neighborhood school increased the likelihood that a mostly black or Hispanic neighborhood would see an influx of wealthier residents.”

W – X – Y – Z

Whole Foods. The opening of a Whole Foods Market at 125th and Lenox led one resident to call it “the final nail in black Harlem’s coffin,” noting the Whole Foods effect, “which is shown to drive up property values by as much as 40 percent.”

GenX:  While millennials draw much of the attention for current gentrification, the “back to the city movement was propelled by the previous generation. An essay published by Slate argues ” . .  among all age groups, the biggest shift toward high-density urban living has been among 35-to-39-year-olds—the younger slice of Gen X.”

You: As City Observatory’s Daniel Kay Hertz has written, “there’s basically no way not to be a gentrifier.” Your demand for living space in a city, regardless of what neighborhood you personally choose to live in, tends to create more pressure on housing markets, including in lower-income neighborhoods, especially if your city has a growing population but has not build more places for those people to live.

Zoning: Arguably, we’ve saved the best and most important cause for last. What prompts affluent people of means to choose to move into what have been low income neighborhoods. A huge and wildly underestimated cause is the fact that we’ve generally prohibited building more dense, affordable housing in the most desirable neighborhoods. Restrictive zoning in high income neighborhoods displaces this demand elsewhere, contributing to gentrification.

While there’s an entire alphabet of factors to blame, we urge our readers to focus on “Y” and “Z.”  It is, individually and collectively our demand for urban spaces that’s the key factor fueling gentrification wherever it occurs. We simply need more great urban neighborhoods, and more housing in the great urban neighborhoods we’ve already built. And the chief obstacle to getting more such neighborhoods is that we’ve essentially made it illegal to build dense, new mixed use urban neighborhoods, and zoning (and a host of related restrictions) make it impossible or prohibitively costly to build more housing in these desirable places. When we realize that the challenges that manifest themselves as “gentrification” are problems of our making, and that the solutions are within our control, maybe we can move past a bitter and unproductive blame-game.

 

 

Everything that causes gentrification, from A to Z

Gentrification:  Here’s your all-purpose list, from artists to zoning, of who and what’s to blame

When bad things happen, we look around for someone to blame.  And when it comes to gentrification, which is loosely defined as somebody not like you moving into your neighborhood, there’s no shortage of things to blame.  We’ve compiled a long–but far from exhaustive–list of the things that people have blamed for causing gentrification. (This task has been made easier by the now seemingly inexhaustible editorial/journalistic appetite for stories pitched as exploring the gentrification of “X”.)

It may be cathartic to point the finger of blame at someone or something else, but as this list shows, the blame game sheds precious little light on what’s really causing gentrification, and none at all on what we might do to minimize its negative effects.  Any discernible symptom of change in a neighborhood is likely to change the way it is perceived by residents and others.

Cities, and their constituent neighborhoods, are in effect living social structures–they’re always in a state of change. Sometimes the change is imperceptibly slow; other times, when new buildings go up, it can be rapid and noticeable. But no neighborhood remains the same.  Even places with no new construction see a constant inflow and outflow of residents, driven mostly by the natural course of people’s lives. It’s an illusion to suggest that any neighborhood will remain unchanged, and especially so for low income neighborhoods. As we’ve shown at City Observatory, the three-quarters of urban high poverty neighborhoods that recorded no decline in poverty rates from 1970 to 2010 didn’t remain unchanged; they lost 40 percent of their population over those four decades–and concentrated poverty and all its ills increased and spread.

The challenge with that portion of urban change that people call gentrification is not to stop it, but to figure out ways to make sure it produces benefits, if not for everyone, then at least for a wide range of current and future neighborhood residents. To do that, we have to do more than complain about the symptoms of change; we have to look deeper to understand its causes, and fashion policies that will minimize its negative effects.

The real problem: A shortage of cities

The real underlying cause of gentrification, affordability challenges, and displacement is our shortage of cities.  Now that we’ve rediscovered the long-established virtues of urban living, we don’t have enough great urban neighborhoods, or enough housing in the few great urban neighborhoods that we have, to accommodate all those who would like to live there.  This shortage, coupled with growing demand, is running head on into land use planning systems that make it impossible to build more of the kind of neighborhoods more and more people value.  The reason housing prices are rising in great (and improving) urban neighborhoods is that we have so few of them, there’s so much demand for them, and we’ve made it too hard to build more of them, and to build more housing in the ones we have.  If we’re looking to reduce displacement from neighborhoods that are becoming nicer along a whole series of dimensions, the answer isn’t to block change, it’s to build more housing and more such neighborhoods, so that they’re not in short supply and everyone has a chance to live in one.

As you read through our alphabetic list of the things people blame for causing gentrification, spend a minute to think about what’s really behind urban change, and what we might do to build more inclusive, more equitable cities.

A – B – C

Artists. In this Bushwick TED talk, Brooklyn artist Ethan Petit argues that art causes gentrification, based on his personal experience. Petit says that art and gentrification are two heads of the same hydra, a conclusion long litigated in academia.

Banks. Finance and speculation figure prominently in many accounts of what causes gentrification. Forbes’ recent article, “What do hipsters and banks have in common: Gentrification,” is an example of how the narrative that greedy developers buy up low-cost housing and raise the rents, with money provided by shadowy banks, has made it even into mainstream financial publications.

Climate Change. Rising sea levels are plainly a threat to low-lying places like South Florida. Already, several studies are claiming that Miami is experiencing “climate gentrification,” as real estate developers buy property in higher elevation locations in the city, like Liberty City, which have traditionally been regarded as less desirable, precisely because of their distance from water.

D – E – F

Declining Crime Rates. One of the biggest and most persistent changes in US cities in the past twenty years has been a steady reduction in crime rates. When crime rates decline in a neighborhood, property values tend to rise, and research shows a correlation between declining crime and an increase in the number of higher income and better educated residents.

Environmental improvement. Poor neighborhoods often have worse pollution and less green space than other places in cities, but when these environmental problems are addressed, property values rise. This has led some scholars to argue that we shouldn’t make living conditions in these poor neighborhoods too nice, for fear of increasing rents. Instead, we’re told, we should settle for improvements that make the neighborhoods “just green enough.”

Florida, Richard. Florida’s 2002 book Rise of the Creative Class led cities around the country to pursue strategies of improving urban amenities to attract creative workers. To many that was a recipe for gentrification, a charge that Florida wrestled with in his most recent book, The New Urban Crisis. And last year, a Washington DC lawyer sued the city government for following Florida’s ideas, which he claimed led to gentrification and displacement. The suit alleges the city “catered to what urban theorist Richard Florida famously identified as the ‘creative class’ and ignored the needs of poor and working-class families,” which “led to widespread gentrification and displacement.”

G – H – I

Galleries. In Los Angeles’ Boyle Heights neighborhood, newly opened art galleries have been ground zero for a sustained battle between neighborhood activists and gallery owners, replete with graffiti, assaults and performance-art-like demonstrations. (See also: artists.) Publicly vowing to “stop at nothing to fight gentrification and capitalism in its boring art-washing manifestations,” the group has staged protests, called for boycotts and used social media in savvy and withering ways — for example, describing one gallery owner as bearing the “stench of entitlement and white privilege.”

The Highline.  In 2009, a decrepit elevated industrial railroad on Manhattan’s West Side was saved from demolition and turned into a landscaped, mid-air linear park. The city also upzoned the area, and development took off. Metropolis magazine termed it the “Highline effect” and fretted that “our new parks are trojan horses for gentrification.”

Independent shops. A 1992 dissertation looked at change in Southeast Portland’s Hawthorne Boulevard (a street that then and now has almost no national chain businesses) and concluded an influx of independent stores and boutiques had triggered “commercial gentrification.”

Internet.  A close look at Chattanooga’s plan to roll out a municipal broadband system uncovers an insidious plan to gentrify city neighborhoods, leading Fast Company to ask: “Municipal broadband: Urban savior or gentrification’s wrecking ball?”

J – K – L

Java. The opening of a new coffee shop is often taken as a harbinger of gentrification. Whether it’s Starbucks or an independent local shop, espresso is often equated with upscaling from coast to coast, from Los Angeles, to Washington, D.C.—and even in Berkeley, Calif., where the San Francisco Chronicle implied that a “fourth-wave” shop opening across the street from Chez Panisse (in a former Philz coffee space) might gentrify a neighborhood where the average household income is over $98,000 per year, and the average home is worth $1,127,100. (Sometimes, though, coffee shops bring it upon themselves: Denver’s Ink coffee shop, rightfully, was the subject of protests over a sign saying “Happily gentrifying the neighborhood since 2014.”)


Kids. Gentrifiers are usually stereotyped as single, young hipsters. But demographically, these people are in (or approaching) their prime child-bearing years, and many are staying in cities and raising their children.  The New York Times wrote of a burgeoning number of “strollervilles” popping up in neighborhoods and apartment buildings that previously had few children.

The El: Improved transit is often blamed for driving up rents and property values and bringing in gentrifiers. As developers race to erect fancy apartment buildings and condominiums that cater mainly to young professionals, longtime residents of neighborhoods adjacent to established or newly planned transit hubs, it is claimed, are increasingly finding themselves priced out of their own communities.

M – N – O – P

Moms riding cargo bikes. In June, the Atlantic published an article entitled: “‘Cargo-Bike Moms’ are gentrifying the Netherlands: In Rotterdam, the bakfiets utility bike has become a symbol—and a tool—of urban displacement.”

Image by Amsterdamized/CCBY2.0

Microbreweries. Yes, it seems, even drinking locally brewed beer–instead of the Fordist, mass-produced swill of some global mega-corporation–contributes to gentrification. University of Toledo geographer Neil Reid summarizes the academic research on this one, and concludes that microbreweries mostly tend to follow, rather than precede, neighborhood change. Still, he finds that in Cleveland and Denver, microbreweries showed up first, before property values escalated, so even though they represent local entrepreneurs creating sociable, community-accessible “third places,” they’re going to get fingered for gentrification.

Neo-Liberalism:  No two words are more conjoined in leftist academic urbanism than “neoliberal” and “gentrification.” There are too many fish in this barrel to justify choosing just one, but, for a flavor, consider these titles: “Engagement, Gentrification, and the Neoliberal Hijacking of History,” “Gentrification in the Neoliberal World Order,” “Fighting gentrification in the neoliberal university,” and “Race, Space and Neoliberal Urbanism: Gentrification and Neighbourhood Change in Nashville.”

Opportunity Zones. The Tax Cuts and Jobs Act Congress passed in 2017 contains a new provision sheltering taxes on capital gains made in designated distressed areas.  These opportunity zones are supposed to lead to additional investment that will help poor neighborhoods, but there are widespread concerns that the tax break will just fuel gentrification, by subsidizing the construction of market rate housing in distressed neighborhoods. Houston’s Kinder Institute says the opportunity zone program threatens to be “gentrification on steroids.”

Parks:  Poor neighborhoods often suffer from a lack of local parks, but efforts to improve local parks often raise concerns about rising property values. In Los Angeles, “a proposal to improve bike safety and pedestrian access to parks along the Los Angeles River was recently denounced as ‘a gentrification scam.'”

Q – R – S

LGBTQ:  By moving into low and moderate income communities, LGBTQ populations are sometimes labeled as the advance guard of gentrification, and paradoxically, some gayborhoods have themselves been gentrified by others.  As Peter Moskowitz wrote at Vice, “When It Comes to Gentrification, LGBTQ People are Both Victim and Perpetrator; The role queer people—and especially white queers—play in the history of urban inequality is thorny, to say the least.”

Restaurants. In Chicago, according to local real estate website DNAInfo, protesters are treating a new restaurant as life-threatening: “Anti-Gentrifiers Say New Pilsen Restaurant Puts ‘Our Lives … In Danger.'”

Smart Phones. Governing magazine reports on a study suggesting that easy access to Yelp and other place-based reviews leads swarms of hipsters to quickly colonize and gentrify new spaces. They write: “That smartphone in your pocket just might be speeding up the gentrification of urban neighborhoods.”

Soccer. The Guardian looks at Orlando’s Major League Soccer franchise, which built a new stadium in a city neighborhood; as it does, the paper worries, “the specter of gentrification only grows.” Soccer is turning its back on greenfield, suburban stadiums, because so much of the fan base consists of urban-dwelling young professionals.

T – U – V

Tech firms. In her now-classic 2014 essay, “How burrowing owls lead to vomiting anarchists,” Kim-Mai Cutler described how the demand for housing stimulated by the growth of the Bay Area tech industry ran head on into the highly regulated California housing market.

Urban Renewal: Many urban renewal efforts that consciously targeted “blighted” lower income neighborhoods did a pretty good job of bringing in higher income households, but often fell short in replacing the low and middle income housing they demolished.

Vouchers. A couple of academic studies have concluded that school choice, including policies like No Child Left Behind’s option to leave failing schools, boosts housing prices and triggers gentrification. Planetizen reports that one review found, “The ability to opt out of the neighborhood school increased the likelihood that a mostly black or Hispanic neighborhood would see an influx of wealthier residents.”

W – X – Y – Z

Whole Foods. The opening of a Whole Foods Market at 125th and Lenox led one resident to call it “the final nail in black Harlem’s coffin,” noting the Whole Foods effect, “which is shown to drive up property values by as much as 40 percent.”

GenX:  While millennials draw much of the attention for current gentrification, the “back to the city” movement was propelled by the previous generation. An essay published by Slate argues that “. . . among all age groups, the biggest shift toward high-density urban living has been among 35-to-39-year-olds—the younger slice of Gen X.”

You: As City Observatory’s Daniel Kay Hertz has written, “there’s basically no way not to be a gentrifier.” Your demand for living space in a city, regardless of what neighborhood you personally choose to live in, tends to create more pressure on housing markets, including in lower-income neighborhoods, especially if your city has a growing population but has not built more places for those people to live.

Zoning: Arguably, we’ve saved the best and most important cause for last. What prompts affluent people to choose to move into what have been low income neighborhoods? A huge and wildly underestimated cause is the fact that we’ve generally prohibited building more dense, affordable housing in the most desirable neighborhoods. Restrictive zoning in high income neighborhoods displaces this demand elsewhere, contributing to gentrification.

While there’s an entire alphabet of factors to blame, we urge our readers to focus on “Y” and “Z.”  It is, individually and collectively, our demand for urban spaces that’s the key factor fueling gentrification wherever it occurs. We simply need more great urban neighborhoods, and more housing in the great urban neighborhoods we’ve already built. And the chief obstacle to getting more such neighborhoods is that we’ve essentially made it illegal to build dense, new mixed-use urban neighborhoods, and zoning (and a host of related restrictions) makes it impossible or prohibitively costly to build more housing in these desirable places. When we realize that the challenges that manifest themselves as “gentrification” are problems of our own making, and that the solutions are within our control, maybe we can move past a bitter and unproductive blame game.


The case against the I-5 Rose Quarter Freeway widening

Portland is weighing whether to spend as much as $1.45 billion widening a mile-long stretch of the I-5 freeway at the Rose Quarter near downtown. We’ve dug deeply into this idea at City Observatory, and we’ve published more than 50 commentaries addressing various aspects of the project over the past four years.  Here’s a synopsis:

Traffic congestion

Traffic is declining at the Rose Quarter: ODOT growth projections are fiction. December 22, 2022. ODOT’s own traffic data shows that average daily traffic (ADT) has been declining for 25 years, at 0.55 percent per year. ODOT’s modeling inexplicably predicts that traffic will suddenly reverse course and grow by 0.68 percent per year through 2045. ODOT’s modeling also falsely claims that traffic will be the same regardless of whether the I-5 freeway is expanded, contrary to the established science of induced travel. These ADT statistics aren’t contained in the project’s traffic reports, but can be calculated from data contained in its safety analysis. ODOT has violated its own standards for documenting traffic projections, and violated national standards for maintaining the integrity of traffic projections.
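
The compounding gap between these two growth rates is easy to quantify. Here’s a minimal back-of-the-envelope sketch using the growth rates cited above; the baseline traffic volume is a hypothetical placeholder, not an ODOT figure:

```python
# Compare where traffic ends up in 2045 if the observed 25-year decline
# (-0.55%/yr) continues, versus ODOT's projected growth (+0.68%/yr).
# BASELINE_ADT is an illustrative placeholder, not an actual ODOT count.

BASELINE_ADT = 100_000   # hypothetical 2022 average daily traffic
YEARS = 2045 - 2022      # projection horizon

observed_trend = BASELINE_ADT * (1 - 0.0055) ** YEARS
odot_forecast = BASELINE_ADT * (1 + 0.0068) ** YEARS

gap_pct = (odot_forecast / observed_trend - 1) * 100
print(f"2045 ADT if observed trend continues: {observed_trend:,.0f}")
print(f"2045 ADT under ODOT's forecast:       {odot_forecast:,.0f}")
print(f"ODOT forecast exceeds the trend by {gap_pct:.0f}%")
```

Over a 23-year horizon, flipping a 0.55 percent annual decline into 0.68 percent annual growth yields a 2045 volume roughly a third higher than a simple continuation of the observed trend, which is why the sign of the assumed growth rate matters so much.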

The black box: Hiding the facts about freeway widening. November 28, 2022. State DOT officials have crafted a Supplemental Environmental Assessment that conceals more than it reveals. The Rose Quarter traffic report contains no data on “average daily traffic,” the most common measure of vehicle travel. Three and a half years on, ODOT’s Rose Quarter traffic modeling is still a closely guarded secret. The new SEA makes no changes to the regional traffic modeling done for the 2019 Environmental Assessment, which relies on model runs performed back in 2015. The report misleadingly cites “volume to capacity ratios” without revealing either volumes or capacities.

Wider freeways don’t reduce congestion.  March 4, 2019. The best argument that highway planners can muster for the Rose Quarter freeway widening is that it might somehow relieve congestion by reducing the number of crashes, but when they widened a stretch of I-5 just north of the Rose Quarter a decade ago, crashes not only didn’t decrease, crash rates actually went up.

Rose Quarter freeway widening won’t reduce congestion, March 2, 2019. Wider urban freeways have never reduced congestion, due to “induced demand,” a problem so predictable that experts call it “the fundamental law of road congestion.” Even the experts from ODOT and the Portland Bureau of Transportation concede that the freeway widening will do nothing to reduce daily “recurring” traffic congestion.

Backfire: How widening freeways can make traffic congestion worse, February 26, 2019.  It’s an article of faith among highway builders and boosters that adding more capacity will make freeways flow more smoothly. But in reality, widening a road or intersection at one point simply funnels more vehicles into the next bottleneck more quickly–which can lead a road to become congested even faster. That’s what’s happened on I-5 northbound in Portland, where the I-5 bridge over the Columbia River carries fewer vehicles in the peak hour now, because improvements to the freeway and intersections have overwhelmed the bridge bottleneck.

Congestion pricing is a better solution for the Rose Quarter, March 26, 2019. Congestion pricing on I-5 would dramatically reduce congestion, improve freight and transit travel times, and do so at far lower cost than freeway widening, according to . . . the Oregon Department of Transportation. Pricing has been approved by the state Legislature, but ODOT has violated NEPA by failing to include any mention of it in the Rose Quarter Environmental Assessment.

How tax evasion fuels traffic congestion in Portland, March 15, 2019. A big part of traffic congestion on I-5 and I-205 as they cross the Columbia River is due to Washington residents shopping in Oregon to evade Washington’s high retail sales tax (Oregon has none). Vancouver residents evade $120 million in sales tax per year by shopping in Oregon, but account for between 10 and 20 percent of all traffic across the river. 
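
As a rough plausibility check, the evaded-tax figure implies a large volume of cross-border retail spending. In this sketch, the combined Washington state-plus-local sales tax rate (roughly 8.4 percent in the Vancouver area) is an assumption of mine, not a number from the commentary:

```python
# Rough check: how much annual retail spending does $120 million in
# evaded sales tax imply? The tax rate below is an assumed combined
# WA state + local rate for the Vancouver area, not a cited figure.

EVADED_TAX = 120_000_000   # dollars per year, from the commentary
SALES_TAX_RATE = 0.084     # assumed combined state + local rate

implied_spending = EVADED_TAX / SALES_TAX_RATE
print(f"Implied Oregon retail spending by Vancouver residents: "
      f"${implied_spending / 1e9:.2f} billion per year")
```

On that assumed rate, $120 million in avoided tax corresponds to well over a billion dollars a year of Oregon retail spending, which helps explain why cross-river shopping trips register as a meaningful share of bridge traffic.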

Reducing congestion: Katy didn’t, December 27, 2016.  Add as many lanes as you like in an urban setting and you’ll only increase the level of traffic and congestion.  That’s the lesson of Houston’s Katy Freeway, successively widened to a total of 23 lanes at a cost of billions; after the latest widening, travel times on the freeway are now slower than before it was expanded.

ODOT’s I-5 Rose Quarter “Improvement”: A million more miles of local traffic. December 7, 2022. ODOT’s proposed relocation of the I-5 Southbound off-ramp at the Rose Quarter will add 1.3 million miles of vehicle travel to local streets each year. Moving the I-5 off-ramp a thousand feet further south creates longer journeys for the 12,000 cars exiting the freeway at this ramp each day. The new ramp location requires extensive out-of-direction travel for all vehicles connecting to local streets. With more miles driven on local streets, and more turning movements at local intersections, hazards for all road users, but especially people biking and walking, increase substantially.
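
The two figures in that paragraph can be cross-checked with simple arithmetic; both inputs (1.3 million added vehicle miles per year and 12,000 exiting cars per day) come from the text above:

```python
# Sanity-check the ramp-relocation arithmetic: 1.3 million extra vehicle
# miles per year, spread across 12,000 exiting cars per day, implies how
# much added distance per trip? Both inputs come from the commentary.

EXTRA_VMT_PER_YEAR = 1_300_000   # added vehicle miles of travel per year
CARS_PER_DAY = 12_000            # vehicles using the relocated off-ramp

trips_per_year = CARS_PER_DAY * 365
extra_miles_per_trip = EXTRA_VMT_PER_YEAR / trips_per_year
print(f"Extra distance per exiting trip: {extra_miles_per_trip:.2f} miles "
      f"(~{extra_miles_per_trip * 5280:.0f} feet)")
```

That works out to roughly three-tenths of a mile (about 1,600 feet) of added driving per exiting vehicle, consistent with a ramp moved a thousand feet south plus out-of-direction travel on local streets.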

Flat Earth Sophistry. December 30, 2022.  The science of induced travel is well proven, but state DOTs are in utter denial. Widening freeways not only fails to reduce congestion, it inevitably results in more vehicle travel and more pollution. The Oregon Department of Transportation has published a technical manual banning the consideration of induced travel in Oregon highway projects.

Safety

The Rose Quarter’s Big U-Turn: Deadman’s Curve? November 15, 2022.
The redesign of the I-5 Rose Quarter project creates a hazardous new hairpin off-ramp from Interstate 5. Is ODOT’s supposed “safety” project really creating a new “Deadman’s Curve” at the Moda Center? Bike riders on Portland’s busy North Williams bikeway will have to negotiate two back-to-back freeway ramps that carry more than 20,000 cars per day.

ODOT:  Our Rose Quarter “safety” project will increase crashes.  November 19, 2022.  A newly revealed ODOT report shows the redesign of the I-5 Rose Quarter project will:

  • create a dangerous hairpin turn on the I-5 Southbound off-ramp
  • increase crashes by 13 percent
  • violate the agency’s own highway design standards
  • result in trucks turning into adjacent lanes and forcing cars onto highway shoulders
  • necessitate a 1,000 foot long “storage area” to handle cars exiting the freeway
  • require even wider, more expensive freeway covers that will be less buildable

ODOT’s safety lie is back, bigger than ever. October 18, 2022. Oregon DOT is using phony claims about safety to sell a $1.45 billion freeway widening project. People are regularly being killed on ODOT roadways and the agency claims that it lacks the resources to fix these problems. Meanwhile, it proposes to spend billions of dollars widening freeways where virtually no one is killed or injured and labels this a “safety” project. A wider I-5 freeway will do nothing to improve road safety in Portland.

Oregon DOT admits it lied about I-5 safety. March 17, 2020. Oregon’s Department of Transportation concedes it was lying about crashes on I-5 at the Rose Quarter. For years, ODOT falsely claimed that I-5 at the Rose Quarter was the “highest crash location in Oregon.” After we repeatedly pointed out this lie, ODOT finally retracted it from their website.

Safety: Using the big lie to sell wider freeways, March 19, 2019. ODOT claims that the I-5 Rose Quarter is the state’s “#1 crash location.” But that’s not true. Other Portland area ODOT roads, including Barbur Boulevard, Powell Boulevard and 82nd Avenue, have crash rates that are as much as 3 times higher; worse, these streets cause fatalities, which the freeway doesn’t. Crying “safety” is a calculated “Big Lie” marketing gimmick that would spend half a billion dollars on a roadway that contributes nothing to the state’s growing traffic death toll.
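Crash-rate comparisons like this one are normalized by traffic exposure, not raw crash counts. A minimal sketch of the standard crashes-per-100-million-VMT metric, with entirely hypothetical numbers (this is not ODOT data or methodology):

```python
# Illustrative sketch of the standard crash-rate normalization:
# crashes per 100 million vehicle-miles traveled ("per 100M VMT").
# All input numbers below are hypothetical, not actual ODOT figures.

def crash_rate_per_100m_vmt(annual_crashes, avg_daily_traffic, segment_miles):
    """Normalize a crash count by the traffic exposure on a road segment."""
    annual_vmt = avg_daily_traffic * segment_miles * 365
    return annual_crashes * 100_000_000 / annual_vmt

# Hypothetical: 200 crashes/year on a 1.5-mile segment carrying 120,000 vehicles/day
rate = crash_rate_per_100m_vmt(200, 120_000, 1.5)
print(f"{rate:.1f} crashes per 100M VMT")
```

Because the denominator is exposure, a lower-volume arterial with fewer total crashes can still have a rate several times higher than a busy freeway, which is the comparison being drawn above.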

Safety last: What we’ve learned from “improving” the I-5 freeway, March 21, 2019. ODOT has also “improved” freeway interchanges south of Portland. It improved the Woodburn interchange in 2015, hoping to reduce crashes–but they increased instead. In February 2019, the interchange had two serious crashes, producing extensive delays.

Carbon and pollution


Whitewashing the freeway widening. June 4, 2020. A so-called “peer review” panel was kept in the dark about critiques of the highway department’s flawed projections. This is a thinly veiled attempt to whitewash flawed analysis. These are the products of a hand-picked, spoon-fed group, asked by ODOT to address only a narrow and largely subsidiary set of questions and told to ignore fundamental issues.

Widening I-5 at the Rose Quarter will increase greenhouse gases. January 26, 2021. Adding more freeway capacity at the Rose Quarter will add thousands of tons to the region’s greenhouse gas emissions. Wider freeways—including additional ramps and “auxiliary lanes”—induce additional car travel, which increases greenhouse gas emissions.

  • The I-5 Rose Quarter project will add approximately 33,000 vehicles per day to I-5 traffic, according to ODOT’s own estimates
  • These 33,000 vehicles will directly add 56,000 daily vehicle miles of travel and indirectly add 178,000 daily vehicle miles of travel.
  • Additional vehicle travel will directly produce about 8,000 tons of greenhouse gas emissions per year; counting induced travel outside the project area, the total increase is about 35,000 tons per year.
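The arithmetic behind figures like these can be checked with a simple conversion from daily VMT to annual tons of CO2. The sketch below is not ODOT’s model; it simply assumes a typical light-duty emission rate of roughly 400 grams of CO2 per mile, close to the EPA average for passenger vehicles:

```python
# Back-of-envelope check of the emissions figures above (a sketch, not ODOT's model).
# Assumption: ~400 g CO2 per vehicle-mile, roughly the EPA light-duty average.
GRAMS_CO2_PER_MILE = 400
DAYS_PER_YEAR = 365

def annual_co2_tons(daily_vmt, grams_per_mile=GRAMS_CO2_PER_MILE):
    """Convert daily vehicle-miles traveled to metric tons of CO2 per year."""
    return daily_vmt * DAYS_PER_YEAR * grams_per_mile / 1_000_000  # grams -> metric tons

direct = annual_co2_tons(56_000)            # direct VMT added by the project
total = annual_co2_tons(56_000 + 178_000)   # direct plus induced VMT

print(round(direct), round(total))  # roughly 8,000 and 34,000 tons per year
```

With these assumptions the 56,000 direct daily miles work out to about 8,200 tons per year, and the 234,000 total daily miles to about 34,200 tons, consistent with the ranges cited above.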

Here’s what’s wrong with Oregon DOT’s Rose Quarter pollution claims. December 10, 2021. Ten reasons not to believe phony DOT claims that widening highways reduces pollution.

Climate concerns crush Oregon highway funding bill, March 6, 2015. In 2015, a pending highway finance bill was killed when the Oregon Department of Transportation admitted it had provided estimates of carbon reductions that were wildly exaggerated and could not be verified. With a track record of producing carbon emission estimates that falsely flatter its preferred projects, should anyone trust the estimates contained in the Rose Quarter Environmental Assessment?

Widening the I-5 Freeway will add millions of miles of vehicle travel, March 4, 2019.  The University of California Davis has a calculator for estimating the effects of added freeway capacity on travel; it suggests that Rose Quarter freeway widening will produce 10 to 17 million additional miles of travel per year in Portland, as well as 5 to 8 thousand additional tons of carbon emissions per year.

Urban Myth Busting: Congestion, Idling and Carbon Emissions, July 6, 2017. The Rose Quarter project makes unsubstantiated claims that it will reduce carbon emissions, by reducing the number of cars idling in traffic; but the published scientific literature on the subject shows that gains from reduced idling due to capacity increases are more than offset by the increase in emissions due to induced travel demand.

Bike and pedestrian infrastructure and freeway covers

More proof of ODOT’s Rose Quarter Freeway coverup. June 16, 2021. Newly revealed documents show the project’s roadway is vastly wider than needed for traffic, and that the extra width makes “buildable” freeway covers prohibitively expensive. If you really want just two additional lanes, you can build them much more cheaply and with less environmental destruction. The reality is that ODOT is planning a 10 lane freeway at the Rose Quarter, and is lying about the covers and the project’s real cost and environmental impact.

ODOT reneges on Rose Quarter cover promises. November 14, 2022.
The Rose Quarter I-5 Revised Environmental Assessment shows that ODOT is already reneging on its sales pitch of using a highway widening to heal Portland’s Albina Neighborhood. It trumpeted “highway covers” as a development opportunity, falsely portraying them as being covered in buildings and housing—something the agency has no plans or funds to provide. The covers may be only partially buildable, suitable only for “lightweight” buildings, and face huge constraints. ODOT will declare the project “complete” as soon as it does some “temporary” landscaping. The covers will likely sit vacant for years, unless somebody—not ODOT—pays to build on them. ODOT isn’t contributing a dime to build housing to replace what it destroyed, and its proposed covers are unlikely to ever become housing because they’re too expensive and unattractive to develop.

Distorted Images: Freeway widening is bad for pedestrians, March 14, 2019. ODOT has produced a handful of computer-generated renderings to show how its massive freeway widening project would affect surface streets in Northeast Portland. They’re carefully composed to exaggerate some features and conceal others. If you look closely, you can see how the plan is to round off corners at key intersections–speeding car traffic and increasing danger to pedestrians. In addition, ODOT illustrations show dozens of pedestrians and just a handful of cars on this busy city street: proportions that are off by a factor of 200 in showing the real world relationship of cars to people in this space.

The great freeway cover-up, December 13, 2017. ODOT’s freeway widening plans call for two over-sized freeway overpasses to be built (primarily to deal with construction staging in a dense urban environment). While it claims that the overpasses can be developed as public space, they’re too fragmented, noisy and hostile (thanks to thousands of fast moving cars on every side) to be useable public space.

The death of Flint Street, May 12, 2017. The proposed Rose Quarter freeway widening would demolish the existing Flint Avenue overpass, a low speed neighborhood street that runs parallel to a busy North-South couplet, and provides an important bike route with a gentle grade, and limited auto-traffic. *

Diverging Diamond Blues, December 19, 2017. A key element of the local street plan for the proposed Rose Quarter freeway widening project is turning a portion of North Williams Avenue into a miniature “diverging diamond” interchange–with traffic traveling on the wrong (left-hand) side of a two-way street. This disorienting design is inherently pedestrian hostile. *

Equity and neighborhood effects

How a freeway destroyed a neighborhood, and may again, March 18, 2019. In 1962, construction of I-5 devastated Portland’s historically African-American Albina neighborhood.  Population declined by nearly two-thirds in the three decades after the freeway was built, as the area shifted from a residential area with local-serving businesses, to an auto-dominated landscape. The neighborhood has only started to rebound in recent years, and more auto traffic will likely undermine the area’s attractiveness.

The toxic flood of cars, not just the freeway, crushed Albina. September 16, 2020. Restorative Justice & A Viable Neighborhood. What destroyed the Albina community? What will it take to restore it? It wasn’t just the freeway; it was the onslaught of cars that transformed Albina into a bleak and barren car-dominated landscape.

Getting real about restorative justice in Albina. April 26, 2021. Drawings don’t constitute restorative justice. ODOT shows fancy drawings about what might be built, but isn’t talking about actually paying to build anything. Just building the housing shown in its diagrams would require $160 million to $260 million. Even that would replace only a fraction of the housing destroyed by ODOT highway building in Albina.

Taking Tubman: ODOT’s plan to build a freeway on school grounds. April 13, 2021. ODOT’s proposed I-5 Rose Quarter project would turn a school yard into a freeway. The widened I-5 freeway will make already unhealthy air even worse. Pollution from high volume roads has been shown to lower student achievement. ODOT also proposes to build sound walls in Tubman’s school yard.

How ODOT destroyed Albina: The untold story. March 22, 2021.  I-5 wasn’t the first highway that carved up Portland’s historically black Albina Neighborhood. Seventy years ago, ODOT spent the equivalent of more than $80 million in today’s dollars to cut the Albina neighborhood off from the Willamette River. ODOT’s highways destroyed housing and isolated Albina, leading to a two-thirds reduction in population between 1950 and 1970. Demolishing neighborhoods for state highways is ODOT’s raison d’etre.

How ODOT destroyed Albina: The I-5 Meat Axe. March 30, 2021. The Interstate 5 “Meat Axe” slashed through the Albina Neighborhood in 1962. This was the second of three acts by ODOT that destroyed housing and isolated Albina. Building the I-5 freeway led to the demolition of housing well outside the freeway right of way, and flooded the neighborhood with car traffic, ending its residential character and turning it into an auto-oriented landscape of parking lots, gas stations and car dealerships.

How ODOT destroyed Albina, part 3: The Fremont Bridge ramps. April 7, 2021. ODOT’s Fremont Bridge wiped out multiple blocks of the Albina neighborhood. A freeway you’ve never heard of leveled dozens of blocks in North and Northeast Portland. The stub of a proposed “Prescott Freeway” still scars the neighborhood.

Why do poor school kids have to clean up rich commuters’ pollution?, March 6, 2019. Portland’s Tubman Middle School, built more than a decade before the I-5 freeway sliced through the neighborhood, would get an even larger dose of air pollution when the widened freeway is moved closer to classrooms. The school’s students–disproportionately low income and children of color–have had to see public school monies–more than $12 million–spent to clean up the school’s air; commuters on I-5, disproportionately white and higher income, paid nothing toward these costs.

Freeway-widening grifters: Woke-washing, fraud and incompetence. September 20, 2021.  The Oregon Department of Transportation’s glossy mailer to sell its $1.25 billion I-5 Rose Quarter freeway widening project is a cynical, error-ridden marketing ploy. ODOT doesn’t show or tell about its wider freeway and more traffic, but instead tries to sell the project based on buildings it won’t contribute any money to build. ODOT sent an expensive mailer, studded with nearly two dozen typographical errors, to thousands of Portland households.

Housing reparations for Northeast Portland, April 16, 2018. When ODOT built the I-5 freeway through the heart of Portland’s African-American neighborhood in the 1960s, it demolished–and never replaced–more than 300 homes. It outlandishly claims that a wider freeway will somehow redress that damage, but it could make a much better start by spending about $140 million to rebuild the housing it demolished.

Freeway widening for whomst? March 6, 2019. There’s a profound demographic disparity between those who benefit from I-5 freeway widening and those who will bear its costs.  Beneficiaries are disproportionately out-of-state commuters; single occupancy peak hour commuters from Vancouver, Washington earn an average of more than $82,000, 50 percent more than those who live in North and Northeast Portland and who commute by bike, transit or walking, and more than double the income of households in the area that don’t own cars.

Concealing facts and lying to sell freeway widening

An open letter to the Oregon Transportation Commission. March 18, 2021.
For years, the Oregon Department of Transportation has concealed its plans to build a ten lane freeway through Portland’s Rose Quarter. We’ve documented how ODOT concealed the actual width of its proposed freeway from the public, in violation of the letter and spirit of the National Environmental Policy Act. We’re calling on the state to do a full environmental impact statement that assesses the impact of the project they actually intend to build.

The Black Box: Hiding the facts about freeway widening, March 12, 2019. The most basic metric for understanding a road project is something called “Average Daily Traffic” or ADT, a count of the total number of vehicles that use a stretch of roadway on a typical day. That’s an essential input for estimating congestion, air pollution, carbon emissions and assessing safety. But it’s also one statistic that you won’t find anywhere in the Rose Quarter freeway widening project’s Environmental Assessment or its Traffic Technical Report:  all the ADT numbers have been suppressed. It’s like a financial report that has no dollar amounts. Leaving out basic traffic data keeps the public in the dark about key elements of the project.

Why won’t ODOT tell us how wide their freeway is? December 1, 2022. After more than three years of public debate, ODOT still won’t tell anyone how wide a freeway they’re planning to build at the Rose Quarter.  ODOT’s plans appear to provide for a 160-foot wide roadway, wide enough to accommodate a ten lane freeway, not just two additional “auxiliary” lanes. ODOT is trying to avoid NEPA by building a wide roadway now, and then re-striping it for more lanes after it is built. The agency has utterly failed to examine the traffic, pollution and safety effects of the ten-lane roadway they’ll actually build.

Orwellian Freeway Widening, March 5, 2019. Don’t call it widening the freeway; it’s an “improvement” project. And those aren’t freeway lanes being added; they’re harmless “auxiliary lanes.” The Oregon Department of Transportation is torturing logic, common sense and the English language as it relentlessly markets its plans to widen I-5 at the Rose Quarter.

More Orwell from the Oregon Department of Transportation, April 2, 2019.  We have always been at war with Eastasia.  Within 24 hours, ODOT took two entirely different positions regarding the Columbia River Crossing, first denying it had any connection to the proposed $500 million Rose Quarter freeway widening project, and then saying it was integral to the plans for the freeway widening. Similarly, ODOT first denied the existence of any engineering plans or drawings for the freeway widening; then, when pressed, conceded that they existed; and ultimately, under legal threat, produced 33 gigabytes of such plans. Willfully lying about and concealing key facts about the project is a violation of NEPA and of the public trust.

National transportation experts to Portland: You’re doing it wrong, March 25, 2019.  The nation’s leading experts on urban transportation–Janette Sadik-Khan, Robin Chase, Jennifer Keesmaat and others–have some choice words about freeway widening for Portland:  Don’t!

ODOT’s real agenda:  Massive freeways at the Rose Quarter and Columbia River

The Hidden Rose Quarter MegaFreeway, March 13, 2019. Though it’s promoted as just adding a couple of “auxiliary lanes,” the Rose Quarter project calls for building a massive 126 foot right of way through Northeast Portland, enough to fit a full eight-lane freeway. Once the $500 million is spent at the Rose Quarter, it will only take a few hours with a paint truck to create a much wider freeway.

There’s a $3 billion bridge hidden in the Rose Quarter Project EA, March 27, 2019. Hidden in the plans for the Rose Quarter project is the assumption that Portland will also build a $3 billion, 12-lane wide freeway across the Columbia River–in fact, the Rose Quarter project is needed chiefly to deal with induced demand from this bridge project.

Revealed: ODOT’s Secret Plans for a 10-Lane Rose Quarter Freeway. February 24, 2021. For years, ODOT has been planning to build a 10 lane freeway at the Rose Quarter, not the 6 lanes it has advertised. Three previously undisclosed files show ODOT is planning for a 160 foot wide roadway at Broadway-Weidler, more than enough for a 10 lane freeway with full urban shoulders. ODOT has failed to analyze the traffic, environmental and health impacts from an expansion to ten lanes; not disclosing these reasonably foreseeable impacts is a violation of the National Environmental Policy Act (NEPA).

Calculating induced demand at the Rose Quarter. February 1, 2021. Widening I-5 at the Rose Quarter in Portland will produce an additional 17.4 to 34.8 million miles of vehicle travel and 7.8 to 15.5 thousand tons of greenhouse gases per year. These estimates come from a customized calibration of the induced travel calculator to the Portland metropolitan area.
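The calculator behind these estimates applies a standard elasticity approach: a given percentage increase in lane-miles induces a roughly proportional percentage increase in vehicle travel. A minimal sketch of that logic, with entirely hypothetical inputs (not the Portland calibration):

```python
# Minimal sketch of the induced-travel elasticity method used by tools like the
# UC Davis induced travel calculator. All inputs here are hypothetical
# illustrations, not the calculator's calibrated values for Portland.

def induced_annual_vmt(base_annual_vmt, existing_lane_miles, added_lane_miles,
                       elasticity=1.0):
    """Induced VMT = elasticity x (% change in lane-miles) x baseline VMT."""
    pct_change_in_capacity = added_lane_miles / existing_lane_miles
    return elasticity * pct_change_in_capacity * base_annual_vmt

# Hypothetical: add 3 lane-miles to a 1,000 lane-mile freeway network carrying
# 8 billion annual VMT, with a long-run elasticity of 1.0.
extra_vmt = induced_annual_vmt(8_000_000_000, 1_000, 3)
print(f"{extra_vmt / 1e6:.0f} million additional vehicle-miles per year")
```

The published literature (the “fundamental law of road congestion”) puts the long-run elasticity near 1.0, which is why capacity expansions tend to fill with new traffic rather than relieving congestion.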

Congestion Pricing: ODOT is disobeying an order from Governor Brown. February 8, 2021.  More than a year ago, Oregon Governor Kate Brown directed ODOT to “include a full review of congestion pricing” before deciding whether or not to do a full environmental impact statement for the proposed I-5 Rose Quarter Freeway widening project. ODOT simply ignored the Governor’s request, and instead is delaying its congestion pricing efforts, and proceeding full speed ahead with the Rose Quarter with no Environmental Impact Statement that would include pricing. ODOT has produced no analysis of the effects of pricing as part of its Rose Quarter environmental review, and has said “congestion pricing was not considered.”

Cost Overruns

ODOT: Exploding whales and cost overruns. March 9, 2020. ODOT’s I-5 Rose Quarter freeway project, estimated to cost $450 million in 2017, saw its price tag nearly double to $795 million in 2020.

Another exploding whale: ODOT’s freeway widening cost triples. September 16, 2021. It now looks like Oregon DOT’s $450 million I-5 Rose Quarter freeway widening project will cost more than $1.25 billion. The project’s estimated cost has nearly tripled in just four years, and still carries further cost overrun risk. Even OTC commissioners question whether it’s worth more than a billion dollars to widen a 1.5 mile stretch of freeway. The Oregon DOT has experienced massive cost overruns on all of its largest construction projects, and has systematically concealed and understated the frequency and scale of those overruns.


* – Two features of the I-5 Rose Quarter project that we criticized in 2019 have been changed:  The project no longer proposes to demolish the Flint Avenue overpass, and no longer proposes to construct a “diverging diamond” interchange.


25 reasons not to widen Portland freeways

Portland is weighing whether to spend half a billion dollars widening a mile-long stretch of the I-5 freeway at the Rose Quarter near downtown. We’ve dug deeply into this idea at City Observatory, and we’ve published 25 commentaries addressing various aspects of the project.  Here’s a synopsis:

Traffic congestion

Wider freeways don’t reduce congestion.  March 4, 2019. The best argument that highway planners can muster for the Rose Quarter freeway widening is that it might somehow relieve congestion by reducing the number of crashes. But when ODOT widened a stretch of I-5 just north of the Rose Quarter a decade ago, crashes not only didn’t decrease–crash rates actually went up.

Rose Quarter freeway widening won’t reduce congestion, March 2, 2019. Wider urban freeways have never reduced congestion, due to “induced demand,” a problem so predictable that experts call it “the fundamental law of road congestion.” Even the experts from ODOT and the Portland Bureau of Transportation concede that the freeway widening will do nothing to reduce daily “recurring” traffic congestion.

Backfire: How widening freeways can make traffic congestion worse, February 26, 2019.  It’s an article of faith among highway builders and boosters that adding more capacity will make freeways flow more smoothly. But in reality, widening a road or intersection at one point simply funnels more vehicles into the next bottleneck more quickly–which can lead a road to become congested even faster. That’s what’s happened on I-5 Northbound in Portland, where the I-5 bridge over the Columbia River carries fewer vehicles in the peak hour now, because improvements to the freeway and intersections have overwhelmed the bridge bottleneck.

Congestion pricing is a better solution for the Rose Quarter, March 26, 2019. Congestion pricing on I-5 would dramatically reduce congestion, improve freight and transit travel times, and do so at far lower cost than freeway widening, according to . . . the Oregon Department of Transportation. Pricing has been approved by the state Legislature, but ODOT has violated NEPA by failing to include any mention of it in the Rose Quarter Environmental Assessment.

How tax evasion fuels traffic congestion in Portland, March 15, 2019. A big part of traffic congestion on I-5 and I-205 as they cross the Columbia River is due to Washington residents shopping in Oregon to evade Washington’s high retail sales tax (Oregon has none). Vancouver residents evade $120 million in sales tax per year by shopping in Oregon, but account for between 10 and 20 percent of all traffic across the river. 

Reducing congestion: Katy didn’t, December 27, 2016.  Add as many lanes as you like in an urban setting and you’ll only increase the level of traffic and congestion.  That’s the lesson of Houston’s Katy Freeway, successively widened to a total of 23 lanes, at a cost of billions, but with the latest widening, travel times on the freeway are now slower than before it was expanded.

Safety

Safety: Using the big lie to sell wider freeways, March 19, 2019. ODOT claims that the I-5 Rose Quarter is the state’s “#1 crash location.” But that’s not true. Other Portland area ODOT roads, including Barbur Boulevard, Powell Boulevard and 82nd Avenue, have crash rates that are as much as 3 times higher; worse, these streets cause fatalities, which the freeway doesn’t. Crying “safety” is a calculated “Big Lie” marketing gimmick that would spend half a billion dollars on a roadway that contributes nothing to the state’s growing traffic death toll.

Safety last: What we’ve learned from “improving” the I-5 freeway, March 21, 2019. ODOT has also “improved” freeway interchanges south of Portland. It improved the Woodburn interchange in 2015, hoping to reduce crashes–but they increased instead. In February 2019, the interchange had two serious crashes, producing extensive delays.

Carbon and pollution

Climate concerns crush Oregon highway funding bill, March 6, 2015. In 2015, a pending highway finance bill was killed when the Oregon Department of Transportation admitted it had provided estimates of carbon reductions that were wildly exaggerated and could not be verified. With a track record of producing carbon emission estimates that falsely flatter its preferred projects, should anyone trust the estimates contained in the Rose Quarter Environmental Assessment?

Widening the I-5 Freeway will add millions of miles of vehicle travel, March 4, 2019.  The University of California Davis has a calculator for estimating the effects of added freeway capacity on travel; it suggests that Rose Quarter freeway widening will produce 10 to 17 million additional miles of travel per year in Portland, as well as 5 to 8 thousand additional tons of carbon emissions per year.

Urban Myth Busting: Congestion, Idling and Carbon Emissions, July 6, 2017. The Rose Quarter project makes unsubstantiated claims that it will reduce carbon emissions, by reducing the number of cars idling in traffic; but the published scientific literature on the subject shows that gains from reduced idling due to capacity increases are more than offset by the increase in emissions due to induced travel demand.

Bike and pedestrian infrastructure and freeway covers


Distorted Images: Freeway widening is bad for pedestrians, March 14, 2019. ODOT has produced a handful of computer-generated renderings to show how its massive freeway widening project would affect surface streets in Northeast Portland. They’re carefully composed to exaggerate some features and conceal others. If you look closely, you can see how the plan is to round off corners at key intersections–speeding car traffic and increasing danger to pedestrians. In addition, ODOT illustrations show dozens of pedestrians and just a handful of cars on this busy city street: proportions that are off by a factor of 200 in showing the real world relationship of cars to people in this space.

The great freeway cover-up, December 13, 2017. ODOT’s freeway widening plans call for two over-sized freeway overpasses to be built (primarily to deal with construction staging in a dense urban environment). While it claims that the overpasses can be developed as public space, they’re too fragmented, noisy and hostile (thanks to thousands of fast moving cars on every side) to be useable public space.

The death of Flint Street, May 12, 2017. The proposed Rose Quarter freeway widening would demolish the existing Flint Avenue overpass, a low speed neighborhood street that runs parallel to a busy North-South couplet, and provides an important bike route with a gentle grade, and limited auto-traffic. 

Diverging Diamond Blues, December 19, 2017. A key element of the local street plan for the proposed Rose Quarter freeway widening project is turning a portion of North Williams Avenue into a miniature “diverging diamond” interchange–with traffic traveling on the wrong (left-hand) side of a two-way street. This disorienting design is inherently pedestrian hostile.

Equity and neighborhood effects

How a freeway destroyed a neighborhood, and may again, March 18, 2019. In 1962, construction of I-5 devastated Portland’s historically African-American Albina neighborhood.  Population declined by nearly two-thirds in the three decades after the freeway was built, as the area shifted from a residential area with local-serving businesses, to an auto-dominated landscape. The neighborhood has only started to rebound in recent years, and more auto traffic will likely undermine the area’s attractiveness.

Why do poor school kids have to clean up rich commuters’ pollution?, March 6, 2019. Portland’s Tubman Middle School, built more than a decade before the I-5 freeway sliced through the neighborhood, would get an even larger dose of air pollution when the widened freeway is moved closer to classrooms. The school’s students–disproportionately low income and children of color–have had to see public school monies–more than $12 million–spent to clean up the school’s air; commuters on I-5, disproportionately white and higher income, paid nothing toward these costs.

Housing reparations for Northeast Portland, April 16, 2018. When ODOT built the I-5 freeway through the heart of Portland’s African-American neighborhood in the 1960s, it demolished–and never replaced–more than 300 homes. It outlandishly claims that a wider freeway will somehow redress that damage, but it could make a much better start by spending about $140 million to rebuild the housing it demolished.

Freeway widening for whomst? March 6, 2019. There’s a profound demographic disparity between those who benefit from I-5 freeway widening and those who will bear its costs.  Beneficiaries are disproportionately out-of-state commuters; single occupancy peak hour commuters from Vancouver, Washington earn an average of more than $82,000, 50 percent more than those who live in North and Northeast Portland and who commute by bike, transit or walking, and more than double the income of households in the area that don’t own cars.

Concealing facts and lying to sell freeway widening

The Black Box: Hiding the facts about freeway widening, March 12, 2019. The most basic metric for understanding a road project is something called “Average Daily Traffic” or ADT, a count of the total number of vehicles that use a stretch of roadway on a typical day. That’s an essential input for estimating congestion, air pollution, carbon emissions and assessing safety. But it’s also one statistic that you won’t find anywhere in the Rose Quarter freeway widening project’s Environmental Assessment or its Traffic Technical Report:  all the ADT numbers have been suppressed. It’s like a financial report that has no dollar amounts. Leaving out basic traffic data keeps the public in the dark about key elements of the project.

Orwellian Freeway Widening, March 5, 2019. Don’t call it widening the freeway; it’s an “improvement” project. And those aren’t freeway lanes being added; they’re harmless “auxiliary lanes.” The Oregon Department of Transportation is torturing logic, common sense and the English language as it relentlessly markets its plans to widen I-5 at the Rose Quarter.

More Orwell from the Oregon Department of Transportation, April 2, 2019.  We have always been at war with Eastasia.  Within 24 hours, ODOT took two entirely different positions regarding the Columbia River Crossing, first denying it had any connection to the proposed $500 million Rose Quarter freeway widening project, and then saying it was integral to the plans for the freeway widening. Similarly, ODOT first denied the existence of any engineering plans or drawings for the freeway widening; then, when pressed, conceded that they existed; and ultimately, under legal threat, produced 33 gigabytes of such plans. Willfully lying about and concealing key facts about the project is a violation of NEPA and of the public trust.

National transportation experts to Portland: You’re doing it wrong, March 25, 2019.  The nation’s leading experts on urban transportation–Janette Sadik-Khan, Robin Chase, Jennifer Keesmaat and others–have some choice words about freeway widening for Portland:  Don’t!

ODOT’s real agenda:  Massive freeways at the Rose Quarter and Columbia River

The Hidden Rose Quarter MegaFreeway, March 13, 2019. Though it’s promoted as just adding a couple of “auxiliary lanes,” the Rose Quarter project calls for building a massive 126 foot right of way through Northeast Portland, enough to fit a full eight-lane freeway. Once the $500 million is spent at the Rose Quarter, it will only take a few hours with a paint truck to create a much wider freeway.

There’s a $3 billion bridge hidden in the Rose Quarter Project EA, March 27, 2019. Hidden in the plans for the Rose Quarter project is the assumption that Portland will also build a $3 billion, 12-lane wide freeway across the Columbia River–in fact, the Rose Quarter project is needed chiefly to deal with induced demand from this bridge project.


More Orwell from the Oregon Department of Transportation

We have always been at war with Eastasia.

Concealing and lying about key facts regarding the proposed Rose Quarter freeway widening project is a violation of the National Environmental Policy Act and a betrayal of public trust

  • In less than 24 hours, ODOT spokespeople maintained with equal assurance that the CRC was “no where on the horizon” and that the Department had clearly disclosed that the project’s traffic projections assumed the project had been built four years ago.
  • In quick succession last week, ODOT denied the existence of any project plans, then said it would take two weeks to find them, and then produced 33 GB of plan files.

One of the big issues on Interstate 5 between Portland and Vancouver is whether the region was moving ahead with something called the Columbia River Crossing, a now (mostly) dead plan to build a $3 billion, 12-lane, five-mile freeway over the Columbia River between the two cities.  Not surprisingly, Oregon’s plan to widen I-5 in the area south of the river got people in Vancouver, Washington, excited about the prospect of the bridge (which, not incidentally, facilitates tax evasion to the tune of $120 million per year). Andy Matarrese, crack reporter for the Vancouver Columbian, asked Oregon Department of Transportation spokesman Don Hamilton whether there was any connection between the two projects.  Here’s what he wrote on Monday, March 25, 2019, in an article entitled “$500 million ODOT plan addresses Rose Quarter bottleneck issue.”

Not linked to bridge

Although the Rose Quarter project might allay the concerns of some bridge critics, there is no conjectural Interstate 5 Bridge project baked into the Rose Quarter plan, ODOT spokesman Don Hamilton said, because there is no interstate bridge project on the horizon.

Pretty clear, huh? Rose Quarter is a separate project.  No Columbia River Crossing here.

The very next day, Tuesday, March 26, City Observatory (and Oregon Public Broadcasting) reported that materials found in obscure appendices and delayed public disclosures provided by the Oregon Department of Transportation showed that the traffic projections for the Rose Quarter freeway widening project were built on the assumption that the Columbia River Crossing was completed–in 2015.

That fact was confirmed by ODOT project manager Megan Channell on Tuesday night, at a meeting of the Portland Planning and Sustainability Commission.  To her, it was no big deal; they’d never hidden it. Oregon Public Broadcasting’s Jeff Mapes reported:

ODOT acknowledged Tuesday that its traffic modeling for another freeway project — a $500 million upgrade to I-5 in Portland’s Rose Quarter area — assumes that the Columbia River Crossing will still be built.

Megan Channell, the manager for the Rose Quarter project, said traffic modeling includes all of the road projects in the Portland region’s transportation plan, “including the CRC … We’re sort of staying with what the adopted projects are.”

Channell disclosed the CRC traffic assumptions after opponents of the Rose Quarter project found hints of it in technical reports that ODOT released under pressure.

(Not only is the Columbia River Crossing not “on the horizon,” the project’s traffic projections assume that the non-existent CRC was completed in 2015, and their model shows it pumping thousands of non-existent vehicles into the Rose Quarter in that year.)

So:  Monday: absolutely no multi-billion-dollar CRC was part of ODOT’s plan.  And Tuesday: oh, yeah, it’s always been integral to our plan, as we’ve always said; in fact, we’re pretending it was built four years ago.  We have always been at war with Eastasia.

As we related at City Observatory on Tuesday, the “notice” of the CRC was, in reality, deeply buried in an obscure appendix to the project’s environmental assessment.  One had to navigate a vague cross-reference, dig into a minor footnote, follow a web-link to a non-ODOT website (not part of the EA) and wade through hundreds of lines of fine print in a large Excel spreadsheet to find any reference to the CRC.

As regular City Observatory reader Spencer Boomhower observed, Channell’s attitude bore a striking resemblance to Arthur Dent’s experience in finding the public notice of the forthcoming demolition of his home–for a highway, naturally–in the opening scenes of Douglas Adams’s “The Hitchhiker’s Guide to the Galaxy”:

Mr Prosser: But look, you found the notice, didn’t you?
Arthur: Yes, yes, I did. It was on display at the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying ‘beware of the leopard.’

The half-life of any particular truth at the Oregon Department of Transportation is apparently measured in periods of less than 24 hours.  That’s all it took for the official position to flip 180 degrees.  We have, indeed, always been at war with Eastasia.

Plans? We don’t have any plans . . . Oh, you mean those plans?  The 33 GB of plans.

Surely, even that whopper about the Columbia River Crossing is an anomaly, isn’t it? Actually, no.

Lying about basics of the project seems to be pretty much accepted practice at the Oregon Department of Transportation. Let’s go back a week or so further, a bit earlier in the comment process on the Rose Quarter Environmental Assessment.

Iain Mackenzie, editor of the blog Next Portland, which chronicles in detail all of the real estate development projects in Portland (trust us:  this is an invaluable resource), looked at the public materials disclosed by ODOT, and saw some interesting computer generated renderings of selected views of the project. An expert in this field, Mackenzie immediately contacted ODOT, to ask for copies of the plans which were used to produce those renderings.

On February 15, 2019, Mackenzie, emailed ODOT to request copies of plans used to produce computer-generated renderings of aspects of the project presented in the EA. Initially, ODOT staff denied that any such plans existed. ODOT staffer Douglas Siu wrote to Mackenzie on February 19, “engineering drawings do not yet exist.”

Mackenzie, based on his technical knowledge of computer rendering, knew that such renderings could not be created without such plans. He pressed his request, and ODOT conceded that such plans did, in fact, exist. On February 25, 2019, Mackenzie filed a public records request for these files. On March 20, 2019, ODOT finally replied to his request, stating that it would take twenty-five business days and $6,000 to supply the records, meaning the data would be unavailable until after the expiration of the EA public comment period. Mackenzie’s attorney then met with ODOT, and following that meeting, on March 26, 2019, ODOT released 33 gigabytes of computer data files containing plans of the project. That left him and other members of the public only five days before the project’s April 1 comment deadline to review this highly technical information.

With just a handful of days to study this giant pile of data, project critics quickly identified issues ODOT had hidden.  Mackenzie pointed out that the plans show the freeway being widened over the top of the Vera Katz Eastbank trail (below). What the renderings don’t show is that additional physical supports would be needed for the much wider viaduct, which would likely intrude further into the trail (as shown by the annotation added by Cupola Media).

Source: Cupola Media, via Iain Mackenzie, from withheld ODOT plans/renderings.

Doug Klotz showed that the ODOT plans add an additional travel lane for cars on Weidler Street, and show clearly how ODOT will cut away the corners of several blocks in the Rose Quarter, speeding traffic and endangering pedestrians. These features were invisible or de-emphasized in ODOT’s selective renderings of the project.  The top rendering shows the existing Weidler Street running left to right, with three travel lanes and “square” corners with short crosswalks; the bottom rendering shows the freeway widening project, with four travel lanes, corners cut away, and much longer, more exposed crosswalks.

Source: Doug Klotz from withheld ODOT plans/renderings.

Withholding these files until days before the deadline nearly prevented technical experts like Mackenzie and Klotz from identifying problems like these (and there are undoubtedly more).  It also kept the general public from learning the true nature of the I-5 Rose Quarter freeway widening project before the comment period expired.  This clearly frustrates the public’s right to know about the project’s likely impacts, which is the centerpiece of NEPA.

Why this matters

We’ve taken a tongue-in-cheek attitude toward the serial prevarication of the Oregon Department of Transportation, but make no mistake: this is an issue of the utmost gravity. Our citizens and our democracy rely on the honesty of public servants doing their jobs and, at a bare minimum, telling the truth.  The hallmark of the National Environmental Policy Act is information and disclosure: its purpose is to reveal to citizens the likely environmental consequences of government actions. When government officials intentionally lie, deceive, and deny the existence of key facts, they undercut the foundation of our democracy.

Both of these deceptions came in the last few days of the very limited comment period set by ODOT.  In both cases, the truth was revealed only because citizen advocates, at their own time and expense, forced the issue. Extracting facts and honesty from public servants shouldn’t be contingent on this kind of heroic effort. Public servants at the Oregon Department of Transportation have an affirmative obligation to tell the truth and reveal the facts.  They are doing so only belatedly and grudgingly, if at all, with the evident intent to deny the public the right to know and participate.  And in the process they’re damaging our state.  We deserve better.


ODOT consultant: Pricing is a better fix for the Rose Quarter

Oregon DOT’s own consultants say congestion pricing would be a better way to fix congestion at the I-5 Rose Quarter than spending $800 million.

Pricing would improve traffic flow and add capacity equal to another full lane of traffic, according to WSP, which called it “our best alternative” for dealing with the Rose Quarter.

Failing to advance pricing as an alternative in the environmental review violates the National Environmental Policy Act, and the Governor’s direction.  The City of Portland also made its willingness to go along with this project conditional on ODOT implementing pricing prior to construction.

Nearly two years ago, City Observatory pointed out that congestion pricing would be a far more economical and effective way to fix congestion on I-5 than spending half a billion dollars widening the freeway at the Rose Quarter. Since then, the project’s price tag has ballooned to nearly $800 million.

But more importantly:  The Oregon Department of Transportation’s own consultants have explicitly testified that congestion pricing would be a better, cheaper way to fix congestion than this expensive project.  They hired WSP—a global traffic engineering firm—to study a range of pricing alternatives for area freeways, including Interstate 5.  While the scope of that study was much broader (details below), WSP’s Chris Swenson testified that pricing would be especially effective and economical at the Rose Quarter.  On February 28, 2018, he told the Portland Value Pricing Policy Advisory Committee that congestion pricing would eliminate “flow breakdowns” and produce just as much traffic relief as building an additional travel lane.

In the Rose Quarter particularly, the cost of new lanes new capacity is significant. You’ve got a lot of things on structure there, which makes it very expensive. I know that there is a Rose Quarter project. That isn’t online, and it is in the model. However, I think that our best alternative, because of the cost of new lanes–it’s not saying no new lanes ever it’s not saying you can’t do it—it’s just given what we were told, this is what you need to analyze. Your best option is to get traffic running much better in every lane in each lane, and right now with the flow breakdowns that you’ve got during the peak periods, that’s not happening. So if we could get a 50% improvement in those two lanes—And that’s very doable—then you’re effectively adding a new lane of capacity on a two lane section, and it would operate as if it were three lanes versus what it’s operating today.

 

ODOT consultant Swenson: To deal with Rose Quarter congestion, pricing would be “our best alternative” . . . “effectively adding a new lane of capacity”

So rather than squandering the better part of a billion dollars, the state could achieve the same objective, faster, more effectively and far more cheaply by implementing congestion pricing.
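Swenson’s arithmetic is easy to check. Here’s a back-of-the-envelope sketch; the per-lane volume is an illustrative round number of our choosing, not a figure from the testimony:

```python
# Illustrative throughput arithmetic behind Swenson's claim.
# Assume a congested freeway lane carries about 1,200 vehicles per hour
# during peak-period flow breakdown (a hypothetical round number).
congested_per_lane = 1200
improved_per_lane = congested_per_lane * 1.5  # the cited 50% improvement

two_lanes_congested = 2 * congested_per_lane  # 2,400 veh/hr today
two_lanes_improved = 2 * improved_per_lane    # 3,600 veh/hr with pricing

# The gain is one whole congested lane's worth of capacity:
gain = two_lanes_improved - two_lanes_congested
assert gain == congested_per_lane  # "effectively adding a new lane"
```

Whatever per-lane volume you plug in, a 50 percent improvement across two lanes always yields one extra lane’s worth of throughput, which is the point of Swenson’s comparison.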

Tragically, the project’s environmental assessment didn’t consider pricing as a possible alternative. Look high and low in the Environmental Assessment and its Traffic Technical Report and you’ll find not a single mention of it. In our view, that’s a plain violation of the National Environmental Policy Act, which requires consideration of a full range of reasonable alternatives.  In fact, ODOT and the Federal Highway Administration specifically excluded an analysis of congestion pricing, claiming there was no need to do so because pricing wasn’t identified as a project in the Regional Transportation Plan.  ODOT also ignored a specific direction from Oregon Governor Kate Brown to analyze congestion pricing as part of its environmental review of the project.

Congestion pricing is a better solution for Portland traffic

Now in some places, you might be able to argue that congestion pricing is far-fetched or politically infeasible.  In Oregon, as it turns out, it’s neither; in fact, it’s already been adopted as law.  Two years ago, Oregon’s Legislature adopted HB 2017, which directed the state Department of Transportation to begin work to implement congestion pricing on Interstate 5 and Interstate 205 in the Portland metropolitan area.  In addition, the City of Portland made its planning approval of the Rose Quarter project conditional on ODOT implementing congestion pricing on I-5. The city’s adopted Central City Plan specifically calls for congestion pricing to be implemented in conjunction with the Rose Quarter project:

ODOT, in partnership with PBOT will implement congestion pricing and TDM options to mitigate for climate impacts as soon as feasible and prior to the opening of the project.
(Central City Plan, page 139)

In addition, in December 2017, Portland’s City Council voted unanimously to endorse congestion pricing as part of the city’s future transportation planning. Last year, the state transportation department published a preliminary analysis of several pricing alternatives.  Among the proposals is one, called “Concept 2,” which would extend pricing to all lanes of these two major freeways.

The studies undertaken by the Oregon Department of Transportation conclude that congestion pricing could measurably reduce traffic congestion on I-5. The analysis concludes that the project would reduce congestion and improve travel time reliability on I-5.  It would save travel time for trucks and buses.  It enables higher speeds and greater throughput on the freeway–because it eliminates the hyper-congestion that occurs when roads are unpriced. Here’s an excerpt from page 17 of the report.  We highlighted in bold the most salient bits of the analysis:

Overall, Concept 2 – Priced Roadway, will reduce congestion for all travelers on the priced facility. This will produce overall improvement in travel time reliability and efficiency for all users of I-5 and I-205.  [Concept 2 is] Likely to provide the highest level of congestion relief of the initial pricing concepts examined. [It] Controls demand on all lanes and, therefore, allows the highest level of traffic management to maintain both relatively high speeds and relatively high throughput on both I-5 and I-205. Vehicles 10,000 pounds and more (such as many freight trucks and transit vehicles) would benefit from travel time improvements on the managed facilities.  Pricing recovers lost functional capacity due to hyper-congestion, providing greater carrying volume with pricing than without. This means that diversion impacts may be minimal, but still warrant consideration and study.

This concept is relatively inexpensive to implement, and significantly less expensive than concepts that include substantial physical improvements to the pavement and bridge infrastructure.

Oregon Department of Transportation (2018). Portland Metro Area Value Pricing Feasibility Analysis: Final Round 1 Concept Evaluation and Recommendations, Technical Memorandum #3.

Bottom line:  congestion pricing works better for freeway users, freight mobility, and transit riders; it keeps unpriced traffic from causing hyper-congestion; and it is vastly cheaper than building new lanes and bridges.  It is also likely to result in minimal, if any, diversion to local streets.
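The “hyper-congestion” mechanism the ODOT memo describes—that pricing “recovers lost functional capacity”—is the standard result from traffic-flow theory. A minimal sketch using the textbook Greenshields model, with illustrative parameters of our own choosing (not figures from the ODOT report):

```python
# Greenshields traffic-flow model: why hyper-congestion destroys
# throughput. Parameters are illustrative, not from the ODOT memo.
def flow(density, free_speed=60.0, jam_density=200.0):
    """Vehicles per hour per lane at a given density (veh/mile)."""
    speed = free_speed * (1 - density / jam_density)  # speed falls as density rises
    return density * speed

# Throughput peaks at half the jam density...
peak = flow(100)        # 100 veh/mi -> 3,000 veh/hr
# ...and collapses as density rises past that point (hyper-congestion):
congested = flow(160)   # 160 veh/mi -> about 1,920 veh/hr
assert congested < peak

# A price that holds density near the peak thus moves MORE vehicles
# than an unpriced road stuck in hyper-congestion--"greater carrying
# volume with pricing than without," in the memo's words.
```

The design point is that flow is the product of speed and density: past the peak, each added vehicle slows everyone enough that total throughput falls, so keeping a few vehicles off the road at the margin raises the number of vehicles the road actually carries.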

Omitting pricing as an alternative in the Environmental Assessment violates the National Environmental Policy Act

This is clearly a viable alternative to widening the freeway at the Rose Quarter.  Viable is actually a significant understatement:  pricing isn’t simply a viable alternative, it’s arguably, on its face, a superior one.  But that’s beside the point.  From a legal standpoint, not seriously evaluating road pricing as an alternative to expensive, environmentally damaging road widening violates NEPA’s requirement for a robust analysis of alternatives.

The substantive requirements for alternatives analysis are spelled out in 40 CFR 1502.  As explained by the Federal Highway Administration:

The Council on Environmental Quality (CEQ) refers to the alternatives analysis section as the “heart of the EIS,” and requires agencies to:

1. Rigorously explore and objectively evaluate all reasonable alternatives and for alternatives which were eliminated from detailed study, briefly discuss the reasons for their having been eliminated.

2. Devote substantial treatment to each alternative considered in detail including the proposed action so that reviewers may evaluate their comparative merits.

The FHWA guidance requires that the agency clearly state the rationale for not advancing reasonable alternatives:

Alternatives analysis should clearly indicate why and how the particular range of project alternatives was developed, including what kind of public and agency input was used. In addition, alternatives analysis should explain why and how alternatives were eliminated from consideration. It must be made clear what criteria were used to eliminate alternatives, at what point in the process the alternatives were removed, who was involved in establishing the criteria for assessing alternatives, and the measures for assessing the alternatives’ effectiveness.

Nothing in the EA describes the criteria used to select or eliminate alternatives, nor is any analysis offered for the failure to advance pricing as an alternative in its own right.  It’s simply arbitrary and capricious of the Oregon Department of Transportation and the Federal Highway Administration to produce an Environmental Assessment that takes no notice of efforts they are both currently undertaking to implement road pricing on this very roadway.

Moreover, while NEPA requires that congestion pricing be evaluated separately as an alternative, it is also the case that road pricing ought to be incorporated in the analysis as part of the No-Build alternative.  The implementation of road pricing in the next decade or so is, in NEPA terms, a reasonably foreseeable event.  Just as the Environmental Analysis has incorporated its expectations about the growing electrification and increased fuel efficiency of future vehicles in its forecasts of emissions (due to the future implementation of fuel economy regulations), it should likewise include the analysis of congestion pricing, which is also a reasonably foreseeable part of the regulatory environment in the next decade or so.

Pricing first, building (if at all) later is the smarter choice

It’s likely that implementing road pricing would entirely obviate the need for capital construction at the Rose Quarter.  But even if it didn’t, it is far more prudent, from both a fiscal and an environmental standpoint, to implement congestion pricing first, to determine which, if any, “improvements” are needed at the Rose Quarter to reduce congestion and improve safety. The “build first, price later” strategy is a recipe for squandering scarce public resources on roads that people don’t value and won’t use.  As we chronicled at City Observatory, Louisville spent a billion dollars adding capacity to its I-65 bridge over the Ohio River, only to discover that traffic volumes fell by almost half when it started charging a modest toll to bridge users.  It’s financially, as well as ecologically, prudent to toll first, and only later invest in capacity, when, where, and as needed.

Editor’s Note:  A hat tip to Portland transportation advocate Doug Allen for pointing us to Mr. Swenson’s testimony.  This post has been revised to provide an updated link to the consultant’s report.

 

Congestion pricing is a better solution for the Rose Quarter

Congestion pricing is a quicker, more effective and greener way to reduce congestion at the Rose Quarter than spending $500 million on freeway widening.

Failing to advance pricing as an alternative in the environmental review is a violation of the National Environmental Policy Act

For the past month, we’ve been taking a hard look at plans to spend $500 million to widen I-5 at the Rose Quarter near downtown Portland.  Ostensibly, this project aims to reduce congestion and improve safety. We’re at the point the process where the public is being asked to review the project’s Environmental Assessment (EA), a document prepared to address the requirements of the National Environmental Policy Act (NEPA). The I-5 Rose Quarter Freeway Widening EA looks at just two alternatives: a “Build” alternative, consisting of a $500 million, mile-long freeway widening, and a “No-Build.”

But the state could achieve the same objective, faster, more effectively and far more cheaply by implementing congestion pricing.

Tragically, the project’s environmental assessment didn’t consider pricing as a possible alternative. Look high and low in the Environmental Assessment and its Traffic Technical Report and you’ll find not a single mention of it. In our view, that’s a plain violation of the National Environmental Policy Act, which requires consideration of a full range of reasonable alternatives.

Congestion pricing is a better solution for traffic, according to Oregon DOT’s own studies

Now in some places, you might be able to argue that congestion pricing is far-fetched or politically infeasible.  In Oregon, as it turns out, it’s neither; in fact, it’s already been adopted as law.  Two years ago, Oregon’s Legislature adopted HB 2017, which directed the state Department of Transportation to begin work to implement congestion pricing on Interstate 5 and Interstate 205 in the Portland metropolitan area.  In addition, in December 2017, Portland’s City Council voted unanimously to endorse congestion pricing as part of the city’s future transportation planning. Last year, the state transportation department published a preliminary analysis of several pricing alternatives.  Among the proposals is one, called “Concept 2,” which would extend pricing to all lanes of these two major freeways.

The studies undertaken by the Oregon Department of Transportation conclude that congestion pricing could measurably reduce traffic congestion on I-5. The analysis concludes that the project would reduce congestion and improve travel time reliability on I-5.  It would save travel time for trucks and buses.  It enables higher speeds and greater throughput on the freeway–because it eliminates the hyper-congestion that occurs when roads are unpriced. Here’s an excerpt from page 17 of the report.  We highlighted in bold the most salient bits of the analysis:

Overall, Concept 2 – Priced Roadway, will reduce congestion for all travelers on the priced facility. This will produce overall improvement in travel time reliability and efficiency for all users of I-5 and I-205.  [Concept 2 is] Likely to provide the highest level of congestion relief of the initial pricing concepts examined. [It] Controls demand on all lanes and, therefore, allows the highest level of traffic management to maintain both relatively high speeds and relatively high throughput on both I-5 and I-205. Vehicles 10,000 pounds and more (such as many freight trucks and transit vehicles) would benefit from travel time improvements on the managed facilities.  Pricing recovers lost functional capacity due to hyper-congestion, providing greater carrying volume with pricing than without. This means that diversion impacts may be minimal, but still warrant consideration and study.

This concept is relatively inexpensive to implement, and significantly less expensive than concepts that include substantial physical improvements to the pavement and bridge infrastructure.

Oregon Department of Transportation (2018). Portland Metro Area Value Pricing Feasibility Analysis: Final Round 1 Concept Evaluation and Recommendations, Technical Memorandum #3.

Bottom line:  congestion pricing works better for freeway users, freight mobility, and transit riders; it keeps unpriced traffic from causing hyper-congestion; and it is vastly cheaper than building new lanes and bridges.  It is also likely to result in minimal, if any, diversion to local streets.

Omitting pricing as an alternative in the Environmental Assessment violates the National Environmental Policy Act

This is clearly a viable alternative to widening the freeway at the Rose Quarter.  Viable is actually a significant understatement:  pricing isn’t simply a viable alternative, it’s arguably, on its face, a superior one.  But that’s beside the point.  From a legal standpoint, not seriously evaluating road pricing as an alternative to expensive, environmentally damaging road widening violates NEPA’s requirement for a robust analysis of alternatives.

The substantive requirements for alternatives analysis are spelled out in 40 CFR 1502.  As explained by the Federal Highway Administration:

The Council on Environmental Quality (CEQ) refers to the alternatives analysis section as the “heart of the EIS,” and requires agencies to:

1. Rigorously explore and objectively evaluate all reasonable alternatives and for alternatives which were eliminated from detailed study, briefly discuss the reasons for their having been eliminated.

2. Devote substantial treatment to each alternative considered in detail including the proposed action so that reviewers may evaluate their comparative merits.

The FHWA guidance requires that the agency clearly state the rationale for not advancing reasonable alternatives:

Alternatives analysis should clearly indicate why and how the particular range of project alternatives was developed, including what kind of public and agency input was used. In addition, alternatives analysis should explain why and how alternatives were eliminated from consideration. It must be made clear what criteria were used to eliminate alternatives, at what point in the process the alternatives were removed, who was involved in establishing the criteria for assessing alternatives, and the measures for assessing the alternatives’ effectiveness.

Nothing in the EA describes the criteria used to select or eliminate alternatives, nor is any analysis offered for the failure to advance pricing as an alternative in its own right.  It’s simply arbitrary and capricious of the Oregon Department of Transportation and the Federal Highway Administration to produce an Environmental Assessment that takes no notice of efforts they are both currently undertaking to implement road pricing on this very roadway.

Moreover, while NEPA requires that congestion pricing be evaluated separately as an alternative, it is also the case that road pricing ought to be incorporated in the analysis as part of the No-Build alternative.  The implementation of road pricing in the next decade or so is, in NEPA terms, a reasonably foreseeable event.  Just as the Environmental Analysis has incorporated its expectations about the growing electrification and increased fuel efficiency of future vehicles in its forecasts of emissions (due to the future implementation of fuel economy regulations), it should likewise include the analysis of congestion pricing, which is also a reasonably foreseeable part of the regulatory environment in the next decade or so.

Pricing first, building (if at all) later is the smarter choice

It’s likely that implementing road pricing would entirely obviate the need for capital construction at the Rose Quarter.  But even if it didn’t, it is far more prudent, from both a fiscal and an environmental standpoint, to implement congestion pricing first, to determine which, if any, “improvements” are needed at the Rose Quarter to reduce congestion and improve safety. The “build first, price later” strategy is a recipe for squandering scarce public resources on roads that people don’t value and won’t use.  As we chronicled at City Observatory, Louisville spent a billion dollars adding capacity to its I-65 bridge over the Ohio River, only to discover that traffic volumes fell by almost half when it started charging a modest toll to bridge users.  It’s financially, as well as ecologically, prudent to toll first, and only later invest in capacity, when, where, and as needed.

Major cities around the world have implemented various forms of congestion pricing, with strongly positive results.  The London congestion charge is well known.  Pricing systems are up and running in Stockholm, Singapore, and Milan. Once established, the systems generate strong public support because they make urban traffic flow more smoothly.  The systems in Milan and Stockholm were both endorsed in city-wide public referenda after being established. New York seems, finally, to be on the verge of approving congestion pricing for lower Manhattan.

 


For the record, we’ve included a screen-shot of page 17 of the ODOT value pricing report here:

Oregon Department of Transportation (2018). Portland Metro Area Value Pricing Feasibility Analysis: Final Round 1 Concept Evaluation and Recommendations, Technical Memorandum #3.

Safety last: What we’ve learned from “improving” the I-5 freeway.

Expanding freeway capacity on I-5 hasn’t reduced crashes in Woodburn, but the project did triple in cost

Today, we’re pleased to offer a guest commentary from Naomi Fast. Naomi currently lives in Beaverton, Oregon. Previously, she lived in Portland, where she learned to ride a bicycle as transportation while earning graduate degrees at Portland State University. Her website is naomifast.com

In the next few weeks, metropolitan Portland will be taking a close look at an Oregon Department of Transportation (ODOT) proposal to spend half a billion dollars widening a mile-long stretch of I-5 near downtown Portland.  ODOT is marketing the project through the release of a multi-volume “Environmental Assessment,” which makes a number of claims about how the project will influence the environment, affect traffic, and–so we are told–reduce crashes.

But as the saying goes, this isn’t our first rodeo.  ODOT has been expanding capacity and widening chunks of I-5 for years. What has that experience taught us about how freeway widening affects safety, and what does it tell us about ODOT’s ability to bring projects in on time and under budget?

Previously, City Observatory has looked at a recent widening of I-5 just north of the Rose Quarter project area.  But just a few years ago, ODOT completed a capacity expansion project on I-5, about 30 miles to the south, at the Woodburn Interchange.  Just like the Rose Quarter, this project was pitched as a congestion-busting safety improvement. Did this project pan out as planned?

When I heard about two separate & serious multiple vehicle crashes this February at the Woodburn Interchange, each of which caused hours of delays on the I-5, I thought: Wait a minute—wasn’t that section widened recently?


As it turns out, it was. The expansion project at Milepost 271, completed in 2015, was in fact similar in concept to the one being promoted now for the Rose Quarter.  While it didn’t add auxiliary lanes like those in the Rose Quarter, it did add huge new loop ramps with long exit and entrance lanes to increase capacity to move cars on and off I-5.  It had its own lengthy design process, with its own Environmental Assessment. It affected a little less than a mile of freeway, slightly smaller than the proposed Rose Quarter widening, with a similar justification, and with an Environmental Assessment that compared just two options: Build & No Build.

Oregonians were told that congestion and crashes were going to be significantly reduced in that section, according to ODOT’s planning reports for the project.

  • ODOT’s 2005 EA said, “Travel speeds, traffic flow, and overall safety and function would be improved for all modes of travel using the reconstructed interchange.”
  • In the 2006 Revised EA report, ODOT said the widening would “provide a facility that would safely accommodate multimodal travel demands 20 years into the future.”
  • A 2011 project paper said, “The purpose of the Woodburn Interchange Project is to improve the traffic flow and safety conditions of the existing Woodburn/I-5 interchange.”
  • And at the 2015 ribbon cutting less than four years ago, even Governor Brown was optimistic, as seen in this video.

If improving safety is a key reason for widening freeways, then we should be looking for tangible real-world evidence that widened freeways, like the one at Woodburn, are getting safer.  That means the February crashes cast real doubt on the efficacy of freeway widening and flow improvements as a safety strategy. Along with the February 15th and 20th Woodburn Interchange crashes, there have been others; there was a fatal crash at the interchange just months after it opened.

ODOT’s own annual crash data aren’t conclusive.  In 2016, the first year after the project opened, and the latest year for which annual data are available, the project area experienced 14 crashes.  With so few crashes (less than one per month, on average) crash rates can vary a lot from year to year. The data so far suggest that the crash performance of this area is no better than it was before the project was built.  It remains to be seen how last month’s crashes will figure into the 2019 totals, but two in a little over two weeks is a lot.
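The year-to-year volatility in small crash counts can be quantified. If crashes arrive roughly at random (a Poisson process, a standard assumption in safety analysis, not anything taken from ODOT’s report), a yearly mean of 14 crashes implies swings of several crashes per year from chance alone:

```python
import math

def poisson_range(mean_per_year):
    """Approximate the year-to-year spread expected by chance alone,
    assuming crashes arrive as a Poisson process (a standard hedge in
    safety analysis): std dev = sqrt(mean), and roughly 95% of years
    fall within mean +/- 2*sqrt(mean)."""
    sd = math.sqrt(mean_per_year)
    return sd, (mean_per_year - 2 * sd, mean_per_year + 2 * sd)

sd, (lo, hi) = poisson_range(14)  # 14 crashes in 2016, per ODOT's data
print(f"std dev ~ {sd:.1f} crashes; ~95% of years between {lo:.0f} and {hi:.0f}")
```

With a plausible chance range of roughly 7 to 21 crashes in any given year, a single year of post-construction data can’t distinguish a real safety improvement from noise.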


There’s one other thing to consider:  While highway engineers almost always think it’s a good thing if traffic can move faster, safety analysts point to crash data showing that speed is strongly associated with more crashes and more severe injuries.  The real world experience with widening Interstate 5, both here in Woodburn, and to the north of the project between Lombard and Victory in Portland, shows that contrary to the claims of engineers, there’s no reason to believe that wider freeways will have fewer, or less severe, crashes.

How much will it cost?

There’s a postscript to the Woodburn Interchange story.  Not only does it not appear to have delivered improved safety, it’s also ended up costing vastly more than promised.  When initially proposed by the Oregon Department of Transportation in 2005, the project was supposed to cost $25 million (according to the project’s 2006 Environmental Assessment).  When finally completed in 2015, the project’s cost had nearly tripled to more than $70 million. In the case of the Rose Quarter–a much more complex project, in a dense urban setting–imagine if $500 million were to nearly triple.

 

The Lemming Model of Traffic

Highway planners use a deeply flawed “lemming” model of traffic that rationalizes highway widenings

The traffic projections made as part of the Environmental Assessment for the $500 million Rose Quarter I-5 widening project make an audacious claim that the project will reduce traffic congestion and greenhouse gases, and completely unlike any other freeway expansion project, won’t induce any additional demand.

How do they know that?  Because they have a model that says so.

But how realistic is that model?  As it turns out, not very.  As we detailed earlier, one problem is that the model (without saying so) is based on the assumption that a 12-lane, 5-mile-long freeway was built (reality check:  it wasn’t) between Portland and Vancouver in 2015, and is funneling thousands more cars into the Rose Quarter than is actually the case.

But there’s a more fundamental problem. The traffic forecasting model used by ODOT is inherently structured to over-predict traffic congestion, and presents a distorted picture of what’s likely to happen with freeway widening. The model, a classic four-step traffic demand model (albeit with a few tacked-on bells and whistles, like a microsimulation package), is a decades-old technique with a couple of critical and well-documented weaknesses. The most important one is that it allows traffic to grow without limit in the “base” case.  It predicts that as population increases, more and more people will drive more and more miles, and that they will all be oblivious in their trip-making behavior to any changes in congestion.  That is, they won’t be deterred by congestion from changing the timing, mode, or destination of a trip, or whether they take it at all.  That’s because four-step models, like the one used to create the Rose Quarter I-5 estimates, use something called static trip assignment (STA), which takes almost no notice of the effects of congestion and delay.

In an important sense, static trip assignment is a kind of “lemming” model of travel behavior.  It assigns successively more and more trips to congested links even as they become more and more congested.  Implicitly, it assumes that traveler behavior doesn’t respond at all to experienced travel conditions, especially delay.  In fact, the model allows traffic to actually exceed the capacity of the roadway link–by definition a physical impossibility–and consequently over-predicts traffic, leading to forecasts of hours and hours of congestion, as more and more traffic piles obliviously onward into the congested roadway–and does so every single day, with no learning or adaptation.  Like lemmings, you might say.

This is like the classic 1958 Walt Disney nature film White Wilderness, which shows hundreds of lemmings running off the edge of a cliff.  The following lemmings leap to their death, even though they can see the lemmings in front of them falling.

In the film, the little rodents assemble for a mass migration, scamper across the tundra and ford a tiny stream as narrator Winston Hibbler explains that, “A kind of compulsion seizes each tiny rodent and, carried along by an unreasoning hysteria, each falls into step for a march that will take them to a strange destiny.”

That destiny is to jump into the ocean. As they approach the “sea” (actually a river, thanks to more tight cropping), Hibbler continues, “They’ve become victims of an obsession — a one-track thought: Move on! Move on!”

The “pack of lemmings” reaches the final precipice. “This is the last chance to turn back,” Hibbler states. “Yet over they go, casting themselves out bodily into space.”

They don’t do this in real life: they were pushed, just like the drivers in ODOT’s traffic model.

Lemmings leaping to their doom (Walt Disney, 1958, via YouTube)

Lemmings are an apt analogy, because in real life (as opposed to 1950s nature films) lemmings don’t actually blindly leap to their death.  As the Alaska Department of Fish and Game explains, the classic footage which appears in White Wilderness was actually staged by the producers, who, from positions off camera, chased the terrified lemmings off the cliff.

In a sense, that’s what modelers are doing with static traffic assignment–no matter how bad congestion gets in the STA models, the lemming motorists just keep pouring into the congested roadway link.  But in the real world, as opposed to traffic models or 1950s Disney films, drivers (and lemmings) don’t just mindlessly drive into congestion day after day (or jump off cliffs along with hundreds of others).  Instead, in the face of congestion, they change their behavior, changing the time, mode or destination of their journey, or foregoing it altogether.

What the unrealistic static traffic assignment lemming scenario does is create a fictitious baseline forecast of traffic that looks truly horrible.  In comparison, the “build” scenario–in this case, a road widening–looks less bad.  And it also doesn’t have any “induced” demand, because the projections dramatically over-estimate the amount of traffic that would occur in the base no-build scenario. (Induced demand is almost impossible in these models because traffic demand is derived from an earlier step in the process, and there’s no feedback loop by which less delay encourages greater trip-making.)

For technicians in the field, the problems with static traffic assignment are well documented.  Norm Marshall, writing in the peer-reviewed technical publication Research in Transportation Business and Management, says:

STA has two fundamental problems that make it ill-suited at analyzing peak period congestion. First, most peak period congestion, especially on freeways, involves traffic queuing behind bottlenecks. Therefore, the roadway segments are not independent, as is assumed in STA. Second, these bottlenecks meter traffic flow to the capacity of the bottlenecks. In sharp contrast, STA allows modeled traffic volumes to exceed capacity. This misrepresents traffic not only on the over-capacity segment, but on downstream segments that the excess traffic could not really reach because it either would divert to other routes or be queued upstream.

Forecasting the impossible: The status quo of estimating traffic flows with static traffic assignment and the future of dynamic traffic assignment, published in Research in Transportation Business and Management, https://doi.org/10.1016/j.rtbm.2018.06.002

Static assignment models allow forecast traffic on a specific roadway segment to exceed the capacity of that segment–a physical impossibility–and also inaccurately forecast actual speeds because of the non-linear relationship between capacity and speed.
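To see how a static model can “forecast the impossible,” consider the Bureau of Public Roads (BPR) volume-delay curve that conventional four-step models use to translate volumes into travel times (the link and coefficients below are illustrative defaults, not ODOT’s actual parameters):

```python
def bpr_travel_time(free_flow_minutes, volume, capacity, alpha=0.15, beta=4):
    """Classic BPR volume-delay function used in static assignment.
    It happily returns a finite travel time even when volume exceeds
    capacity -- the physical impossibility static models permit."""
    return free_flow_minutes * (1 + alpha * (volume / capacity) ** beta)

# Hypothetical link: 10-minute free-flow trip, capacity 4,000 vehicles/hour
for volume in (2000, 4000, 6000):  # 50%, 100%, and 150% of capacity
    minutes = bpr_travel_time(10, volume, 4000)
    print(f"{volume} veh/hr -> {minutes:.1f} minutes")
```

At 150 percent of capacity the curve simply reports a somewhat slower trip of about 18 minutes; a real link loaded that heavily would break down into queues spilling onto upstream segments, which is precisely Marshall’s point.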

There is a method to reduce this bias, called dynamic trip assignment, which adjusts travel volumes and routing based on modeled levels of congestion.  Portland Metro is developing some dynamic assignment techniques, but they weren’t used in preparing the forecasts for the Rose Quarter freeway widening project.
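The difference feedback makes can be sketched in a few lines. Suppose demand shrinks as travel times rise (the demand elasticity below is purely illustrative, not Metro’s): iterating between congestion and demand settles on a volume well below what an unconstrained static model would load onto the link.

```python
def equilibrium_volume(base_demand, capacity, free_flow=10.0,
                       elasticity=-1.0, iterations=50):
    """Toy demand-congestion feedback loop (illustrative only, not
    Metro's dynamic assignment).  Each pass, travel time rises with
    volume (BPR curve) and demand shrinks in response -- the feedback
    that static trip assignment omits entirely."""
    volume = base_demand
    for _ in range(iterations):
        minutes = free_flow * (1 + 0.15 * (volume / capacity) ** 4)
        target = base_demand * (minutes / free_flow) ** elasticity
        volume = 0.5 * (volume + target)  # damped update for stable convergence
    return volume

# A static model would load all 6,000 trips onto a 4,000 veh/hr link;
# with feedback, demand settles in the high 4,000s instead.
print(round(equilibrium_volume(6000, 4000)))
```

The specific numbers don’t matter; the structural point does: once travelers are allowed to respond to delay, the “endless gridlock” baseline disappears.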

The lemming model is pernicious for two reasons.  First, it presents a distorted view of what will happen under the no-build alternative.  The EA claims that there will be environmental benefits because we’ll avoid all of the congestion and pollution caused by motorist/lemmings jamming themselves into the Rose Quarter day after day.  But this claim is a fiction: in reality, traffic will never grow to the levels forecast in the model, because the model predicts more vehicles than the road can physically accommodate.  By exaggerating the level of traffic in the no-build scenario, the EA has created a false baseline for estimating the environmental effects of the widened freeway, thus disguising the effect of induced demand.  Because under static trip assignment capacity makes no difference to whether or when people travel, there can’t possibly be induced demand.  Induced demand is effectively assumed away.

The tendency to overestimate future traffic levels in mature travel corridors is also an endemic problem with the current methodology used to predict future transportation demand. After a careful review of the literature, the Government Accountability Office found:

. . . current travel demand models tend to predict unreasonably bad conditions in the absence of a proposed highway or transit investment. Travel forecasting, as previously discussed, does not contend well with land-use changes or effects on nearby roads or other transportation alternatives that result from transportation improvements or growing congestion. Before conditions get as bad as they are forecasted, people make other changes, such as residence or employment changes to avoid the excessive travel costs.

The weakness of transportation models in accurately predicting future traffic levels is a continuing problem. So it is not merely the Rose Quarter traffic projection model that is problematic; rather, the entire class of four-step models (trip generation, trip distribution, mode choice, and route assignment) has proved inaccurate in practice. After an exhaustive review of the state of the art, the Transportation Research Board of the National Academies wrote:

“. . . as has been true for the past four decades, these models could not provide accurate information to inform decision making on many transportation and land use policies or traffic operation projects.”
(Committee for Determination of the State of the Practice in Metropolitan Area Travel Forecasting, 2007)

While technology has allowed for faster computation, and more detailed mapping, they conclude:

“The practice of metropolitan travel forecasting has been resistant to fundamental change. Every 10 years or so there begins a cycle of research, innovation, resolve to put innovation into practice, and eventual failure to affect any appreciable change in how travel forecasting is practiced.”

(Committee for Determination of the State of the Practice in Metropolitan Area Travel Forecasting, 2007) pages 123-124.

Why wouldn’t highway departments implement a better method?  Well, as it turns out, the static trip assignment/lemming model tells them exactly what they want to hear:  If you don’t give us money to widen the highway, we’ll have endless gridlock.  It also “proves” that the expansion won’t create any induced demand. From the standpoint of highway builders, this is a convenient property for a model to have.

A model, any model, is only as good as the assumptions that go into it.  The fundamental assumption of static traffic assignment, that motorists behave like (fictional) lemmings, filing irrationally over a cliff (or into a gridlocked roadway, day after day), and that they don’t change their behavior in response to congestion, is dramatically wrong. It produces a gross over-estimate of how much roadways are likely to be congested, and paints capacity improvements in an unrealistic, favorable light, while structurally excluding the well-documented effects of induced demand.

Distorted images: Freeway widening is bad for pedestrians

The proposed I-5 Rose Quarter freeway widening project creates a bike- and pedestrian-hostile environment

The Oregon Department of Transportation has crafted distorted images that exaggerate pedestrian use by a factor of 200

As part of its effort to sell its $500 million project to widen Interstate 5 at the Rose Quarter, the Oregon Department of Transportation has prepared some computer generated renderings of what area streets will look like.  At first glance, they seem to buttress the notion that this will be a pedestrian and bike friendly environment, with lots of space devoted to non-car travel. Project staff are fond of repeating the words “36-foot wide multi-use path.”

But if you take a close look at the computer rendering, you’ll see some troubling design flaws that actually make this a bad place for people who aren’t in cars.  And if you turn a critical eye on the way the rendering is constructed, you’ll see that ODOT has intentionally distorted the perspective to make things seem rosier than they actually are.

Wrong Way Traffic

The first thing you should notice is the two-way street in the center of the diagram.  It’s got lanes of traffic on either side of a combined bike-pedestrian multi-use path.  But unlike every other road in the city of Portland or the state of Oregon, traffic is driving on the left-hand side of the road: it is a miniature diverging diamond. Diverging diamonds are used by traffic engineers explicitly to facilitate the faster movement of cars. But they’re disorienting to pedestrians and cyclists, who instinctively tend to look in the wrong direction to detect traffic.  That’s why diverging diamond intersections regularly have “Look Left” and “Look Right” stenciled on the pavement to warn pedestrians that traffic is coming from an unexpected direction.  There’s also a bus-only lane that appears to run on the sidewalk.

Two freeway on-ramps

Again, because this is a still image rather than a moving one, it isn’t clear what the cars on the wrong-way diverging diamond are doing.  Where they’re going, in both cases, is directly onto freeway on-ramps.  In both cases, the traffic light at the end of the diverging diamond (the southbound one is over Williams Avenue, immediately in the foreground; another one, not shown, is at the other end of the diverging diamond) is the last traffic signal before vehicles enter the I-5 freeway, both northbound and southbound.  Vehicles on the diverging diamond will be accelerating to freeway speeds.  So not only will the cyclists and pedestrians in the center of the diverging diamond be disoriented by heavy traffic going in the wrong direction, they’ll also be confronting drivers intent on accelerating to freeway speeds.  The arrows on the following chart show the movement of cars, both from the west and from the north, which will be accelerating onto the southbound I-5 freeway on-ramp.

 

High Speed Corners

Throughout Portland, as part of the city’s efforts to improve safety, the city is building curb-extensions, bumping the corner of intersections out further into the street, to shorten crossing distances, to make pedestrians more visible to drivers, and critically, to force drivers to slow down to make turns–because the curb extension sticks out further into the junction of two roads, it forces drivers to make a sharper, and therefore slower turn. These drawings are from Portland’s Pedestrian Design Guidelines, and the photo shows a curb extension in practice:

But look closely at the corners of this intersection–in every case they’ve been cut back and rounded off, moving the corner further away from the street–the exact opposite of curb extensions. The following illustration shows the portion of the corners of each of these intersections–the red triangles–that would be cut away to speed car turning movements.

By making these turns wider–in engineering parlance “increasing the radius of curvature”–this encourages cars to drive faster, increasing the hazard to pedestrians.  In addition, it makes pedestrians less visible, and increases crossing distances.  At each of these corners, pedestrians in crosswalks will be confronting fast moving cars.
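The effect of corner radius on turning speed is straightforward physics: comfortable speed grows with the square root of the radius of curvature. Here’s a rough sketch using a typical low-speed side-friction factor from highway design practice (the radii are hypothetical, not measured from the project drawings):

```python
import math

def comfortable_turn_speed_mph(radius_feet, side_friction=0.2):
    """Flat-curve speed from v^2 = f * g * r (no superelevation).
    A side-friction factor of ~0.2 is a typical low-speed design
    value; the radii below are illustrative assumptions."""
    g = 32.2  # gravitational acceleration, ft/s^2
    v_feet_per_sec = math.sqrt(side_friction * g * radius_feet)
    return v_feet_per_sec * 3600 / 5280  # convert ft/s to mph

for radius in (15, 30, 60):  # tight urban corner vs. cut-back, rounded corners
    print(f"radius {radius} ft -> ~{comfortable_turn_speed_mph(radius):.0f} mph")
```

Quadrupling the corner radius doubles the speed at which drivers can comfortably sweep through the turn, which is exactly why cut-back, rounded-off corners put pedestrians in crosswalks at greater risk.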

Distorted Images

While a close look at these renderings reveals some of the flaws in the design of this intersection, it’s important to remember that the image itself was crafted by the Oregon Department of Transportation, which made a number of editorial decisions that are clearly designed to influence the viewer.  Consider first the perspective and vantage point chosen for this image.  The viewer is floating 20 or 30 feet in the air, looking down at the bike path and sidewalk.  This doesn’t represent how actual humans will encounter and perceive the project. The foreground (which, due to the perspective, is shown much larger) is filled with bikes, pedestrians and green-painted spaces.  Nearly all of the car lanes are shown in the background and thereby made to seem much smaller. In a sense, that’s how perspective works, but the decision to put bike/pedestrian infrastructure in the foreground, and cars in the background, conveys the impression that one is big and the other small. Imagine, for example, that the cars were in the foreground and the bikes in the background–one would have a much different sense of how much space was devoted to cars as opposed to bike riders and pedestrians.

Invisible cars

Another distortion comes from what’s shown–and what isn’t shown–in the renderings.  This seems to be a very pedestrian-oriented space because ODOT has chosen to show only a handful of cars. The easiest way to detect bias in this image is simply to count things.  How many cyclists are there? How many pedestrians? How many cars?  We can do that.  Here are 25 pedestrians (and we didn’t count ones that weren’t shown on sidewalks).

 

In addition there are 8 cyclists.

But there are only 5 cars.

On one of the busiest intersections in Portland, at the convergence of two major north-south and east-west arterials, as well as the I-5 freeway, there are almost no cars.  But there are huge numbers of bike riders and pedestrians.  Bikes and pedestrians outnumber cars more than 6 to 1 in this illustration. That’s a wild distortion of reality.  Consider the actual evidence of who uses this intersection.  The City of Portland’s traffic counts, last taken on a typical afternoon in June, showed that in the afternoon rush hour, cars outnumbered pedestrians by about 40 to 1.  Here are the data:

The data show that between 4 and 6 pm, 4,228 cars and trucks traveled through this intersection and just 96 pedestrians: in round numbers, about 40 times as many cars as pedestrians used this intersection.  What that means is that ODOT’s rendering has distorted the actual use of this intersection by a factor of about 200.  The ODOT rendering shows five times as many pedestrians as cars.  The reality is that there are about 40 times as many cars as pedestrians.  This is distortion on an epic scale. If one were to take this image as a depiction of reality, you would have to wonder: why are we widening the freeway when there are virtually no cars in this neighborhood?
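The factor-of-200 figure follows from simple arithmetic on these counts (using the exact afternoon counts it comes to roughly 220, which rounds to about 200):

```python
# Counts of people visible in the ODOT rendering
rendered_pedestrians, rendered_cars = 25, 5
# Counts from the City of Portland's 4-6 pm intersection data
actual_cars, actual_pedestrians = 4228, 96

rendered_ratio = rendered_pedestrians / rendered_cars  # peds per car shown: 5
actual_ratio = actual_cars / actual_pedestrians        # cars per ped counted: ~44
distortion = rendered_ratio * actual_ratio             # ~220, i.e. roughly 200

print(f"rendering: {rendered_ratio:.0f} pedestrians per car; "
      f"reality: ~{actual_ratio:.0f} cars per pedestrian; "
      f"distortion factor ~{distortion:.0f}")
```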

Imaginary buildings

Like so many freeway interchanges, this is hardly a pleasant urban environment.  In addition to freeway on-ramps and desolate, unkempt patches of land next to the roads and abutments, the most prominent commercial uses are a gas station, a car dealership and a paint store.  But this image has improved the area by adding several imaginary wireframe buildings.  Of course, the project doesn’t actually involve building any buildings, and there are no plans to make the freeway covers strong enough to support buildings, but that hasn’t stopped ODOT from making its project look more urban and inviting by installing fictional buildings that simply aren’t there.

The renderings prepared by ODOT masquerade as objective information, but are deeply biased.  If you look closely, you can see the project’s flaws–wrong-way traffic, high speed corners, bus lanes running on the sidewalk, city streets functioning as acceleration lanes for freeway on-ramps. More tellingly, though, the renderings are drawn in a particular way to present a misleading view of the project and its impacts: the viewer is looking at the scene from a kind of video-game viewpoint floating in the air; perspective is used to make people and bikes look large and cars look small; non-existent buildings are conjured out of thin air to make the area seem more urban and inviting; and the entire scene is populated with people and bikes, while almost no cars are shown.

The Rose Quarter: ODOT’s Phony safety claims

There’s no evidence that widening the I-5 freeway at the Rose Quarter will reduce crashes.

ODOT used a model that doesn’t work for freeways with ramp-meters

When ODOT widened I-5 lanes and shoulders near Victory Boulevard in 2010, crash rates did not decline

Research shows interstate freeway shoulder widths aren’t correlated with crash rates

The marketing campaign for spending $1.45 billion to widen the I-5 Rose Quarter freeway heavily emphasizes safety–falsely claiming that this is a particularly dangerous roadway

Reducing crashes is a linchpin of the project in two ways, both to support an argument about safety, and also because the project claims that with fewer crashes there’ll be less “non-recurring delay.”  City and state officials concede that due to induced demand the project won’t do anything about daily recurring delay: any improvements in the regular flow of traffic will simply prompt more people (who now avoid the roadway because of congestion) to try to use it, with the result that it will be just as slow in the future as it is now, even after spending a billion dollars or so.

ODOT incorrectly applied the ISATe model to I-5, ignoring the effect of ramp meters.

The project claims that widening the freeway will reduce crashes.  How do they know?  Well, according to the project’s “Transportation Safety Technical Report,” the estimates were generated by a tool called “ISATe”.  Despite the impressive moniker, ISATe is just an Excel spreadsheet.

The AASHTO HSM (AASHTO 2010) predictive method for highways and interchanges using the software ISATe was applied to estimate the relative safety performance of the proposed Project. The method was applied without calibration factors, so the results are presented as relative differences rather than absolute predictions. The models are applied on a segment-by-segment basis, and segments are defined to have consistent geometric characteristics. The largest safety benefit results from upgrading shoulders to full standard on both sides of the highway.

(Emphasis added)

This statement reveals a number of important points.  First, they didn’t calibrate the ISATe model to existing crash rates on the roadway.  ISATe is based on a rough approximation of factors for a wide range of roadways, and produces relative, rather than absolute, values.

The report specifically says that the methodology doesn’t account for the effects of ramp metering.  Ramp meters have already been installed on this portion of the freeway.

And if you read the ISATe technical manual, there’s a section that explains the limitations of the model, helpfully labeled “Limitations of the Predictive Methods.” The manual says that the ISATe spreadsheet doesn’t estimate crashes for freeways that have ramp meters.

Bonneson, J., Pratt, M., and Geedipally, S., (et al), Enhanced Interchange Safety Analysis Tool: User Manual, National Cooperative Highway Research Program, Project 17-45, Enhanced Safety Prediction Methodology and Analysis Tool for Freeways and Interchanges, May 2012.

This is important because a key reason for ramp meters is to smooth the flow of traffic and reduce the merge conflicts that would otherwise result in an increase in crashes. What ODOT has done is use a tool that assumes there are no ramp meters to estimate the number of crashes on a road that has long had — and will continue to have — ramp meters.

Consequently, by inappropriately applying the ISATe spreadsheet to a roadway that’s already had ramp metering installed, ODOT has overstated the base level of crashes on the road, and overstated the likely reduction in crashes from wider shoulders.  According to the US Department of Transportation, ramp metering on Portland freeways has already produced a 43 percent decline in crashes.

Because the ISATe model doesn’t take account of the actual presence of ramp meters, it over-estimates the reduction in crashes that might be attributable to adding lanes and shoulders to this stretch of I-5.  In essence, the installation of ramp meters here has already accomplished some, if not most of the supposed crash reduction benefit that would be achieved by widening the freeway.

ODOT’s own experience on I-5 shows wider shoulders didn’t decrease crash rates

So rather than looking at a spreadsheet-based model that, according to its own user manual, isn’t capable of accurately estimating crash rates on this kind of freeway, it makes more sense to look at the actual experience with freeway widenings on Interstate 5.  As we’ve pointed out at City Observatory, ODOT has already helpfully run this experiment. It widened a mile-long stretch of I-5 just north of the Rose Quarter in 2010, between Lombard Street and Victory Boulevard, adding lanes and widening shoulders.  That stretch of road is used by essentially the same drivers who travel through the Rose Quarter. It is, in a sense, a perfect natural experiment on whether wider freeways reduce crashes.

And what ODOT’s own data show is that after widening I-5, crash rates were higher than before the freeway widening. The data are shown below.  Crashes spiked during the construction period (2010; gray bar), which is a common occurrence.  But after project construction was completed, from 2011 onward, crashes were consistently about 10 percent higher than they had been prior to construction.  Far from reducing crashes, freeway widening actually increased them.

Research shows wider shoulders aren’t statistically associated with fewer crashes on Interstate Freeways

There’s actually no valid statistical evidence that shoulders wider than 6 feet on divided roads produce any additional safety benefit.

The same could be observed for shoulder width, where the crash rate decreases up to 6 ft and then varies as the shoulder becomes wider. These trends are simple observations, and statistical tests were not conducted to determine their statistical significance. (page 21)

Stamatiadis, N., Pigman, J., Sacksteder, J., Ruff, W., & Lord, D. (2009). NCHRP Report 633: Impact of shoulder width and median width on safety. Transportation Research Board of the National Academies, Washington, DC.

Other research has concluded that there is no statistically significant relationship between outside shoulder width and crash frequency on Interstate highways.

Highway geometric factors are found to have variable impacts on highway safety (across accident severity level and road class). For example, it is found that widening outside shoulder by 1% will have no impact on Interstate highways  . . .

Chen, S., Saeed, T. U., Alinizzi, M., Lavrenz, S., & Labi, S. (2019). Safety sensitivity to roadway characteristics: A comparison across highway classes. Accident Analysis & Prevention, 123, 39-50.

In short, there’s no evidence that the Rose Quarter freeway widening will improve safety:

  • ODOT used a model (ISATe) that doesn’t apply to this freeway, which has ramp meters.
  • ODOT’s own experience with widening the I-5 freeway and shoulders resulted in an increase in crash rates, not a decline
  • The scientific literature concludes that lane and shoulder width is a statistically insignificant factor in determining crash rates.

 

The Hidden Rose Quarter MegaFreeway

ODOT is really building an 8-lane mega-freeway at the Rose Quarter

You can tell from the tortured rhetoric about “auxiliary” lanes that the Oregon Department of Transportation is falling all over itself to make the freeway widening project it has proposed for the I-5 Rose Quarter area seem absolutely as small as possible.  They maintain that they’re not actually widening the roadway at all, just adding “ramp to ramp connections.”  In reality, however, ODOT is engineering a right-of-way that can easily hold an 8-lane freeway–effectively doubling the size of the current Interstate 5.

How big is it?

The public descriptions of the Rose Quarter freeway project go to great lengths to minimize the size of the project.  As we’ve noted, the project’s Environmental Assessment insists that the lanes it’s adding to the I-5 freeway aren’t actual freeway lanes, but an entirely different and environmentally benign creature called an auxiliary lane.

But for a moment, set aside all this talk about counting lanes or what they’re called.  How big a footprint will the widened I-5 freeway have?  That fact is carefully concealed from view in the main Environmental Assessment report, but if you dig through the appendices, you’ll find one diagram in the project’s Right-of-Way analysis that hints at the freeway’s actual footprint.

This chart shows a representative cross section of the I-5 freeway at the Rose Quarter, as currently built (top) and as proposed to be widened by ODOT (bottom).  The fine print at the bottom of each schematic shows the width, in feet, of each lane and shoulder.  In the proposed widening, there are six 12-foot traffic lanes (three in each direction: two labeled “thru” lanes and one labeled “auxiliary”), plus 12-foot shoulders on the left and right of the traffic lanes in each direction.  We assume the median is five feet wide.  That works out to 125 feet ((6*12)+(4*12)+5=125)–roughly the 126-foot total width shown in the schematic.

If you look at that diagram, there’s a lot of space in that 126 feet relative to the size of the illustrated cars and trucks:  48 feet is dedicated to shoulders, which is a lot for a busy urban highway built on extremely expensive land.  (The Right of Way Report notes that land near the freeway is worth as much as $150 to $200 per square foot, and that land acquisition for the project will require a total budget of $55 million.)

How many lanes can you fit in a 126-foot right of way?

So instead of arguing about what we call lanes, let’s ask a basic question:  how many freeway lanes can you fit in an urban, 126-foot right-of-way?  The answer can easily be found by simply looking at the way ODOT stripes freeway lanes and shoulders in Portland.  According to ODOT’s lane report data, this section of the Interstate 84 freeway, just east of 12th Avenue, has 3-foot left shoulders and 7-foot right shoulders.  (Highway 2/Columbia River Highway is ODOT’s designation for Interstate 84.  LN1 through LN3 correspond to travel lanes; LS and RS are left and right shoulder width, with all data presented in feet.  Data are shown for I-84 from NE 12th Avenue to NE 21st Avenue, which corresponds approximately to the area illustrated in the photograph below.)

If ODOT were to simply re-stripe its widened 126-foot Rose Quarter right-of-way in exactly the same fashion it has striped Interstate 84, less than a mile away, it could easily fit eight full-sized travel lanes in the space it is planning to build.  Four 12-foot travel lanes (48 feet), plus a 3-foot left shoulder and a 7-foot right shoulder (10 feet of shoulders), equal 58 feet of width in each direction.  Allowing for a 5-foot median, an eight-lane freeway would use 121 feet ((58*2)+5=121).  As noted above, the Environmental Assessment schematic shows 126 feet of total right of way.
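The cross-section arithmetic above can be checked in a few lines of Python.  The lane and shoulder widths come from the EA schematic and ODOT’s I-84 lane report; the 5-foot median is this article’s assumption:

```python
MEDIAN_FT = 5  # assumed median width, per the article

def total_width(lanes_per_direction, lane_ft, left_shoulder_ft, right_shoulder_ft):
    """Total right-of-way consumed by a symmetric freeway cross section."""
    one_direction = lanes_per_direction * lane_ft + left_shoulder_ft + right_shoulder_ft
    return 2 * one_direction + MEDIAN_FT

# ODOT's proposed Rose Quarter striping: 3 lanes each way, 12-foot shoulders.
proposed = total_width(3, 12, 12, 12)   # 125 feet

# The same pavement striped like I-84: 4 lanes, 3-foot left / 7-foot right shoulders.
restriped = total_width(4, 12, 3, 7)    # 121 feet

print(proposed, restriped)  # prints: 125 121
```

Either striping scheme fits comfortably within the roughly 126-foot cross section shown in the Environmental Assessment; only the paint differs.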

Despite the protestations that 12-foot left and right shoulders are “standard,” ODOT’s own practice–in this exact area–makes it clear that isn’t true. Not only does ODOT build and operate urban freeways with narrower shoulders (and consequently more lanes); the US Department of Transportation holds out these design standards as a best practice for other cities and states to follow in providing more freeway capacity.

Portland’s I-84 is actually touted by the US Department of Transportation as a poster child for narrower shoulders.  Here is a page from the USDOT report, “Use of Narrow Lanes and Narrow Shoulders on Freeways: A Primer on Experiences, Current Practice, and Implementation Considerations” (FHWA-HOP-16-060).  The narrow shoulders on I-84 are also featured on the cover of the document.


Build now, paint later

It’s pretty easy to see what’s likely to happen here, if this project is allowed to go ahead.  ODOT will widen the right of way and, on opening day, stripe it for six lanes (it will probably stop calling them auxiliary lanes just as soon as it breaks ground).  It will let a decent interval (six months or a year) pass, and in all likelihood, traffic will be just as bad, if not worse than it is today.  Then Oregon DOT–which will be operating under a new director–will note that the previous administration underestimated the challenge at the Rose Quarter, and will “discover” that there’s actually room to fit in a couple more travel lanes.  And ODOT will fire up the paint truck.

All that stands between the proposed Rose Quarter freeway widening project that’s depicted in the computer renderings of the Environmental Assessment and an 8-lane freeway is a few hundred gallons of Federal Spec TTP-1952b traffic marking paint.  A few hours with a lane-marking machine would essentially double the capacity of the current I-5 freeway.  And nothing in the Environmental Assessment examines what the effects of such a widening would be, in terms of added car traffic from induced demand, air pollution and carbon emissions. Which brings us to the National Environmental Policy Act (NEPA).  NEPA requires that ODOT disclose the cumulative and reasonably foreseeable future consequences of its actions. This provision is designed to prevent agencies from breaking otherwise environmentally significant actions into smaller, less impactful pieces that can individually escape review, but which cumulatively would affect the environment.

If striped like I-84, a widened Rose Quarter I-5 could accommodate four travel lanes in each direction.

In NEPA terms, the possibility that the Oregon Department of Transportation could come back, a few months or a few years after it’s spent $500 million on the concrete and structures for its 126-foot right-of-way, and simply restripe the freeway to add capacity, is “reasonably foreseeable.”  That means that rather than just looking at the project with the stripes as proposed, the Environmental Assessment should present the likely consequences of building a road that could easily accommodate one more full travel lane in each direction than it has considered.  Alternatively, the Environmental Assessment ought to examine the alternative of building a narrower physical right-of-way.  The freeway right of way could be at least 20 feet narrower–and all of the physical impacts of widening on the community, including land takings (especially onto the site of Tubman Middle School), could be reduced–while achieving the same traffic flow that ODOT has claimed is needed.  Considering just this kind of alternative is exactly what the NEPA process is designed to do.  By treating 12-foot inside and outside shoulders as an unquestioned and unquestionable assumption, ODOT has built in the room to easily widen the freeway, concealed the likely long-term environmental effects of that decision, and sidestepped its responsibility under NEPA to consider alternatives that would reasonably have smaller environmental impacts.

We’ve been down this road before

But would a state Department of Transportation so brazenly and cynically manipulate its environmental disclosures to hide its true intent?  Surely, if it really wanted to widen I-5 to 8 lanes through the Rose Quarter, ODOT would level with the public, wouldn’t it?

In fact, the Oregon Department of Transportation has a demonstrated track record of concealing the actual number of lanes it plans to build when it widens freeways, and specifically, the Interstate 5 freeway in Portland. In the Final Environmental Impact Statement of the Columbia River Crossing, issued in 2011, ODOT was always careful to refer to the new bridge and widened freeway it was proposing as having 10 lanes, 5 lanes in each direction.

On paper, this was a reduction from the 12-lane bridge and freeway put forward in the project’s Draft Environmental Impact Statement (DEIS), issued in 2008.  After the issuance of the DEIS, the City of Portland and other regional leaders sought to reduce the project’s environmental impact by reducing the number of proposed automobile lanes.  In 2010, at Mayor Sam Adams’ request, the Project Sponsors Council for the CRC directed ODOT to reduce the number of lanes on the CRC from 12 to 10.  Let’s turn the microphone over to Oregon Public Broadcasting‘s Kristian Foden-Vencil:

Leaders of cities, ports and transportation districts around Portland, agreed Monday on key aspects of a new I-5 bridge over the Columbia River. They agreed the new Columbia River Crossing ought to be 10 lanes wide — not 12. . . .  Portland Mayor Sam Adams, called the vote a significant step forward.
“Portland-Area Leaders Agree On Aspects Of New I-5 Bridge” OPB News, August 9, 2010

ODOT dutifully complied with this direction. When the project’s final environmental impact statement (FEIS) was released, all the references were to 5 lanes.  But the document and all its appendices had been carefully purged of any references to the actual physical width of the structure they were going to build.  Here’s the illustration of the freeway bridge’s cross-section from the project FEIS.  The CRC was to consist of two parallel, double-decker bridges, with the freeway carried on the top deck. The FEIS shows five travel lanes on each structure–for a total of ten.  Three lanes in each direction are labeled “through” lanes, and two lanes are labeled “add-drop” lanes (an earlier version of ODOT’s Orwellian naming conventions).  Notice that this diagram lacks any dimensions, and carries the cryptic disclaimer “Not to scale.”
Columbia River Crossing, Final Environmental Impact Statement, page S-19.
Not to scale, indeed. It took a Freedom of Information Act request to get the Oregon Department of Transportation to disgorge the cross-sectional schematics showing the bridge profile with actual dimensions.  They showed the bridge was going to be built to exactly the same physical width as originally proposed.  This diagram shows a close-up view of a cross-section of one of the two spans; the deck is angled slightly due to the horizontal curvature of the bridge as it crosses the river.  The deck of each of these two bridges is 90 feet edge-to-edge, 180 feet in all–enough room on each span for six 12-foot travel lanes and 18 feet of highway shoulders–exactly what ODOT included in the DEIS.  So while it played up the proposal’s five-lane paint scheme, ODOT charged ahead with concealed plans to build exactly the same physical structure as before.
CRC Bridge Cross Section Schematic, (Public Record Request #D00482).
Then, just as now, ODOT conspicuously told the public and regional leaders what they wanted to hear about the number of lanes, as a way of assuaging environmental concerns, while quietly–even secretly–going ahead with plans to engineer a structure and right of way that provides space for the number of lanes it wants.

Traffic is declining at the Rose Quarter: ODOT growth projections are fiction

ODOT’s own traffic data show that average daily traffic (ADT) has been declining for 25 years, at a rate of 0.55 percent per year

ODOT’s modeling inexplicably predicts that traffic will suddenly reverse course and grow by 0.68 percent per year through 2045

ODOT’s modeling falsely claims that traffic will be the same regardless of whether the I-5 freeway is expanded, contrary to the established science of induced travel.

These ADT statistics aren’t contained in the project’s traffic reports, but can be calculated from data contained in its safety analysis.

ODOT has violated its own standards for documenting traffic projections, and violated national standards for maintaining integrity of traffic projections.

As we’ve noted, the Oregon Department of Transportation claims it needs to widen the I-5 freeway to meet growing traffic demand.  But the agency has been utterly opaque in presenting its data and describing the methodology it used to reach that conclusion.  Neither its draft nor its supplemental environmental analyses contain any presentation of “average daily traffic,” present or future–and ADT is the most basic and widely used measure of traffic.  Asked about its methodology, the agency makes vague references to a 40-year-old traffic forecasting handbook.  This opaque approach to fundamental data violates both NEPA and the agency’s own standards for professional and ethical practice.

At City Observatory, we’ve found ODOT data—material not included in the project’s Traffic Technical Report—which show that traffic levels on I-5 in the Rose Quarter have been declining for a quarter century, and that the project is assuming that the next twenty five years will suddenly produce a surge in traffic.

Traffic Trends in the Rose Quarter

ODOT maintains a traffic counting program that tracks the number of vehicles traveling on every major roadway in Oregon.  We examined the on-line versions of ODOT’s Traffic Volume Tables for the years 1996 through 2019 (i.e., the last full year prior to the Covid-19 pandemic) to establish the historical trend in traffic.  We looked at data for the section of I-5 at NE Holladay Street (in the middle of the proposed I-5 Rose Quarter freeway widening project).  In 2019, the average daily traffic (ADT) in this section of the roadway was 118,900 vehicles. That volume of traffic was lower than in every year from 1996 through 2017.  As the following graph makes clear, the long-term pattern in traffic on this portion of I-5 is downward.

From 1996 through 2019, the average annual change in traffic on this stretch of I-5 was -0.55 percent per year.  That downward trend also holds for the last decade:  from 2009 through 2019, the average annual change in traffic was -0.30 percent per year.  Contrary to the claims of ODOT that this stretch of freeway is seeing more traffic, traffic has actually been decreasing.  Nothing in the historic record provides any basis for a claim that traffic on this section of freeway will increase if its capacity is not expanded.

ODOT’s Hidden ADT forecast

As we’ve noted multiple times at City Observatory, the traffic reports contained in ODOT’s Environmental Assessment (EA) and Supplemental Environmental Assessment (SEA) do not contain any references to average daily traffic.  The project’s safety report, however, does contain an indirect reference to it.  That report uses a safety handbook to compute the number of crashes that are likely to occur on this section of freeway.  The handbook method works by applying a crash rate–a certain number of crashes per million vehicle miles traveled–to a stretch of roadway.  While ODOT’s safety report doesn’t reveal the ADT figures used, we can use algebra to recover them.  ODOT reports the number of crashes, the crash rate per million vehicle miles traveled, and the length of the freeway segment included in the analysis.  From these three pieces of information, we can work out that average daily traffic for this section of the I-5 freeway is estimated to be 142,000 in 2045.  Here are our calculations:

We estimate the implied annual vehicle miles traveled (in millions) by dividing the predicted crashes by the predicted crash rate.  We then estimate daily vehicle miles traveled by dividing annual VMT by 365.  Finally, we compute average daily traffic by dividing daily VMT by the number of miles in the subject roadway segment (1.4).  This produces an estimate of approximately 142,000 ADT for this segment of I-5 in 2045.

This future traffic level forecast implies that traffic will increase by about 22,000 vehicles per day between 2019 and 2045.  This works out to an annual growth rate of 0.68 percent.  In essence, ODOT is predicting that even though traffic has declined by 0.55 percent per year for the past quarter century, it will over the next quarter century grow by 0.68 percent per year.  How or why this is likely to happen is never explained or documented in the EA or Supplemental EA.
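The back-calculation and the implied growth rate can be sketched in a few lines of Python.  The 118,900 and 142,000 ADT figures, the 1.4-mile segment length, and the 2019 and 2045 dates come from the discussion above; the crash count and crash rate below are hypothetical placeholders chosen only to illustrate the algebra, since ODOT’s report buries the actual values:

```python
def implied_adt(predicted_crashes, crash_rate_per_mvmt, segment_miles):
    """Recover average daily traffic from a crash forecast:
    crashes / (crashes per million VMT) = annual VMT, in millions."""
    annual_vmt = (predicted_crashes / crash_rate_per_mvmt) * 1_000_000
    daily_vmt = annual_vmt / 365
    return daily_vmt / segment_miles

def annual_growth(start_adt, end_adt, years):
    """Compound average annual growth rate between two ADT observations."""
    return (end_adt / start_adt) ** (1 / years) - 1

# Hypothetical inputs: 72.6 predicted crashes per year at 1.0 crashes per
# million VMT on the 1.4-mile segment (values chosen for illustration only).
adt_2045 = implied_adt(72.6, 1.0, 1.4)                 # roughly 142,000 vehicles/day

# Growth implied by the 2045 forecast relative to the 2019 count of 118,900.
growth = annual_growth(118_900, 142_000, 2045 - 2019)  # roughly 0.68% per year
```

Applied to the historical record (118,900 ADT in 2019 against the higher counts of the 1990s), the same `annual_growth` function yields the negative 0.55 percent annual trend the article describes.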

Also, as we’ve noted before, ODOT’s traffic forecast for the I-5 Rose Quarter, according to its safety analysis, is for the same level of traffic in 2045 in both the Build and the No-Build scenario.  This claim flies in the face of what we know about the science of induced travel:  the construction of additional lanes creates added capacity, which will induce more trips in the corridor—if the project is built, but not if it isn’t. The prediction of higher traffic levels for a build scenario may be plausible, but for a no-build scenario it is unlikely that there would be any traffic growth, especially given the established 25-year trend of declining traffic on this portion of I-5.  By over-estimating travel in the No-Build scenario, the EA conceals the increased driving, pollution and likely crashes from widening the freeway.

In theory, the National Environmental Policy Act is all about disclosing facts. But in practice, that isn’t always how it works out. The structure and content of the environmental review is in the hands of the agency proposing the project, in this case the proposed $1.45 billion widening of the I-5 Rose Quarter freeway in Portland. The Oregon Department of Transportation and the Federal Highway Administration have already decided what they want to do: Now they’re writing a set of environmental documents that are designed to put it in the best possible light. And in doing so, they’re keeping the public in the dark about the most basic facts about the project.  In the case of the I-5 project, they haven’t told us how many vehicles are going to use the new wider freeway they’re going to build.

No New Regional Modeling

The Traffic Technical Report of the Supplemental Environmental Assessment makes it clear that ODOT has done nothing to update the earlier regional-scale traffic modeling it did for the 2019 Environmental Assessment.  ODOT claims it used Metro’s Regional Travel Demand Model to generate its traffic forecasts for the I-5 freeway–but it has never published that model’s assumptions or results.  And in the three years since the original report was published, it has done nothing to revisit that modeling.  The traffic technical report says:

. . . the travel demand models used in the development of future traffic volumes incorporated into the 2019 Traffic Analysis Technical Report are still valid to be used for this analysis.

What ODOT has never done is “show its work”–i.e., demonstrate how its estimates and modeling turned a highway that saw traffic decline by 0.55 percent per year for a quarter century into one where we should expect 0.68 percent increases in traffic every year for the next quarter century.  In its comments on the 2019 EA, a group of technical experts pointed out a series of problems with that modeling.  Because ODOT made no effort to update or correct its regional modeling, all of those same problems pervade the modeling in the new traffic technical report.

Hiding results

ODOT has gone to great lengths to conceal its actual estimates of future average daily traffic on I-5.  Neither the 2019 Environmental Assessment’s nor the 2022 Supplemental Environmental Assessment’s traffic reports contain any mention of average daily traffic.  As illustrated above, ODOT also expunged the actual ADT statistics from the safety report (even though its calculations make it clear that it is assuming this stretch of freeway will have 142,000 ADT).  And in what is presented, ODOT has purposefully excluded any indication of actual traffic levels.  The supplemental traffic analysis presents its results in a way that appears designed to obscure, rather than reveal, facts.  Here is a principal table comparing the No-Build and Build designs.

Notice that the tables do not report actual traffic volumes, either daily (ADT) or hourly.  Instead, the table reports the “V/C” (volume to capacity) ratio.  But because it reveals neither the volume nor the capacity, readers are left completely in the dark as to how many vehicles travel through the area in the two scenarios.  This matters because widening the freeway increases roadway capacity; with neither volume nor capacity disclosed, it’s impossible for an outside observer to discern how many more vehicles the project anticipates moving through the area.  This, in turn, is essential to understanding the project’s traffic and environmental impacts.  It seems likely that the model commits the common error of forecasting traffic volumes in excess of capacity (e.g., between I-84 and Weidler) in the No-Build.  As documented by modeling expert Norm Marshall, predicted volumes in excess of capacity are symptomatic of modeling error.
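The problem with publishing only a ratio is easy to demonstrate: the same V/C value is consistent with wildly different traffic volumes, depending on the capacity assumed.  In this sketch both the 0.95 ratio and the 2,000 vehicles-per-hour-per-lane capacity are hypothetical round numbers, not figures from ODOT’s report:

```python
def implied_volume(vc_ratio, lanes, capacity_per_lane_vph=2_000):
    """Traffic volume implied by a published V/C ratio, given an assumed capacity."""
    return vc_ratio * lanes * capacity_per_lane_vph

# The same published ratio of 0.95 implies very different volumes:
for lanes in (2, 3, 4):
    vol = implied_volume(0.95, lanes)
    print(f"{lanes} lanes: V/C of 0.95 implies {vol:,.0f} vehicles/hour")
```

A reader handed only the 0.95 ratio cannot tell 3,800 vehicles per hour from 7,600–which is exactly the ambiguity the table creates by omitting both volume and capacity.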

ODOT is violating its own standards and professional standards by failing to document these basic facts

The material provided in the traffic technical report is so cryptic, truncated and incomplete that it is impossible to observe key outputs or determine how they were produced.  This is not merely sloppy work; it is a clear violation of professional practice in modeling.  ODOT’s own Analysis Procedures Manual (which spells out how ODOT analyzes traffic data to plan for highway projects like the Rose Quarter) states that the details need to be fully displayed:

6.2.3 Documentation
It is critical that after every step in the DHV [design hour volume] process that all of the assumptions and factors are carefully documented, preferably on the graphical figures themselves. While the existing year volume development is relatively similar across types of studies, the future year volume development can go in a number of different directions with varying amounts of documentation needed. Growth factors, trip generation, land use changes are some of the items that need to be documented. If all is documented then anyone can easily review the work or pick up on it quickly without questioning what the assumptions were. The documentation figures will eventually end up in the final report or in the technical appendix.
 The volume documentation should include:
• Figures/spreadsheets showing starting volumes (30 HV)
• Figures/spreadsheets showing growth factors, cumulative analysis factors, or travel demand model post-processing.
• Figures/spreadsheets showing unbalanced DHV
• Figure(s) showing balanced future year DHV. See Exhibit 6-1
• Notes on how future volumes were developed:
    o If historic trends were used, cite the source.
    o If the cumulative method was used, include a land use map, information that documents trip generation, distribution, assignment, in-process trips, and through movement (or background) growth.
    o If a travel demand model was used, post-processing methods should be specified, model scenario assumptions described, and the base and future year model runs should be attached

This is also essential to personal integrity in forecasting.  The American Association of State Highway and Transportation Officials publishes a manual to guide its member agencies (including ODOT) in the preparation of highway forecasts, and it has specific direction on personal integrity in forecasting.  The National Cooperative Highway Research Program’s NCHRP Report 765, “Analytical Travel Forecasting Approaches for Project-Level Planning and Design”–which ODOT claims provides its methodology–states:

It is critical that the analyst maintain personal integrity. Integrity can be maintained by working closely with management and colleagues to provide a truthful forecast, including a frank discussion of the forecast’s limitations. Providing transparency in methods, computations, and results is essential. . . .  The analyst should document the key assumptions that underlie a forecast and conduct validation tests, sensitivity tests, and scenario tests—making sure that the results of those tests are available to anyone who wants to know more about potential errors in the forecasts.


Appendix:  Expert Panel Critique of Rose Quarter Modeling

RQ Model Critique

Updated December 2, 2023

The black box: Hiding the facts about freeway widening

State DOT officials have crafted a Supplemental Environmental Assessment that conceals more than it reveals

The Rose Quarter traffic report contains no data on “average daily traffic,” the most common measure of vehicle travel

Three and a half years later, ODOT’s Rose Quarter traffic modeling is still a closely guarded secret

The new SEA makes no changes to the regional traffic modeling done for the 2019 EA, which was done 7 years ago in 2015

The report misleadingly cites “volume to capacity ratios” without revealing either volumes or capacities

ODOT has violated its own standards for documenting traffic projections, and violated national standards for maintaining integrity of traffic projections.

In theory, the National Environmental Policy Act is all about disclosing facts. But in practice, that isn’t always how it works out. The structure and content of the environmental review is in the hands of the agency proposing the project, in this case the proposed $500 million widening of the I-5 Rose Quarter freeway in Portland. The Oregon Department of Transportation and the Federal Highway Administration have already decided what they want to do: Now they’re writing a set of environmental documents that are designed to put it in the best possible light. And in doing so, they’re keeping the public in the dark about the most basic facts about the project.  In the case of the I-5 project, they haven’t told us how many vehicles are going to use the new wider freeway they’re going to build.

A traffic report without ADT is like a financial report without $

To take just one prominent example, the project’s “Traffic Technical Report”–which purports to discuss how the project will affect the flow of vehicles on the freeway, which after all is the project’s purpose–conspicuously omits the most common and widely used metric of traffic volume:  average daily traffic, or ADT.

How common is ADT?  It’s basically the standard yardstick for describing traffic. ODOT uses it to decide how wide roads should be.  It’s the denominator in calculating road safety. Average daily traffic is also, not incidentally, the single most important variable in calculating how much carbon and other air pollutants cars will emit when they drive on this section of road. ODOT maintains a complicated system of recording stations and estimation that tracks traffic for thousands of highway segments. ODOT’s annual report, Traffic Volume Trends, details average daily traffic for about 3,800 road segments statewide.  It also turns out that predicted future ADT is an essential input into the crash modeling software that ODOT used to predict crash rates on the freeway (“ADT” appears 141 times in the model’s user manual). ODOT uses ADT numbers throughout the agency: Google reports that the Oregon DOT website has about 1,300 documents with the term “ADT” and nearly 1,000 with the term “average daily traffic.” Chapter 5 of ODOT’s Analysis Procedure Manual, last updated in July 2018, contains 124 references to the term “ADT” in just 55 pages. “Average daily traffic” is as fundamental to describing traffic as degrees Fahrenheit is to a weather report.

Three and a half years ago, we searched the original “Traffic Analysis Technical Report” prepared for the Rose Quarter project, scanning the PDF file of The Rose Quarter I-5 Traffic Analysis Technical Report for the term “ADT.”  It had absolutely no references to ADT.


ODOT just made public its “Traffic Analysis Supplemental Technical Report” for the Rose Quarter project’s Supplemental Environmental Assessment.  We repeated our experiment and found . . . still no references to average daily traffic.


ADT is just the very prominent tip of a much larger missing-data iceberg.  There’s much, much more baked into the assumptions and models used to create the estimates in the report that simply isn’t visible or documented anywhere in the Environmental Assessment or its cryptic and incomplete appendices.  The advocacy group No More Freeways has identified a series of documents and data series that are missing from the report and its appendices, and has filed a formal request to obtain this data. In March 2019, just prior to the expiration of the public comment period, ODOT provided fragmentary and incomplete data on some hourly travel volumes, but again, no average daily traffic data. The new traffic technical report does not contain this data or updates of it.

No New Regional Modeling

The Traffic technical report makes it clear that ODOT has done nothing to update the earlier regional scale traffic modeling it did for the project.  ODOT claims it used Metro’s Regional Travel Demand Model to generate its traffic forecasts for the i-5 freeway—but it has never published that models assumptions or results.  And in the three years since the original report was published, it has done nothing to revisit that modeling.  The traffic technical report says:

. . . the travel demand models used in the development of future traffic volumes incorporated into the 2019 Traffic Analysis Technical Report are still valid to be used for this analysis.

In its comments on the 2019 EA, a group of technical experts pointed out a series of problems with that modeling.  Because ODOT made no effort to update or correct its regional modeling, all of those same problems pervade the modeling in the new traffic technical report.

Hiding results

The supplemental traffic analysis presents its results in a way that appears designed to obscure, rather than reveal facts.  Here is a principal table comparing the No-Build and Build designs.

x

Notice that the tables do not report actual traffic volumes, either daily (ADT) or hourly volumes.  Instead, the table reports the “V/C” (volume to capacity) ratio.  But because it reveals neither the volume, nor the capacity, readers are left completely in the dark as to how many vehicles travel through the area in the two scenarios.  This is important because the widening of the freeway increases roadway capacity, but because ODOT reveals neither the volume nor the capacity, it’s impossible for an outside observer to discern how many more vehicles the project anticipates moving through the area.  This, in turn, is essential to understanding the project’s traffic and environmental impacts.  It seems likely that the model commits the common error of forecasting traffic volumes in excess of capacity (i.e. between I-84 and Weidler) in the No-Build.  As documented by modeling expert Norm Marshall, predicted volumes in excess of capacity are symptomatic of modeling error.

ODOT is violating its own standards and professional standards by failing to document these basic facts

The material provided in the traffic technical report is so cryptic, truncated and incomplete that it is impossible to observe key outputs or determine how they were produced.  This is not merely sloppy work. This is a clear violation of professional practice in modeling.  ODOT’s own Analysis Procedures Manual (which spells out how ODOT will analyze traffic data to plan for highway projects like the Rose Quarter, states that the details need to be fully displayed:

6.2.3 Documentation
It is critical that after every step in the DHV [design hour volume] process that all of the assumptions and factors are carefully documented, preferably on the graphical figures themselves. While the existing year volume development is relatively similar across types of studies, the future year volume development can go in a number of different directions with varying amounts of documentation needed. Growth factors, trip generation, land use changes are some of the items that need to be documented. If all is documented then anyone can easily review the work or pick up on it quickly without questioning what the assumptions were. The documentation figures will eventually end up in the final report or in the technical appendix.
 The volume documentation should include:
• Figures/spreadsheets showing starting volumes (30 HV)
• Figures/spreadsheets showing growth factors, cumulative analysis factors, or travel demand model post-processing.
• Figures/spreadsheets showing unbalanced DHV
• Figure(s) showing balanced future year DHV. See Exhibit 6-1
• Notes on how future volumes were developed:
    o If historic trends were used, cite the source.
    o If the cumulative method was used, include a land use map, information that documents trip generation, distribution, assignment, in-process trips, and through movement (or background) growth.
    o If a travel demand model was used, post-processing methods should be specified, model scenario assumptions described, and the base and future year model runs should be attached.

Documentation is also essential to personal integrity in forecasting.  The American Association of State Highway and Transportation Officials publishes a manual to guide its member agencies (including ODOT) in preparing highway forecasts, and it gives specific direction on personal integrity.  NCHRP Report #765, “Analytical Travel Forecasting Approaches for Project-Level Planning and Design” (which ODOT claims provides its methodology), states:

It is critical that the analyst maintain personal integrity. Integrity can be maintained by working closely with management and colleagues to provide a truthful forecast, including a frank discussion of the forecast’s limitations. Providing transparency in methods, computations, and results is essential. . . .  The analyst should document the key assumptions that underlie a forecast and conduct validation tests, sensitivity tests, and scenario tests—making sure that the results of those tests are available to anyone who wants to know more about potential errors in the forecasts.

ODOT:  A history of hiding facts from the public

This shouldn’t come as a surprise to anyone who has followed the Rose Quarter project. ODOT staff working on the project went to great lengths to conceal the actual physical width of the roadway they are proposing to build (it’s 160 feet wide, enough for a 10-lane freeway, something they still don’t reveal in the new EA and don’t analyze in the traffic report).  An Oregon judge found that ODOT staff violated the state’s public records law by preparing a false copy of documents summarizing public comments on the Rose Quarter project.  And ODOT falsely claimed that a peer review group validated its claims that the project would reduce air pollution; the leader of that group said ODOT hadn’t provided any information about the travel modeling on which the pollution claims were based, and that the group couldn’t attest to their accuracy.

This kind of distortion about traffic modeling has a long history:  the Oregon Department of Transportation has previously presented flawed estimates of carbon emissions from transportation projects.  During the 2015 Oregon Legislature, the department produced estimates claiming that a variety of “operational improvements” to state highways would produce big reductions in carbon.  As with the Rose Quarter freeway widening, the putative gains were assumed to come from less stop-and-go traffic. Under questioning from environmental groups and legislators, the department admitted that its estimates of savings were overstated by a factor of at least three. ODOT’s misrepresentations so poisoned the debate that they killed legislative action on a pending transportation package. What this episode demonstrates is that the Oregon Department of Transportation allows its desire to obtain funds for projects to bias its estimates. That track record should give everyone pause before accepting the results of an Environmental Assessment that conceals key facts.

Not presenting this data as part of the Environmental Assessment prevents the public from knowing, questioning and challenging the validity of the Oregon Department of Transportation’s estimates. In effect, we’re being told to blindly accept the results generated from a black box, when we don’t know what data was put in the box, or how the box generates its computations.

This plainly violates the spirit of NEPA, and likely violates the letter of the law as well.  Consider a recent court case challenging the environmental impact statement for a highway widening project in Wisconsin. In that case, a group of land use and environmental advocates sued the US Department of Transportation, alleging that the traffic projections (denominated, as you might have guessed, in ADT) were unsubstantiated.  The US DOT and its partner the Wisconsin Department of Transportation (WisDOT) had simply provided the final outputs of its model in the environmental report–but had concealed much of the critical input data and assumptions. A federal judge ruled that the failure to disclose this information violated NEPA:

In the present case, the defendants [the state and federal transportation departments] have not explained how they applied their methodology to Highway 23 in a way that is sufficient for either the court or the plaintiff to understand how they arrived at their specific projections of traffic volumes through the year 2035. They do not identify the independent projections that resulted from either the TAFIS regression or the TDM model and do not identify whether they made any adjustments to those projections in an effort to reconcile them or to comply with other directives, such as the directive that projected growth generally cannot be less than 0.5% or the directive to ensure that all projections make sense intuitively. For this reason, the defendants have failed to comply with NEPA. This failure is not harmless. Rather, it has prevented the plaintiffs from being able to understand how the defendants arrived at traffic projections that seem at odds with current trends. Perhaps the defendants’ projections are accurate, but unless members of the public are able to understand how the projections were produced, such that they can either accept the projections or intelligently challenge them, NEPA cannot achieve its goals of informed decision making and informed public participation.

1000 Friends of Wisconsin v. U.S. DOT, Case No. 11-C-054, Decision & Order, May 22, 2015

Why this matters

The entire rationale for this project is based on the assumption that if nothing is done, traffic will get much worse at the Rose Quarter, and that if this project is built things will be much better.  But the accuracy of these models is very much in question.

This is also important because the Environmental Assessment makes the provocative claim that this project, completely unlike any other urban freeway widening project in US history, will reduce both traffic and carbon emissions. The academic literature on both questions is firmly settled, and it has reached the opposite conclusion. The regularity with which induced demand swamps new urban road capacity and leads to even more travel and pollution is so well documented that it is now called “The Fundamental Law of Road Congestion.” The claim that freeway widening projects can offset carbon emissions through reduced idling has likewise been disproven:  Portland State researchers Alex Bigazzi and Manuel Figliozzi have published a series of papers indicating that the reverse is true:  wider roads lead to more driving and more carbon pollution, not less.  And in addition to withholding data, there’s nothing in the report to indicate that the authors have produced any independent, peer-reviewed literature to support their claims about freeway widening. The agencies simply point at the black box.
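The induced-demand result referenced above can be sketched in arithmetic. Duranton and Turner estimated that urban vehicle-miles traveled grow with roughly unit elasticity with respect to lane-miles; the corridor volume below is an invented number used only to show what that elasticity implies:

```python
# Back-of-the-envelope sketch of induced demand, using the elasticity of
# roughly 1.0 that Duranton and Turner estimated for urban VMT with
# respect to lane-miles. The base volume is illustrative, not project data.

def induced_vmt(base_vmt, capacity_increase_pct, elasticity=1.0):
    """VMT after a capacity expansion, under a constant-elasticity model."""
    return base_vmt * (1 + capacity_increase_pct / 100) ** elasticity

base = 1_000_000              # daily vehicle-miles on the corridor (made up)
after = induced_vmt(base, capacity_increase_pct=25)
print(after - base)           # -> 250000.0 additional daily vehicle-miles
# With elasticity near 1, driving grows in step with capacity, so emissions
# from new trips swamp whatever is saved by smoother traffic flow.
```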

Statistical errors are common and can easily lead to wrong conclusions

Maybe, just maybe, the Oregon Department of Transportation has a valid and accurate set of data and models. But absent disclosure, there’s no way for any third party to know whether it has made errors.  And even transportation experts, working with transportation data, can make consequential mistakes. Consider a recent report from the National Highway Traffic Safety Administration (another arm of the US Department of Transportation), which computed the crash rate for Tesla’s cars on autopilot.  In a report released in January 2017, NHTSA claimed that Tesla’s Autopilot feature reduced crashes by 40 percent.

That didn’t sound right to a lot of transportation safety experts, including a firm called Quality Control Systems. They asked NHTSA for the data but were unable to get it until the agency finally complied with a Freedom of Information Act (FOIA) request two years later.  The result:  NHTSA had made a fundamental math error in computing the number of vehicle miles traveled (which is an analog of average daily traffic).  Rather than being relatively safe, the corrected calculation showed that the Autopilot feature is significantly more dangerous than the average of all human drivers.

What QCS found upon investigation, however, was a set of errors so egregious, they wrecked any predictive capability that could be drawn from the data set at all. In fact, the most narrow read of the most accurate data set would suggest that enabling Autosteer actually increased the rate of Tesla accidents by 59 percent.
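The denominator problem at the heart of the NHTSA episode is easy to see with invented figures (these are not the actual Tesla numbers, just an illustration of the mechanism):

```python
# Illustration (invented figures, not NHTSA's data) of how an error in the
# VMT denominator flips a safety conclusion. Crash rates here are crashes
# per million miles traveled.

def crash_rate(crashes, vmt_millions):
    """Crashes per million vehicle-miles traveled."""
    return crashes / vmt_millions

crashes = 120
reported_vmt = 160.0    # miles assumed driven with the feature engaged
actual_vmt   = 60.0     # miles actually driven with the feature engaged

print(crash_rate(crashes, reported_vmt))   # -> 0.75  (looks safe)
print(crash_rate(crashes, actual_vmt))     # -> 2.0   (looks dangerous)
# Same crash count; the conclusion is entirely a function of the denominator.
```

The same arithmetic applies to any traffic forecast: if the volume (or VMT) figures are wrong or hidden, every rate computed from them is unverifiable.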

A government agency hiding key data is putting the public behind the 8-ball

Without the opportunity to look at the data, there’s no way for anyone to check whether ODOT has made a mistake in its calculations. A black box is no way to inform the public about a half-billion-dollar investment; you might say it puts us behind the 8-ball. It makes you wonder:  What are they hiding?

 

Appendix:  Expert Panel Critique of Rose Quarter Modeling

RQ Model Critique

The black box: Hiding the facts about freeway widening

State DOT officials have crafted an Environmental Assessment that conceals more than it reveals

In theory, the National Environmental Policy Act is all about disclosing facts. But in practice, that isn’t always how it works out. The structure and content of the environmental review is in the hands of the agency proposing the project, in this case the proposed $500 million widening of the I-5 Rose Quarter freeway in Portland. The Oregon Department of Transportation and the Federal Highway Administration have already decided what they want to do: Now they’re writing a set of environmental documents that are designed to put it in the best possible light. And in doing so, they’re keeping the public in the dark about the most basic facts about the project.  In the case of the I-5 project, they haven’t told us how many vehicles are going to use the new wider freeway they’re going to build.

A traffic report without ADT is like a financial report without $

To take just one prominent example, the project’s “Traffic Technical Report,” which purports to discuss how the project will affect the flow of vehicles on the freeway (which, after all, is the project’s purpose), conspicuously omits the most common and widely used metric of traffic volume:  average daily traffic, or ADT.

How common is ADT?  It’s basically the standard yardstick for describing traffic. ODOT uses it to decide how wide roads should be.  It’s the denominator in calculating crash rates. Average daily traffic is also, not incidentally, the single most important variable in calculating how much carbon and other air pollutants cars will emit when they drive on this section of road. ODOT maintains a complicated system of recording stations and estimates, tracking traffic for thousands of highway segments. ODOT’s annual report, Traffic Volume Trends, details average daily traffic for about 3,800 road segments statewide.  It also turns out that predicted future ADT is an essential input into the crash modeling software that ODOT used to predict crash rates on the freeway (“ADT” appears 141 times in the model’s user manual). ODOT uses ADT numbers throughout the agency: Google reports that the Oregon DOT website has about 1,300 documents with the term “ADT” and nearly 1,000 with the term “average daily traffic.” Chapter 5 of ODOT’s Analysis Procedure Manual, last updated in July 2018, contains 124 references to the term “ADT” in just 55 pages. “Average daily traffic” is as fundamental to describing traffic as degrees Fahrenheit is to a weather report.
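ADT is not just descriptive; it is the input from which the design hour volumes in ODOT’s own manual are derived. A minimal sketch of that standard relationship, using the 120,000 vehicles per day reported for this stretch of I-5 and an assumed K-factor (the share of daily traffic in the design hour, typically around 0.08 to 0.12 for urban freeways; the 0.10 here is our assumption, not ODOT’s):

```python
# Sketch of the standard DHV = ADT * K relationship. The K-factor of 0.10
# is an assumed, typical urban-freeway value, not a figure from ODOT.

def design_hour_volume(adt, k_factor=0.10):
    """Design hour volume (DHV) from average daily traffic via a K-factor."""
    return adt * k_factor

adt = 120_000                           # daily vehicles on this I-5 segment
print(round(design_hour_volume(adt)))   # -> 12000 vehicles in the design hour
# Without the ADT, none of the downstream numbers -- DHV, crash rates per
# VMT, or emissions -- can be checked by an outside reviewer.
```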

But there’s one place you’ll find absolutely no references to ADT:  The Rose Quarter I-5 Traffic Analysis Technical Report.  We searched the PDF file of the report for the term “ADT”–here’s the result:

Search through the appendices to the Environmental Assessment, and you might stumble on a reference to ADT that escaped the censors.  You’ll actually find four references to ADT in the Safety Technical report (page 31), but these are only for surface streets, not for the freeway.

ADT is just the very prominent tip of a much larger missing-data iceberg.  There’s much, much more baked into the assumptions and models used to create the estimates in the report that simply isn’t visible or documented anywhere in the Environmental Assessment or its cryptic and incomplete appendices.  The advocacy group No More Freeways has identified a series of documents and data series that are missing from the report and its appendices, and has filed a formal request to obtain this data. To date, no further data has been provided by the state or federal transportation departments.

This is important because the Environmental Assessment makes the provocative claim that this project, completely unlike any other urban freeway widening project in US history, will reduce both traffic and carbon emissions. The academic literature on both questions is firmly settled, and it has reached the opposite conclusion. The regularity with which induced demand swamps new urban road capacity and leads to even more travel and pollution is so well documented that it is now called “The Fundamental Law of Road Congestion.” The claim that freeway widening projects can offset carbon emissions through reduced idling has likewise been disproven:  Portland State researchers Alex Bigazzi and Manuel Figliozzi have published a series of papers indicating that the reverse is true:  wider roads lead to more driving and more carbon pollution, not less.  And in addition to withholding data, there’s nothing in the report to indicate that the authors have produced any independent, peer-reviewed literature to support their claims about freeway widening. The agencies simply point at the black box.

If the Oregon Department of Transportation and its consultants have discovered a remarkable new scientific finding by which this project–unlike any other, anywhere–can reduce carbon emissions, then they owe it to not just the citizens of Portland, but to the entire world to explain how this remarkable process operates.

In this context, it’s worth pointing out that the Oregon Department of Transportation has previously presented flawed estimates of carbon emissions from transportation projects.  During the 2015 Oregon Legislature, the department produced estimates claiming that a variety of “operational improvements” to state highways would produce big reductions in carbon.  As with the Rose Quarter freeway widening, the putative gains were assumed to come from less stop-and-go traffic. Under questioning from environmental groups and legislators, the department admitted that its estimates of savings were overstated by a factor of at least three. ODOT’s misrepresentations so poisoned the debate that they killed legislative action on a pending transportation package. What this episode demonstrates is that the Oregon Department of Transportation allows its desire to obtain funds for projects to bias its estimates. That track record should give everyone pause before accepting the results of an Environmental Assessment that conceals key facts.

Not presenting this data as part of the Environmental Assessment prevents the public from knowing, questioning and challenging the validity of the Oregon Department of Transportation’s estimates. In effect, we’re being told to blindly accept the results generated from a black box, when we don’t know what data was put in the box, or how the box generates its computations.

This plainly violates the spirit of NEPA, and likely violates the letter of the law as well.  Consider a recent court case challenging the environmental impact statement for a highway widening project in Wisconsin. In that case, a group of land use and environmental advocates sued the US Department of Transportation, alleging that the traffic projections (denominated, as you might have guessed, in ADT) were unsubstantiated.  The US DOT and its partner the Wisconsin Department of Transportation (WisDOT) had simply provided the final outputs of its model in the environmental report–but had concealed much of the critical input data and assumptions. A federal judge ruled that the failure to disclose this information violated NEPA:

In the present case, the defendants [the state and federal transportation departments] have not explained how they applied their methodology to Highway 23 in a way that is sufficient for either the court or the plaintiff to understand how they arrived at their specific projections of traffic volumes through the year 2035. They do not identify the independent projections that resulted from either the TAFIS regression or the TDM model and do not identify whether they made any adjustments to those projections in an effort to reconcile them or to comply with other directives, such as the directive that projected growth generally cannot be less than 0.5% or the directive to ensure that all projections make sense intuitively. For this reason, the defendants have failed to comply with NEPA. This failure is not harmless. Rather, it has prevented the plaintiffs from being able to understand how the defendants arrived at traffic projections that seem at odds with current trends. Perhaps the defendants’ projections are accurate, but unless members of the public are able to understand how the projections were produced, such that they can either accept the projections or intelligently challenge them, NEPA cannot achieve its goals of informed decision making and informed public participation.

1000 Friends of Wisconsin v. U.S. DOT, Case No. 11-C-054, Decision & Order, May 22, 2015

Statistical errors are common and can easily lead to wrong conclusions

Maybe, just maybe, the Oregon Department of Transportation has a valid and accurate set of data and models. But absent disclosure, there’s no way for any third party to know whether it has made errors.  And even transportation experts, working with transportation data, can make consequential mistakes. Consider a recent report from the National Highway Traffic Safety Administration (another arm of the US Department of Transportation), which computed the crash rate for Tesla’s cars on autopilot.  In a report released in January 2017, NHTSA claimed that Tesla’s Autopilot feature reduced crashes by 40 percent.

That didn’t sound right to a lot of transportation safety experts, including a firm called Quality Control Systems. They asked NHTSA for the data but were unable to get it until the agency finally complied with a Freedom of Information Act (FOIA) request two years later.  The result:  NHTSA had made a fundamental math error in computing the number of vehicle miles traveled (which is an analog of average daily traffic).  Rather than being relatively safe, the corrected calculation showed that the Autopilot feature is significantly more dangerous than the average of all human drivers.

What QCS found upon investigation, however, was a set of errors so egregious, they wrecked any predictive capability that could be drawn from the data set at all. In fact, the most narrow read of the most accurate data set would suggest that enabling Autosteer actually increased the rate of Tesla accidents by 59 percent.

A government agency hiding key data is putting the public behind the 8-ball

Without the opportunity to look at the data, there’s no way for anyone to check whether ODOT has made a mistake in its calculations. A black box is no way to inform the public about a half-billion-dollar investment; you might say it puts us behind the 8-ball. It makes you wonder:  What are they hiding?

 

Why do poor school kids have to clean up rich commuters’ pollution?

The fundamental injustice of pollution from urban freeways

Item:  In the past two years, Portland Public Schools has spent nearly $12.5 million of its scarce funds to clean up the air at Harriet Tubman Middle School.  This money will buy an expensive state-of-the-art air filtration system that will make the air inside the school safe for students to breathe.  Scientists from Portland State University, who conducted an air quality assessment of the site–at a cost of an additional half million dollars–have warned the students against exercising outside because of poor air quality.

And make no mistake, pollution from cars is a threat not just to the health of students, but to their ability to learn as well. A recent study shows that pollution from cars and trucks lowers student performance in schools near highways.  Students attending schools located near and downwind from busy highways had lower rates of academic performance, higher absenteeism and higher rates of disciplinary problems than those attending less polluted schools. The more traffic on nearby roads, the larger the decline in scores on state standardized tests.

Tubman School faces a further increase in air pollution from the proposal of the Oregon Department of Transportation to spend a half billion dollars to widen the portion of Interstate 5 that runs right by the school.  The freeway-widening project will cut away a portion of the hillside that now separates the freeway from the school, moving the cars and trucks still closer to the building, and also increasing their volume—and the volume of pollution they emit.

This video shows how the freeway would be moved closer to Tubman Middle School.

So, here’s a question:  Why is the school district paying for the pollution controls?  Why aren’t the 120,000 vehicles that drive past the school every day paying for it?  They’re the ones creating the pollution and benefitting from the freeway.

As we pointed out earlier at City Observatory, the gravity of this question is underscored by the huge disparity in the demographics of those who use the freeway, especially at peak hours, and those who attend Tubman Middle School.  Peak hour, drive-alone commuters from Clark County, Washington have average household incomes of $82,500; and 75 percent of them are white, non-Hispanic.  More than two-thirds of Tubman students are people of color; and half the student body is poor enough to qualify for free or reduced price meals.

In a very real sense, this makes students pay the costs of pollution.  The millions of dollars spent to install and operate air filters is money that isn’t available to pay for books and teachers. Meanwhile, freeway users get a free ride. This is plainly unfair.

Who was here first?

So why did Portland Public Schools build a school next to a freeway in the first place?  If it did, then clearly it must bear a big part of the blame for the air kids have to breathe there.  What makes this whole situation even more unfair is that, actually, the school was here first.  Lest there be any doubt, take a look at this aerial photo showing the construction of Interstate 5 in 1962.  (The Tubman School is outlined in red.)

Source: City of Portland Archives, 1962.

When it was built, Tubman School stood on a bluff, overlooking the city of Portland.  The Oregon State Highway Department, following the advice of Robert Moses, cut away the hillside and dropped the freeway right next to the school.  So if the school was there first, why isn’t ODOT paying to clean up the air its students have to breathe?

Well, back in the 1960s, highway departments, Oregon’s included, seldom paid for any of the damage they did to cities.  As we’ve noted, the Oregon Department of Transportation obliterated hundreds of homes in this neighborhood and did nothing to replace the lost housing. Back then, air pollution was a greatly under-appreciated problem. Years after I-5 was built, ODOT did install some concrete walls to attenuate freeway noise in North Portland, but as to air pollution, nothing.

The Coase Theorem

Here’s where things get a bit wonky, at least for economists. There’s a famous conjecture in economics called the Coase Theorem, which at its root is based on a story very much akin to that of the freeway and the school.  Coase’s story is about a farmer and a railroad, in this case an old-fashioned steam-powered railroad, with smoke-and-spark-belching engines. Sparks from the steam engines would fly into the farmer’s field, burning her crops. Coase mulled over the economics of who should pay whom for the damages, and what would be an efficient and fair outcome. He concluded that it didn’t actually matter, as long as one party or the other had clear property rights. Let’s turn the microphone over to University of California economist Brad DeLong, who picks up the story, first summarizing Coase’s argument, then pointing out a huge flaw.

The brilliant Ronald Coase . . .  was interpreted to have argued that pretty much any arrangement of property rights will do about as well as any other and the government should simply step back. The canonical case adduced was the locomotive that occasionally throws off sparks that burn the nearby farmer’s crops. If the railroad has a duty of care not to burn the crops, Coase said, the railroad will attach spark-catchers if it is cheap and makes sense to do so – and the railroad will pay damages and settle in order to avoid being hauled into court on a tort claim if it is expensive and doesn’t make sense to do so. If the railroad has no duty of care, Coase said, then the farmer will offer to pay the railroad to install spark-catchers – and spark-catchers will be installed if the potential damage to the crops is greater than the cost of the spark-catcher and it makes sense to do so, and spark-catchers will not be installed if the damage to the crops is less than the cost.

Thus the same decisions will be made whatever the property rights – as long as there are settled property rights. If there are not settled property rights, then the crops burn and lawyers grow fat. But as long as there are property rights, the market will work fine. Maybe the widows and orphans who own railroad shares will be wealthier under one setup and maybe the farmers will be wealthier under the other, but that is rarely a matter of great public concern.

Now this argument has always seemed to me to be wrong. If there is no duty of care on the part of the railroad, it has an incentive not just to threaten not to install a spark-catcher, but to design and build the most spark-throwing engine imaginable – to make sure that the firebox is also a veritable flamethrower – and then to demand that the farmer bribe it not to set the fields on fire. What economists call “externalities” are rife, and call for the government to levy taxes and pay bounties over wide shares of the economy in order to make the incentives offered by the tax-and-bounty-augmented market the incentives that it is good for society that decision-making individuals have. Cutting property rights “at the joints” to reduce externalities is important. But it will never be efficient: what economists call Pigovian taxes and bounties make up a major and essential part of the business of government.
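The spark-catcher logic quoted above reduces to a one-line decision rule, which is easiest to see with invented numbers (the figures below are ours, not Coase's or DeLong's):

```python
# Coase's point, in arithmetic (invented numbers): under settled property
# rights, the spark-catcher is installed exactly when it is cheaper than
# the crop damage it prevents -- regardless of who holds the right.

catcher_cost = 500   # cost of installing a spark-catcher (hypothetical)
crop_damage  = 800   # expected crop loss without one (hypothetical)

# Railroad has a duty of care: it installs iff that beats paying damages.
railroad_installs = catcher_cost < crop_damage

# Railroad has no duty: the farmer pays for installation iff the catcher
# costs less than the damage she would otherwise suffer.
farmer_pays_for_install = catcher_cost < crop_damage

assert railroad_installs == farmer_pays_for_install == True
# DeLong's objection: with no duty of care, the railroad also has an
# incentive to threaten *more* sparks to extract a larger payment -- an
# externality this simple rule ignores.
```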

And in fact, while DeLong’s point about “a veritable flamethrower” seems like hyperbole, that’s pretty much exactly what the Oregon Department of Transportation is doing here:  having already polluted the air near Tubman, it is doubling down on its earlier transgression, in part because of moral hazard:  coping with the pollution isn’t its problem; it has neatly shifted all of the costs of pollution to others, in this case the students and Portland Public Schools.

As a matter of both justice and efficiency, the Oregon Department of Transportation–and through them, the users of Interstate 5—ought to be required to pay for cleaning the air at Tubman.  Failing to impose these costs on ODOT leads it to falsely and unfairly under-value the lungs of these students, and to make a further investment that will make this problem worse.  If ODOT had to bear these costs, it would likely look at the freeway widening project very differently, and instead, consider alternatives that produce smaller amounts of emissions (and might even consider ways to reduce traffic, rather than increasing it).

Making kids pay for freeway pollution—and in fact, pay twice, first by breathing polluted air, and then second, by having to pay for the cost of cleaning it–is both wrong and inefficient.  And like Coase’s example of the farmer and the locomotive, there’s actually a bigger issue here:  More generally, we should be insisting that car users pay the costs that they impose on others. The reason why pollution, sprawl and even traffic congestion are so bad is because we’ve radically under-priced car travel, in essence subsidizing people to do things that degrade our cities and communities. Assigning the responsibility correctly, and getting the prices right can improve fairness, and make our cities better places to live.

Editor’s note:  Commenters are puzzled that they can find no record of Harriet Tubman school prior to the 1980s.  The building now named Harriet Tubman was originally built as Eliot Elementary School in 1952.  Eliot Elementary was subsequently merged with Boise Elementary—due to declining population in the neighborhood, caused in major part by the construction of the freeway. The Eliot/Tubman building has been owned continuously by Portland Public Schools. (This note added on March 8, 2019).

Orwellian freeway-widening

What pretends to be an environmental assessment is actually a thinly-veiled marketing brochure

In theory, an environmental impact statement is supposed to be a disclosure document. The idea behind the National Environmental Policy Act was to force thoughtful consideration of potentially environmentally harmful projects and policies, and by providing the public and decision-makers with clear information, enable better choices.

In practice, the environmental review process has simply become a way to manipulate opinion and manufacture consent, or more accurately, the simulacrum of consent.  The preparation of environmental impact statements is left to the agencies that are sponsoring projects, and who have a vested interest in a particular outcome.

When it comes to the proposed half-billion-dollar I-5 Rose Quarter freeway widening project, the Environmental Assessment is less an honest and objective disclosure than a carefully edited, thinly veiled sales brochure.  The hucksterism starts with the name of the project, proceeds through its “communication plan,” and is executed in technical documents that have been carefully edited to remove the most salient information.

It’s not a freeway widening, it’s an “improvement” project.

Let’s start with what it’s called. The project’s carefully chosen moniker is the “Rose Quarter Improvement Project”: nothing about freeways, or pollution, or widening.  Just “improvement.” Who can be against that? The trouble, of course, is that the word “improvement” in this context is argumentative, to the point of being Orwellian.

That’s exactly the slanted tactic regularly used to sell road-widening projects without accurately explaining their nature, purpose and effects.  The tactic is so well understood that even Washington State’s Department of Transportation is specifically educating its staff not to use it.  WashDOT’s Barb Chamberlain gave a presentation, “Words Matter:  Recognizing and addressing modal assumptions to shift transportation culture,” at Portland State University’s Transportation Research and Education Center on February 22, 2019.  In it, she spoke out against unqualified claims that a project “improves” a neighborhood.

“Improvement” is loaded. “Improvement” says it’s qualitatively better, but then you have to ask “for whom?” Some of the measures might be better for some, and worse for some, and you may need to be able to measure that, and you just need to say that so people understand what’s happening. . . .  “What do you mean improved?” Explain to this woman what you are doing to her street, don’t just tell her you’re improving it.

Chamberlain’s advice is clear and unambiguous:  Don’t just say improvement.  Improvement is a loaded word.  The Oregon Department of Transportation knows that.  It’s undoubtedly exactly why they’ve chosen the name “Rose Quarter Improvement Project” for their $500 million freeway-widening effort.  If they only could have, we’re pretty sure that they would have called it the “Rose Quarter Double-Plus Good Project.”

Improvement for whom?

The problem, of course, is that the word improvement not only conceals what will happen, but raises the question of who the improvement is for.  In order to really understand whether a wider highway constitutes an improvement, we have to know both what will happen and who will win and who will lose.  When that question is asked and answered honestly, the public can make a reasoned decision as to whether a project makes sense.  That’s exactly what happened in Portland in the 1970s.

When he rallied public opinion against the Mt. Hood Freeway in the early 1970s, Mayor Neil Goldschmidt famously challenged the fairness of wrecking the city to benefit suburban commuters.  Willamette Week, in a 2005 retrospective, “Highway to Hell,” wrote:

Buoyed by a cadre of campaigners, a new coalition on the City Council, a fresh urban-planning ethic and a positive solution, Goldschmidt hammered away at the central question: Who pays and who benefits? The answer was clear. City residents would pay, sacrificing their neighborhoods, schools and tax base. Suburban commuters would reap the benefits with a slightly shorter commute. What an injustice, argued the evangelical Goldschmidt. His reasoning even appealed to conservatives. 

As we’ve shown before at City Observatory, there are stark differences in the income, race and ethnicity between those who will benefit from the project–chiefly higher income, drive-alone peak hour commuters, many from suburban Clark County–and those who will bear its costs, such as coping with additional traffic and pollution–who tend to have lower incomes, and are much more likely to be people of color.

Wholly Moses

There’s actually an even longer history here.  Branding big transportation projects as “improvements” stretches back seven decades in Portland, and touches the most famous freeway builder of all:  Robert Moses.  During World War II, Portland’s town fathers commissioned the power broker and a select team of his “Moses Men” to come to Portland and write a plan for the city’s future.  Moses sketched out a grid of freeways that ultimately led to the construction of Interstate 5 through the Rose Quarter.  This was the same Moses who, as his biographer Robert Caro related, felt that when planning in a big city “you have to hack your way with a meat ax.” (Quoted in Robert Caro, The Power Broker: Robert Moses and the Fall of New York (1974), p. 849.)  Which is exactly what he proposed in New York and Portland.

But even the great builder was a little sensitive to what you called his projects.  In Portland, he devoted two paragraphs to rebranding his highway-oriented development scheme for the city. He worried that “public works” wasn’t a sufficiently promotional moniker for the program he proposed, and instead specifically insisted it be called “Portland Improvement.”

Magritte the highway engineer

One of the most famous images of the early 20th century is René Magritte’s Ceci n’est pas une pipe (This is not a pipe).  Magritte’s point, of course, is that his painting is not actually a pipe, but an image of a pipe, which makes the caption seem like an ironic and incorrect statement.

This is not a pipe                                            This is not a wider freeway

Who would have thought that highway engineers had so much in common with surrealist painters? Talk to anyone associated with the “improvement project,” and they’ll go to great pains to tell you they’re not adding any lanes to the freeway.  Well, at least not “real” freeway lanes.  Instead, what’s being added are “auxiliary” lanes. ODOT even has a video extolling the virtues of the special “auxiliary” lanes. Calling them “auxiliary” lanes means, somehow, that they don’t actually count.  We’re told that an auxiliary lane carries traffic between intersections, facilitating merging and exiting, and isn’t carrying “through” traffic, and so we should disregard it. But, as we’ve pointed out before at City Observatory, if that’s true, then the current freeway consists of just one through lane and one auxiliary lane in each direction, because one of the current lanes functions effectively as an auxiliary lane.  In this case, going from one auxiliary lane and one through lane to one auxiliary lane and two through lanes still represents an increase of one through lane.

Washington Department of Transportation’s Barb Chamberlain is right:  Words matter.  Whether this, or any other project, is an “improvement” should be a subject of inquiry and analysis, not a pre-determined outcome.  In this case, it’s questionable whether and for whom this is an “improvement.” Attaching the adjective “auxiliary” to the added freeway lanes you build doesn’t mean that the freeway isn’t wider and doesn’t generate more traffic, pollution, and ultimately congestion. ODOT owes the citizens of the state a clear and honest explanation of what it’s up to:  What we have so far is fraught with deception. It’s tragic that a process that was legislated to provide information and inform discussion has instead been twisted to promote a single, predetermined outcome.

 

There’s a $3 billion bridge hidden in the Rose Quarter Project EA

ODOT hid its plans to build a $3 billion Columbia River Crossing in the Rose Quarter Freeway Widening Environmental Assessment

The carefully crafted marketing campaign for the I-5 Rose Quarter Freeway widening project is adamant that you don’t call it an expansion.  It’s an “improvement project” they say.  We’re not widening the freeway, we’re just building auxiliary lanes.  But that rationale evaporates when you understand that the traffic projections that justify the project (and allegedly minimize its environmental effects) appear to be  based on the assumption that the region spends $3 billion (and likely a good deal more) to build a 12-lane Columbia River Crossing project.

The modeling for the Rose Quarter Freeway expansion has hidden its assumption that the CRC gets built–and produces a flood of traffic into the Rose Quarter.  Search the project’s Environmental Assessment (EA) and its traffic technical report, and you’ll find no mention of the Columbia River Crossing or the CRC.  But there are a couple of very obscure passages in the Traffic Technical Report that deserve close scrutiny.

Start with this seemingly innocuous statement, buried 38 pages into the Traffic Technical Report.

So, what exactly are “planned future I-5 projects”?

Well, you have to turn back to a reference buried in a footnote on page 7 of the Traffic Technical Report to learn that:

Notice that there’s a lot of misdirection here:  the EA draws our attention to the local street network and bike lanes.  But local streets are just “one of the actions.” That naturally raises the question:  What are the other actions? The EA doesn’t say.  To find out, you have to follow the link buried in the footnote.  If you click on the link in that footnote, you’re taken to an Excel spreadsheet hosted at the Metro website (it’s laid out in 8-point type, and the zoom is set by default to 50%), so you’ll want to enlarge it considerably to read what’s there.  If you scroll through six hundred rows of the spreadsheet, to row 635, you’ll find this reference.

That’s small and hard to read, so let’s zoom in a bit:

That’s right:  The environmental analysis for the Rose Quarter freeway widening says, in the most indirect and obscure way imaginable, that it is based on the assumption that the region spends $3 billion on building the Columbia River Crossing (a 5-mile long, 12-lane wide freeway project between Portland and Vancouver).  And critically, they’ve assumed that the CRC is part of the “No-Build” scenario for the Rose Quarter.

ODOT’s “No-Build” Scenario includes a 12-lane freeway and bridge on I-5 at the Columbia, at a cost of $3 billion (or much more)

Assume a Twelve Lane Firehose Pointed at the Rose Quarter

There’s also quantitative evidence that ODOT built the CRC into its traffic forecasts for the Rose Quarter freeway widening project.  This too was carefully concealed from public disclosure. As we noted, the Traffic Technical Report contains no references to Average Daily Traffic levels (the most basic measure of traffic volumes).  After repeated requests to ODOT, on March 13, 2019, the agency released PDF images of several tables showing hourly traffic volumes on I-5.  (Releasing them with just 18 days left in a 45-day comment period, of course, minimized public opportunity to evaluate ODOT’s data and claims).

These data tables show peak hour traffic volumes at various locations on Interstate 5, and include data for existing (2016) levels of traffic and modeled 2015 and 2045 levels of traffic.  Here is one of those tables, summarizing traffic in the morning (8AM to 9AM) and evening (5PM to 6PM) peak hours in both directions on I-5 at Going Avenue (the portion of the I-5 freeway at the northern end of the Rose Quarter freeway widening project).  We’ve shown two estimates:  the modeled 2015 level of traffic in the No-Build (the top panel, labeled “RQ VISUM Model 2015 No Build”) and the reported existing levels of traffic (the bottom panel, labeled “RQ Existing Conditions 2016”).

I-5 North Volumes, Modeled v. Existing (Mainline North of Going)

                                 Northbound   Southbound     Total   Difference
RQ VISUM Model, 2015 No Build
  AM Peak (8AM-9AM)                 3,945        6,204      10,149       +39%
  PM Peak (5PM-6PM)                 5,052        5,175      10,227       +46%
RQ Existing Conditions, 2016
  AM Peak (8AM-9AM)                 2,146        5,133       7,279
  PM Peak (5PM-6PM)                 3,360        3,639       6,999

Model rows: RQ VISUM Model, “Mainline North of Going, 2015 No Build.”  Existing rows: “2016 Existing Conditions,” “Mainline North of Going.”

Source: ODOT March 13, 2019 Delayed Disclosure, “Model Volumes.pdf”

The discrepancy to pay attention to here is the difference between 2016 existing traffic levels and modeled 2015 traffic levels. In theory, you might think that the numbers should be the same, or almost the same, and that, if anything, the 2016 numbers should be higher than the 2015 numbers, due to economic and population growth. But in fact, the modeled 2015 traffic volumes are uniformly higher–much higher–than the actual measured 2016 traffic volumes.  What this means is that ODOT built a model of 2015 that assumes this area gets more traffic than it got in reality.  There is no explanation for this discrepancy in the EA, in the Traffic Technical Report, or in the materials submitted by ODOT on March 13. But what it represents is the effect of building a traffic model that assumes–quite counterfactually–that the CRC was built and operational in 2015, funnelling roughly 3,000 more vehicles per hour in the peak hour into the Rose Quarter.

The following chart summarizes differences between the actual level of traffic in 2016 (blue) and the estimates contained in ODOT’s model of 2015 conditions (red).  ODOT’s model exaggerates the current level of traffic on I-5 by 39 percent in the morning peak hour and by 46 percent in the evening peak hour.  While the Environmental Assessment and accompanying documents offer no explanation for this discrepancy, the only  plausible explanation is that ODOT has assumed a massive increase in capacity and traffic on I-5 north of the project impact area–the 12-lane Columbia River Crossing. (Not only that, but they’ve apparently created a kind of alternative reality in which the CRC existed in 2015).
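The “Difference” column is just the ratio of modeled to observed peak-hour totals; anyone can check the arithmetic directly. A quick sketch using the numbers from the table above:

```python
# Check the table's "Difference" column: modeled 2015 No-Build volumes vs.
# observed 2016 volumes on I-5 north of Going (both directions combined).
observed = {"AM": 2_146 + 5_133, "PM": 3_360 + 3_639}   # 2016 existing counts
modeled = {"AM": 3_945 + 6_204, "PM": 5_052 + 5_175}    # 2015 No-Build model

for peak in ("AM", "PM"):
    pct = (modeled[peak] / observed[peak] - 1) * 100
    print(f"{peak} peak: model exceeds observed by {pct:.0f}%")
# AM peak: model exceeds observed by 39%
# PM peak: model exceeds observed by 46%
```

The check confirms the table: the model’s 2015 “baseline” carries 39 to 46 percent more peak-hour traffic than was actually counted in 2016.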

There’s a very real unanswered question of how many vehicles per day ODOT assumed would be coming across the Columbia River in 2045.  There are very different answers to that question depending on whether one believes ODOT’s own Final Environmental Impact Statement for the Columbia River Crossing or whether one believes ODOT’s subsequent Investment Grade Analysis–a traffic report done by independent experts to assess the impact of tolling on traffic.  In the FEIS, ODOT predicted that 180,000 vehicles per day would use I-5 in 2030 (the terminal year of its traffic forecast).  The Investment Grade Analysis, conducted for ODOT by CDM Smith, concluded that even in its “high” scenario, tolls would dramatically reduce bridge traffic, and that fewer than 100,000 vehicles would use the bridge in 2035–a number 20,000 to 30,000 lower than current traffic levels.

Because the EA makes no direct mention of the CRC, and suppresses all data about average daily traffic, it’s impossible for the public to know which of these very disparate estimates (the FEIS estimate or the CDM Smith investment grade analysis estimate) or some other estimate was used to generate the modeled estimates of traffic flows into the Rose Quarter.

Five big problems with hiding the CRC

Obscuring its assumption that the CRC will be built, and generate traffic flows into the Rose Quarter delegitimizes ODOT’s planning process, in five ways:

  1.  It’s dishonest and a violation of NEPA to hide such a fundamental assumption
  2.  The EA fails to present a true “No-Build Scenario” against which the project’s effects can be judged
  3.  The inflated traffic levels in the No-Build make the project look better than it really is in environmental terms
  4.  The modeling shows that the Rose Quarter project is needed to solve a problem that the CRC creates
  5.  Hiding the CRC in the No-Build violates the requirements that the EA address cumulative impacts

Hiding fundamental facts and assumptions violates NEPA

The purpose of an Environmental Assessment or Environmental Impact Statement is to fully and fairly disclose the impacts a project will have on the environment. It’s important to remember that the National Environmental Policy Act is just a procedural law:  it doesn’t prohibit you from doing things that change, or even hurt, the environment. What it does do is insist that you honestly and accurately tell the public what those impacts are likely to be, and give due consideration to reasonable alternatives that would likely have fewer impacts. The Rose Quarter Environmental Assessment does not meet that standard:  it’s a biased and self-serving marketing document that conceals important facts from the public, constructs a phony No-Build scenario to put its impacts in a better light, and hides the most basic data about the project’s traffic and environmental impacts.

The “No-Build” scenario isn’t accurate

A fundamental premise of NEPA is that the agency consider a “No-Build” option:  What will the environmental effects be if the agency doesn’t move forward with the project?  It’s apparent that ODOT didn’t analyze a true “No-Build” scenario.  Instead, it assumed that the Columbia River Crossing is built.  The trouble is that that project is, if not dead, certainly in limbo.  Oregon and Washington spectacularly failed to agree on the project and funding more than five years ago. The project itself is still the subject of ongoing litigation (which, like the project itself, is in limbo). The finances of the project were never fully worked out, and in the meantime, the expected sources of federal funding have essentially evaporated.  Putting an imaginary $3 billion bridge, and its attendant traffic, in the “No-Build” scenario distorts the environmental assessment beyond reason. ODOT could remedy this by undertaking a new analysis that forecasts traffic levels based on the actual no-build situation:  a world in which the existing I-5 bridges provide six lanes of traffic across the Columbia River.

Inflated “No-Build” traffic levels bias the environmental assessment.

Assuming that there’s a 12-lane I-5 bridge over the Columbia River, or otherwise inflating the assumed traffic level on I-5 north of the project area above the current level of traffic, fictitiously creates a “No-Build” world of congestion and pollution levels that don’t, and can’t, exist, and therefore casts the project in an artificially favorable light by comparison.  A realistic No-Build, one which reflected actual traffic levels, and which left out the surge of traffic created by the modeler’s assumption that the CRC is built, would have much lower levels of congestion and pollution.  In addition, as we’ve shown, the added capacity at the Rose Quarter would induce millions of miles of additional vehicle travel that wouldn’t occur in the absence of the project, and which would increase pollution, congestion and carbon emissions.

Traffic problems at the Rose Quarter are created by the Columbia River Crossing

What ODOT has done is assume that we spend $3 billion to create a flood of traffic across a 12-lane Columbia River Crossing, and that the problem thus created is then “solved” by widening the freeway at the Rose Quarter.  The rationale presented for the I-5 Rose Quarter widening project is the engineering equivalent of iatrogenic disease.  (Iatrogenic means “doctor-caused”: for example, when an otherwise healthy person undergoing surgery in a hospital acquires an antibiotic-resistant infection.)  In this case, the quantitative justification for the Rose Quarter freeway widening is to cope with the flood of induced traffic created by the construction of the Columbia River Crossing.  This kind of modeling leads traffic engineers to a never-ending game of whack-a-mole with traffic bottlenecks:  you expand capacity in one location, which feeds more traffic into the bottleneck at the next location, and provides justification for expanding capacity at that location.

The EA conceals the cumulative impacts of the Rose Quarter project

The purpose of NEPA is to get decision makers to pause and reflect, and not allow a steady stream of seemingly minor and incremental decisions to systematically foreclose other, more environmentally sensitive options. Failing to break out the connected decisions to widen I-5 at the Rose Quarter and build the Columbia River Crossing, and to consider the environmental effects of a world in which the region chose to do neither of these things, means that the EA fails to meet the NEPA requirement that it fairly consider the cumulative impacts of this investment decision. An honest cumulative impacts analysis would compare a world in which we neither built the Columbia River Crossing nor widened the I-5 freeway at the Rose Quarter. As it stands, you can’t judge the cumulative impacts of this decision, because ODOT hasn’t separated out a world in which just the Rose Quarter project is built.

The CRC and the Rose Quarter have always been closely intertwined, although ODOT officials have tried to obscure that fact.  The Independent Review Panel for the Columbia River Crossing–appointed jointly by the Governors of Oregon and Washington–flagged this issue in their critique of the CRC in 2010.  They wrote:

“Questions about the reasonableness of investment in the CRC bridge because unresolved issues remain to the south threaten the viability of the project.” (Independent Review Panel Report, 2010, page 112).

The panel recommended further traffic studies to test whether the CRC would simply shift the bottleneck south, and called for ODOT and the City of Portland to “fully develop a solution for I-5 from I-405 to I-84” and to program that solution in conjunction with the phasing of the construction of the CRC (page 113).

While it’s pitched as some kind of stand-alone, safety-related project affecting just a small area, the Rose Quarter freeway widening is really an integral part of a much larger, and entirely freeway-centric, vision of transportation in the Portland area. Its traffic projections are founded directly on the assumption that the region spends at least $3 billion, and likely a great deal more, to widen I-5 across the Columbia to 12 lanes.  The project itself engineers a right-of-way through the Rose Quarter that can easily accommodate an eight-lane freeway. Yet these basic facts are purposefully concealed, and not revealed in the EA.  Nor does the EA do its job of disclosing the likely cumulative impacts of these steps.

Editor’s Note:  This post was updated on March 27.

 

Why Portland shouldn’t be widening freeways

Why Portland’s freeway fight is so important to the future of cities everywhere

The plan to widen the I-5 Rose Quarter Freeway in Portland, at a cost of $500 million, is a tragic error for one city, and an object lesson to others.  A wider freeway will induce more traffic and pollution (and ironically, worsen traffic congestion), runs directly counter to the city and state’s goals of reducing greenhouse gas emissions, does nothing to improve safety, especially for those walking or biking, and disproportionately benefits higher income commuters from outside the city, while imposing social and environmental costs primarily on lower income households and people of color.

There’s a lot we have to do to meet the growing demand for urban living. The first rule, as in medicine, is to do no harm.  In Portland, and in cities around the nation, building freeways has been consistently shown to devastate urban neighborhoods, and only exacerbate traffic congestion and car dependence. This is a signal issue, around which much of the possibility of crafting better, more sustainable and more inclusive cities revolves. That’s why we’re spending time exploring this issue in detail here.  The battle over this freeway widening project is increasingly drawing national attention, and the leading experts are warning Portland it’s in danger of making a terrible mistake.

This is City Observatory’s guide to the public policy case against the proposed I-5 Rose Quarter Freeway Widening Project.

Traffic congestion will worsen, thanks to induced demand

Widening the freeway won’t solve traffic congestion. More highway capacity generates more traffic–the phenomenon of induced demand is so well documented that it’s now called “The fundamental law of traffic congestion.”  Added capacity encourages more people to drive, and in dense urban environments, there’s plenty of “latent demand” that almost immediately fills added lanes as soon as they’re built. Houston widened its Katy Freeway to 23 lanes, and it’s now even more congested and slower moving than before. Even PBOT and ODOT officials acknowledge that widening I-5 won’t reduce daily traffic congestion.

A wider freeway won’t reduce crashes that produce “non-recurring” congestion. Because they know they can’t reduce daily congestion, city and state engineers instead make a different claim:  that more lanes and wider shoulders will reduce occasional congestion from crashes.  But they don’t actually have evidence for that; instead the claim is just based on engineering “rules of thumb.” And our actual experience on this freeway, with the same drivers, has been just the opposite:  After ODOT widened I-5 between Lombard and Victory Blvd. in 2010, crashes went up, not down.  It’s not surprising: Metro’s State of Safety report shows that wider roads tend to have higher crash rates. If crashes don’t decline, ODOT can’t claim to reduce traffic congestion.


Wider roads will likely make congestion worse.  There’s good evidence that funnelling more traffic onto the region’s roads will actually cause congestion to get worse. After ODOT widened I-5 north of Lombard, and expanded ramps onto the freeway, traffic congestion became worse, as more cars were funnelled even more rapidly into bottlenecks, causing the freeway to lose capacity.  Consequently, the Interstate 5 bridge now carries about 10 percent fewer cars in the afternoon peak hour than it did ten years ago. 

Climate change: greenhouse gas emissions will increase

The threat of climate change is real and serious, and Oregon is failing in its legally adopted goal of reducing greenhouse gas emissions to 20 percent of 1990 levels by 2040, almost entirely due to an increase in driving in the past few years, according to the latest report of the Oregon Global Warming Commission.

The University of California, Davis’s induced travel calculator suggests that widening the I-5 freeway at the Rose Quarter will generate an additional 10 to 17 million miles of vehicle travel per year, adding thousands of tons per year to Portland area greenhouse gas emissions.
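The calculator’s core arithmetic is simple elasticity math: added lane-miles, multiplied by existing traffic per lane-mile and an elasticity near 1.0 for urban freeways. A minimal sketch of that calculation follows; the lane-mileage and per-lane-mile traffic figures here are illustrative assumptions, not ODOT’s or the calculator’s actual inputs for this project.

```python
# Sketch of an induced-travel estimate in the style of the UC Davis
# calculator: induced VMT scales roughly one-for-one (elasticity ~1.0)
# with added lane-miles in urbanized areas.
def induced_vmt_per_year(added_lane_miles, vmt_per_lane_mile_per_day,
                         elasticity=1.0, days_per_year=365):
    """Annual induced vehicle-miles traveled from a capacity expansion.

    All input values are illustrative assumptions, not project figures.
    """
    daily_induced = elasticity * added_lane_miles * vmt_per_lane_mile_per_day
    return daily_induced * days_per_year

# Illustrative inputs: ~2 added lane-miles, ~17,000 VMT per lane-mile per day
annual = induced_vmt_per_year(2.0, 17_000)
print(f"{annual / 1e6:.1f} million added VMT per year")
# 12.4 million added VMT per year -- on the order of the 10-17 million cited
```

With plausible inputs the sketch lands in the same range the calculator reports, which is the point: induced travel isn’t an exotic effect, it falls straight out of the arithmetic of added capacity.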

Widening freeways runs directly counter to the need to take decisive action to deal with climate change. The evidence, from last summer’s smoky skies to the latest dire report from the Intergovernmental Panel on Climate Change, shows we’ve put off action too long. And in Oregon, the latest state report tells us we’re losing ground on our stated objective to reduce greenhouse gas emissions, almost entirely because we’re driving more.  More freeway capacity will produce more driving. ODOT is claiming that somehow by reducing congestion (which it won’t do anyhow), it will reduce pollution and greenhouse gases associated with idling.  That folk myth has been thoroughly debunked by the transportation experts at Portland State: emissions from added car travel more than offset lower pollution from idling.

ODOT has previously lied about the carbon emissions of its activities to promote its objectives. In 2015, the Director of ODOT admitted to the Oregon Legislature that his agency had grossly overstated the carbon emission reductions that could be expected from so-called “operational improvements” to state freeways.  This led to the collapse of transportation legislation, which was predicated on carbon reductions.

There is no evidence that operational improvements, like variable speed signs, which are the centerpiece of ODOT’s climate strategy, have any meaningful effect on carbon emissions. Independent studies showed that implementation of variable speed signs on Highway 217 was accompanied by an increase in crashes, not a reduction; there’s no evidence they improved congestion.

Congestion pricing would solve Rose Quarter congestion more quickly and cheaply

The only effective means of reducing traffic congestion is road pricing. Two years ago, the Oregon Legislature enacted HB 2017, directing ODOT to pursue congestion pricing on I-5 and I-205. ODOT’s own studies show that congestion pricing would dramatically reduce congestion in the Rose Quarter area, and improve transit operations and freight mobility, while increasing freeway throughput. ODOT has completely omitted any mention of congestion pricing in the Environmental Assessment. Leaving out this demonstrably superior alternative is a violation of the National Environmental Policy Act.

Widening the freeway won’t improve safety

The Oregon Department of Transportation falsely claims that the Rose Quarter is the highest crash location in the state.  In fact, many other ODOT operated roadways in the Portland area have far higher crash rates than the Rose Quarter.

Freeways are among the safest roads in the Portland metropolitan area; the typical multi-lane urban arterial street in Portland has a crash rate five times higher than urban freeways.  Virtually none of the crashes on I-5 at the Rose Quarter involve serious injuries or fatalities.  Willamette Week debunked ODOT’s safety claims.

There is no evidence that widening I-5 at the Rose Quarter would reduce crashes.  When ODOT widened Interstate 5 just north of the project area (a roadway that carries the same vehicles that travel through the Rose Quarter), crash rates did not decline; in fact, they increased. Similarly, when ODOT increased capacity at the Woodburn interchange on I-5, crash rates also did not decline; that interchange had two serious crashes in the space of ten days in February 2019.

Freeway widening will worsen pollution

Increased traffic and air pollution will most severely affect students at Tubman middle school–the project would widen the freeway onto land now occupied by the school. Construction of the freeway will require excavating land at the school site and bring the freeway within feet of the school building.

Scientists at Portland State University who studied air quality at the Tubman site recommended that students not engage in outdoor activities to avoid exposure to air pollution from the freeway.  The school district spent $12 million on environmental protection for Tubman, mostly for special air filtration equipment to make air inside classrooms safe for students to breathe. It’s monumentally unfair that students should bear the dual costs of breathing polluted air and having their scarce educational dollars used to pay for air filters, while the roads are used by high income commuters, who pay nothing towards these costs.

Freeway widening is inequitable

Widening freeways generally privileges higher income people, while doing nothing, or actually worsening travel and the environment for lower income people.  There are huge disparities in the income, race and ethnicity of the project’s primary beneficiaries (peak hour, solo car commuters) and local area residents, including those who walk, bike and take transit, and households who don’t own cars.

  • Peak hour, drive-alone commuters from Clark County to Oregon have a median household income of $82,500, 50 percent greater than that of transit, bike and walk commuters living in North and Northeast Portland ($53,900), and more than three times greater than that of carless households in North and Northeast Portland ($23,500).
  • Two in five (40 percent) peak hour, drive-alone commuters from Clark County to Oregon have household incomes of more than $100,000.
  • Three-quarters of peak hour, drive-alone commuters from Clark County to Oregon are white, Non-Hispanic.  Two-thirds of Tubman Middle School Students are persons of color (including multi-racial).
  • Nearly half of all Tubman Middle School Students qualify for free and reduced price meals.
  • A majority of those who live in the project area (Census Tract 23.03), commute to work by transit, bike or walking.

Freeway widening doesn’t repair the Albina Neighborhood

Widening the freeway does nothing to fix the decades long scar ODOT inflicted on the neighborhood in the 1960s. ODOT demolished and never replaced over 300 homes in the Interstate 5 freeway right of way. Truly mitigating that damage would require providing $140 million to build new housing to replace that lost–not making the freeway even wider.

The freeway widening–coupled with other urban renewal projects–triggered neighborhood decline that  led to the displacement of 60 percent of the neighborhood’s population (more than 1,700 persons). The neighborhood’s black population declined. Freeways and the traffic they generate are intrinsically inimical to healthy urban spaces.

The covers proposed for the freeway are needed just to facilitate construction (allowing traffic to be re-routed when existing overpasses are demolished). Surrounded by high-volume, fast-moving, auto-dominated arterials, they’re genuinely unpleasant spaces for humans. They’re badly fragmented and poorly located to provide meaningful public space.  ODOT has no plans or budget to build them strong enough to support buildings (which would likely be cost-prohibitive while spanning a widened active freeway right-of-way).

Bicycle and Pedestrian access is impaired by freeway widening

The Rose Quarter freeway widening project would demolish the current Flint Avenue crossing over Interstate 5.  Flint Avenue is a low-speed, low-volume neighborhood street that provides a safe, limited-grade route between Northeast Portland and the Rose Quarter and Broadway Bridge.

The freeway widening project would replace the current Flint Avenue crossing with steep and indirect bike-pedestrian bridges on Hancock and Clackamas Streets. It is not clear from existing designs that the grade of these two structures would be compliant with the Americans with Disabilities Act. The City’s own Pedestrian Advisory Committee has prepared a devastating critique of the project’s impacts on neighborhood walkability.

The project would build a diverging diamond interchange along Williams Avenue, in which car traffic would be placed on a two-way street with cars running in the left lanes–the opposite of every other street in the city. This would create a hostile environment for bicyclists and pedestrians. In addition, at the intersections of Broadway, Weidler, Vancouver and Williams, ODOT would increase the radius of curvature, accelerating car traffic across crosswalks in this area. The project's renderings paint a highly distorted picture of the project, showing five times as many pedestrians and forty times fewer cars than actually use city streets.

Cost overruns

ODOT has routinely experienced cost overruns of 200% or more on its major projects.  Its largest highway project, a five-mile widening of US 20 between Newport and Corvallis, went more than 300% over budget, rising from $110 million to more than $360 million. Its Grand Avenue Viaduct project in Portland went 300% over budget, costing $98 million rather than $31 million.  The Woodburn Interchange on I-5 ballooned from $25 million to $70 million.

Editor's Note: This commentary was updated on March 26th to reflect additional information published on City Observatory.

Widening the I-5 Freeway will add millions of miles of vehicle travel

We can calculate how much added freeway lanes will induce additional car travel

The takeaway:  the I-5 freeway widening project in Portland will lead to 10 to 17 million more miles of vehicle travel annually, which will in turn produce thousands of tons of additional greenhouse gas emissions.

A key part of the selling point for the proposed $500 million Rose Quarter Freeway widening project is the improbable claim that widening the freeway will reduce traffic congestion and not stimulate additional vehicle travel. That claim flies in the face of decades of experience and widely published research showing that, with great predictability, more freeway capacity generates proportionately more traffic, traffic congestion and pollution. The phenomenon is now so well established in the literature that it is called the “Fundamental Law of Road Congestion.”

It’s also so well-established that researchers at the University of California Davis have built an on-line calculator that allows you to compute how many more vehicle miles of travel a wider freeway will produce.  Their calculator is based on a careful review of the research on induced demand, and generates an estimate of the number of additional vehicle miles of travel that will be produced by each additional lane-mile of freeway in a city. Here’s the calculator showing how much adding 1.6 lane miles of freeway would increase travel in the San Diego metropolitan area (about 11 million miles per year).

 

Because it's based in California, the model is calibrated for California metro areas, not Portland. To get a rough idea of what kind of impact we might expect from a freeway widening project in Portland based on this model, we looked at the model's computations of the impact of freeway widening projects in the three California metropolitan areas most similar in population size to Portland (Sacramento, San Diego and San Jose).  For each city, we used the calculator to generate estimates of the additional vehicle miles of travel associated with adding 1.6 lane miles of interstate freeway.  Here are the results: in these California cities, adding 1.6 lane miles of freeway would be expected to generate between 8 million and 12 million additional miles of vehicle travel.

Of course, Portland isn't exactly like any of these three cities.  So using the data from the model, we developed our own estimate of the likely effects of induced demand from freeway expansion in Portland.  The section of I-5 that would be widened as part of this project carries about 120,000 vehicles per day, and has two lanes in each direction.  According to the Environmental Assessment's Traffic Technical Report (page 9), the segment to be widened is about 4,300 feet, or 0.8 miles, long; adding two lanes therefore adds about 1.6 lane miles of freeway. The Induced Travel Calculator assumes a unit elasticity of vehicle miles traveled with respect to added capacity: a 1 percent increase in lane-miles generates a 1 percent increase in freeway travel.

We applied this unit elasticity factor to the expansion of the freeway in the Rose Quarter area.  This 0.8 mile stretch of freeway currently has about 96,000 vehicle miles of travel daily (120,000 × 0.8), or about 35 million vehicle miles of travel per year.  This project increases the number of lane miles of freeway by 50 percent, from 3.2 lane miles (4 × 0.8) to 4.8 lane miles (6 × 0.8).  The unit elasticity of lane miles to vehicle miles traveled means that a 50 percent increase in lane-miles should produce a 50 percent increase in vehicle miles traveled, or about 17.5 million additional vehicle miles of travel.
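The arithmetic above is simple enough to sketch in a few lines of Python; every input comes from the text, and the unit elasticity is the calculator's long-run value:

```python
# Induced travel from the Rose Quarter widening, per the
# unit-elasticity method described above.
daily_vehicles = 120_000               # average daily traffic on the segment
segment_miles = 0.8                    # length of the widened section
lane_miles_before = 4 * segment_miles  # 3.2 lane-miles today
lane_miles_after = 6 * segment_miles   # 4.8 lane-miles after widening
elasticity = 1.0                       # unit elasticity of VMT to lane-miles

annual_vmt = daily_vehicles * segment_miles * 365           # ~35 million
capacity_growth = lane_miles_after / lane_miles_before - 1  # 0.5, i.e. +50%
induced_vmt = annual_vmt * capacity_growth * elasticity     # ~17.5 million

print(f"Existing annual VMT: {annual_vmt / 1e6:.1f} million")
print(f"Induced annual VMT:  {induced_vmt / 1e6:.1f} million")
```

Note that the 50 percent figure depends only on the ratio of lane-miles, so it is the same whether you count lanes per direction or in total.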

This estimate is higher than the estimate for the three California cities presented above.  It's important to keep in mind that the results from the calculator are based on the average traffic volume on all freeways in a metro area; because this segment is much more central and more heavily used, it has a higher base of traffic than the typical freeway in a metro area.  Taking the evidence from the UC Davis calculator together with our application of its methods to the Rose Quarter project, we estimate that widening the I-5 freeway by adding one lane in each direction for about 0.8 of a mile would be expected to add between 10 million and 17 million additional vehicle miles of travel in the Portland metro area.

We can go one step further and estimate approximately how much additional greenhouse gas emissions will result from this added driving. The transportation planning firm Fehr & Peers has yet another calculator for converting additional vehicle miles of travel into greenhouse gas emissions, with estimates calibrated to Western US metropolitan areas. Their calculator suggests that each additional thousand miles of driving is associated with about 0.466 tons of greenhouse gases; that means the Rose Quarter freeway widening project will produce between 4.7 and 7.9 thousand tons of additional greenhouse gas emissions per year.
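That conversion is easy to check directly. The 0.466 tons-per-thousand-miles factor is the Fehr & Peers figure cited above, and the 10 and 17 million mile endpoints are our estimates from the previous section:

```python
# Convert induced vehicle miles traveled into annual greenhouse gas
# emissions, using the Fehr & Peers factor cited above.
TONS_PER_THOUSAND_MILES = 0.466

def ghg_tons(annual_vmt: float) -> float:
    """Annual tons of greenhouse gases from a given annual VMT."""
    return annual_vmt / 1_000 * TONS_PER_THOUSAND_MILES

low, high = ghg_tons(10_000_000), ghg_tons(17_000_000)
print(f"{low / 1000:.1f} to {high / 1000:.1f} thousand tons per year")
```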

It doesn’t matter what you call the added lanes

And we don't buy for a minute that it matters in any way that ODOT wants to call the additional lanes it's building "auxiliary lanes."  If the point is that the right hand lane on I-5 at the Rose Quarter handles merging traffic, that is true whether the facility is two lanes in each direction or three.  If we apply ODOT's logic and nomenclature to the current setup, the freeway now consists of one through lane and one auxiliary lane–and the proposed project would increase that to two through lanes and one auxiliary lane. Using sophistry and shifting definitions doesn't change the fact that this project adds lane miles of freeway. And more lane miles of freeway, as these calculators show, produce millions more miles of driving and thousands of tons more greenhouse gas emissions every year.

Freeway widening for whomst?

There’s a huge demographic divide between those who use freeways and neighbors who bear their costs

When it comes time to evaluate the equity of freeway widening investments, it's important to understand that there are big differences between those who travel on freeways and those who bear the social and environmental costs in the neighborhoods the freeways traverse. Our equity analysis of the proposed half-billion dollar I-5 Rose Quarter freeway widening project shows:

  • Peak hour, drive-alone commuters from Clark County to Oregon have a median household income of $82,500, 50 percent greater than that of the transit, bike and walk commuters living in North and Northeast Portland ($53,900), and more than three times that of carless households in North and Northeast Portland ($23,500).
  • Two in five (40 percent) peak hour, drive-alone commuters from Clark County to Oregon have household incomes of more than $100,000.
  • Three-quarters of peak hour, drive-alone commuters from Clark County to Oregon are white, Non-Hispanic.  Two-thirds of Tubman Middle School Students are persons of color (including multi-racial).
  • Nearly half of all Tubman Middle School Students qualify for free and reduced price meals.
  • A majority of those who live in the project area (Census Tract 23.03) commute to work by transit, bike or walking.

How should we judge the equity or fairness of our transportation system, and of proposed investments?  One way is to look at the demographic characteristics of those who receive the benefits and who bear the costs of these investments. Today, we take a close look at the allocation of the costs and benefits of the proposed $500 million Rose Quarter Freeway widening project in Portland.  The project would widen the freeway from four lanes to six, and rebuild associated interchanges to create straighter, faster routes for vehicles entering and leaving the freeway.  The project also claims benefits for bikes and pedestrians, but that’s actually very questionable, as the project eliminates entirely one low-speed, pedestrian and bike friendly street that crosses the freeway (Flint Avenue), and creates a pedestrian and bike hostile miniature diverging diamond interchange (where multiple lanes of traffic will be traveling on the wrong (left) side of the road, to speed cars on and off the freeway). The wider freeway will add more traffic and more emissions to the neighborhood.

For this analysis we compare and contrast the demographics of two different groups:  peak hour freeway users and neighborhood residents.  Peak hour freeway users will be the primary beneficiaries of the Rose Quarter Freeway widening project.  On a regular basis, this portion of Interstate 5 is congested in a very particular pattern:  daily flows of commuters from Washington state (southbound in the morning, and northbound in the afternoon) are primarily responsible for congestion in this area.  The impacts of this project, in terms of local traffic and air pollution, are primarily felt by those who live, work and go to school in the project area.  Our analysis looks in more detail at two groups:  drive-alone peak hour car commuters from Clark County Washington who work in Oregon, and persons who live or attend school in the project area. We look at three different sub-groups of persons living in the project area:  residents of the Census Tract in which the project is located (Census Tract 23.03), persons living in North and Northeast Portland, and students attending the Tubman Middle School (which is located immediately adjacent to the Interstate 5 freeway).  We look at two principal socioeconomic dimensions of these groups:  household income and race/ethnicity.

Household Income

There's a huge income disparity between those who commute by car on freeways from Washington to jobs in Oregon, and those who live in the project neighborhood and walk, bike or take transit to their jobs.  Peak-hour solo car commuters from Clark County, Washington, have incomes more than 50 percent higher than those residents of the neighborhood affected by the freeway widening who walk, ride bikes or take buses to their jobs.

How do the incomes of those who will benefit from the freeway widening project (peak hour drive alone commuters, many from Washington State) compare to the incomes of those who will be affected locally by the increased traffic and emissions?  There will undoubtedly be winners and losers from the Rose Quarter freeway widening project. While the project allegedly saves travel time for those who commute on the freeway, there are few if any benefits for those who ride transit, bike or walk in the project area. By many standards, the Rose Quarter project will make things worse.  Bike riders will face more circuitous and steeper routes.  Pedestrians will have to deal with “wrong-way” traffic on the project’s miniature diverging diamond interchange, and also cope with wider turn radius intersections–and faster moving cars–at the project’s “improved” freeway on-ramps.

We used Census data to look at the average household income of persons commuting during peak hours by themselves by automobile from homes in Clark County Washington to jobs in Oregon.  Using data from the American Community Survey’s Public Use Microsample, we looked for solo car commuters who left their homes in Clark County between 6:30am and 8:30 am on a typical day. On average, the median peak hour solo car commuter had a household income of $82,500.

For contrast, we computed the incomes of persons who live in North and Northeast Portland and who commute to work by walking, riding bicycles or by transit (including bus, streetcar and light rail).  The median walk/bike/transit commuter living in these neighborhoods had a household income of just $52,900.  We can also look at the average income of non-car-owning households in North and Northeast Portland. By definition, these households are largely dependent on transit, walking and cycling to meet their travel needs.  The average income of non-car-owning households in this area is $23,000.

Race and Ethnicity

There are profound differences in the race and ethnicity of those who will be the primary beneficiaries of this project (peak hour, drive alone commuters, chiefly from Washington State), and those who will bear the environmental consequences of the project (exemplified by Tubman Middle School students) who will breathe the emissions and cope with the increased car traffic associated with the project.  This is apparent when we examine the racial and ethnic composition of these two groups.  Peak-hour drive alone commuters from Washington State to jobs in Oregon are overwhelmingly White and non-hispanic; students at Tubman are overwhelmingly persons of color.

A majority of students attending Harriet Tubman Middle School are students of color, according to demographic data collected by Portland Public Schools.  About two-fifths of students are African American, about one in seven are Latino, and just one in three are white, non-Hispanic.

Clark County auto commuters to Oregon are overwhelmingly white, non-Hispanic.  The American Community Survey provides data on the race and ethnicity of peak-hour drive alone commuters who travel from Clark County to jobs in Oregon.  Three quarters of all drive-alone peak hour car commuters from Vancouver are white, non-Hispanic.

In addition to being primarily persons of color, students at Tubman come from households with high levels of economic distress. Roughly half of all students (48.9 percent) attending Harriet Tubman Middle School qualify for free- or reduced-price meals, an indicator of low socio-economic status, according to data compiled by Portland Public Schools.

Widening a freeway through a neighborhood that doesn’t drive

In addition, data from the Census allows us to look in more detail at the usual mode of transport to work for persons living in the immediate vicinity of the project.  This area is Census tract 23.03, an area that includes almost the entire portion of the freeway to be widened, as well as housing and commercial areas on either side of the freeway. According to the latest Census data, a majority of the persons living in this area commuted by transit, biking or walking.  Only a third of local residents commuted alone by automobile.  This makes this neighborhood one of the most car-free in the city.  Census tract 23.03 has the highest proportion of bike, walk and transit commuters of any neighborhood outside downtown Portland.  Fully 97 percent of Multnomah County residents live in neighborhoods that have lower levels of transit use, cycling and walking than this Census tract.

Journey to work by mode, Census Bureau, American Community Survey, Census Tract 23.03.

One of the so-called rationales for the freeway project is to somehow repair the damage to the neighborhood caused by construction of the freeway in the early 1960s.  It's difficult to understand how widening the damaging freeway redresses these damages. As we've documented at City Observatory, the construction of the freeway led the Oregon Department of Transportation to demolish more than 300 homes, which it never replaced.  What the freeway expansion clearly does, however, is repeat the historical injustice done by freeway construction in the first place:  subsidizing travel for higher income persons who live outside the neighborhood, while doing essentially nothing to better meet the needs of lower income persons who live in and near the project's location.

Technical notes:  For North and Northeast Portland, we used data from Public Use Microsample Areas (PUMAs) 01301 and 01305, which include all of North Portland, most of Northeast Portland, and some portions of Southeast Portland. PUMAs are the smallest geography for which income by mode data are available for Portland. Data are for the 2017 ACS one-year sample. These data are from Ruggles, et al.

Steven Ruggles, Sarah Flood, Ronald Goeken, Josiah Grover, Erin Meyer, Jose Pacas, and Matthew Sobek. IPUMS USA: Version 8.0 [dataset]. Minneapolis, MN: IPUMS, 2018. https://doi.org/10.18128/D010.V8.0

Angie’s List: The problem isn’t ride hailing, it’s the lack of road pricing

Streetsblogger extraordinaire Angie Schmidt is not happy with Uber and Lyft. They’re not really the ones to blame.

Are Uber and Lyft to blame for growing urban transportation problems? Streetsblog's Angie Schmidt makes a strong case that they're the villains. Her February 4 article starts out tough:

All the bad things about Uber and Lyft in one simple list: more traffic, fewer transit trips, more traffic deaths, greater social stratification. It's a comprehensive list. And it's long.

And gets tougher.

Here’s the latest evidence that Uber and Lyft are destroying our world

Angie’s list has a pretty damning bill of particulars.  She ticks off ride-hailing for increasing driving, dead-heading, operating in transit-served areas, reducing transit use, increasing crashes, reducing cycling and walking, and hoarding data. There’s no question in our minds that too many cars are a big problem for cities, for transportation, for equity, for safety and for the environment. We think it’s important to look deeper at the underlying causes of “too many cars” rather than to focus blame just on ride-hailing.

At City Observatory, we're huge fans of Streetsblog, the nation's go-to sounding board for transportation policy, and regularly count on ace reporter Angie Schmidt both to bring us up to date on what's happening around the country and to offer trenchant observations. It's rare that we're in anything but enthusiastic agreement with her posts. This is one of those rare exceptions.

Don’t hate the player, hate the game

Rather than being the cause of our urban transportation problems, ride-hailing services like Uber and Lyft are symptomatic of underlying deep flaws in transportation, the most important of which is that we don’t price road space, particularly at the peak hour and particularly in dense urban environments at anything approaching its value.

There’s huge demand to travel in cities, and it has really only been checked by the price of parking and the (historic) numerical limits on for-hire vehicles.  As we’ve pointed out at City Observatory, parking prices serve as surrogate road pricing, and discourage people from driving their personal vehicles to downtown areas. Similarly, the limited number (and relatively high price) of taxis under the medallion system meant that taxis weren’t economically or numerically likely to overwhelm city streets.

The high cost of parking in city centers is a key reason that people ride transit. When bus fare is lower than parking costs, people tend to ride transit.  When parking is free, people drive their private automobiles.

There’s always been a huge, unrequited demand for peak hour travel in urban cores.  People are willing to pay a premium to travel in these spaces–which is why ride-hailing firms are able to impose surge pricing.

What Uber and Lyft have done is evade the two big limits on bringing more cars into central cities. They don't have to pay for parking, because they don't park.  And they've side-stepped the numerical limits on the number of for-hire vehicles allowed, which have been imposed by medallion schemes (although New York has recently capped the number of ride-hailed vehicles).

But let's be clear–the underlying problem is not so much Uber and Lyft per se, it's the fact that this very valuable, scarce resource–city streets–is unpriced.  These companies are monetizing and capturing that value for themselves because we don't charge anyone to use the streets.

And of course, our current debate about ride hailed vehicles is just a small scale dress-rehearsal of the challenges we'll face when autonomous cars become plentiful. Fleets of autonomous cars will overwhelm city streets if those streets aren't priced. Not having to pay a driver will cut fares by more than half from current levels (to as little as 25 to 40 cents per mile, cheaper than transit fares), and it's likely that even owners of private cars will find it cheaper to have them slowly circle the block rather than pay for parking.

Imposing road pricing (or surcharges) just on ride-hailed vehicles or taxis misses the point that privately owned vehicles cause just as much congestion as ride-hailed ones. Trip taking is similar, and cruising for parking is probably as big a source of “wasted” miles in city centers as is dead-heading or cruising for fares.

To be sure, by tapping the latent demand that was held in check by parking prices and taxi limits, ride-hailing firms have slowed traffic in downtowns like New York and San Francisco. And that has the knock-on effect of reducing bus speeds and productivity, which probably explains part of the reason transit ridership is down (as do lower gas prices).  While it may be emotionally satisfying to paint Uber and Lyft as the villains here, the real problem is our unwillingness to price streets.

In at least one important respect, Uber and Lyft have performed a very important public service: they’ve educated millions of Americans about the marginal cost of automobile travel. Every ride-hailed trip is itemized, and priced by the mile and the minute. And surge pricing begins to show the high value (and cost) associated with travel in peak demand periods (although it allows the companies to capture this economic rent, rather than give it to the public, where it belongs). Moving in that direction is where we’ll find a solution to all of the problems on Angie’s list: The more car use is priced by the mile, and reflects congestion costs, the more efficient our transportation system will be.

Although they’ve been far from model corporate citizens in many respects, to their credit, both Uber and Lyft have endorsed road pricing. Rather than simply vilify them and ignore the more fundamental problem, we ought to be working together to fix a flawed system. It would help to do it before autonomous vehicles make this problem an order of magnitude worse than it is now.

Backfire: How widening freeways can make traffic congestion worse

Widening  I-5 in Portland apparently made traffic congestion worse

Oregon’s Department of Transportation (ODOT) is proposing to spend half a billion dollars to add two lanes to Interstate 5 at the Rose Quarter in Portland, with the hope that it will help relieve traffic congestion. But practical experience with freeway widenings in this area shows that more capacity actually makes the traffic worse. Today we show evidence that when ODOT widened I-5 between Lombard and Victory Boulevard a few years ago, it only managed to funnel more traffic more quickly into the I-5 Columbia River bridge chokepoint. The result: the bridge actually carried less peak hour traffic than before.

If a sink isn’t emptying rapidly enough, pouring more water into it only causes it to overflow more.

A bit of orientation: Interstate 5 is one of two major freeway connections between Vancouver, Washington and Portland, Oregon. There's a large daily flow of commuters from residences north of the river in Washington to jobs in Oregon. Travel across the I-5 bridge (and I-205, a parallel route some five miles to the east) is heavily southbound in the morning and northbound in the evening. As in most US cities, PM peaks are more pronounced, and travel slower, than morning peaks.

As we related in an earlier commentary, in 2010 the Oregon Department of Transportation completed a $70 million project to widen a mile-long portion of I-5 between Lombard and Victory Boulevard in North Portland to, in their words, eliminate a bottleneck.

Our earlier analysis examined the traffic crash record for that stretch of freeway, noting that rather than decreasing crashes, the crash rate actually increased after the freeway was widened. So that part of the project didn’t work.  But did “fixing” the bottleneck make the freeway work better?

Today, we take a look at traffic flows across the Columbia River I-5 Bridge, just north of the freeway widening project. In theory, removing the bottleneck should cause traffic to flow more freely.

But what appears to have happened is that the wider I-5 just funneled more peak hour traffic, more quickly, into the bridge area. The result is that the roadway jams up more quickly, and backups occur earlier and last longer, so the freeway actually carries fewer cars than it could if traffic volumes were effectively metered more evenly by somewhat lower capacity upstream of the bridge.

As traffic engineers well know, there’s a kind of backward bending speed-volume relationship.  A highway can carry a certain amount of traffic at a relatively high speed, but as more cars are added, the freeway both slows down–and loses capacity. What additional capacity can do is more quickly push a highway past this “tipping point” resulting in slower traffic and lower throughput.
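To make the "backward bending" idea concrete, here's a minimal sketch using the classic Greenshields speed-density model. This is our choice of illustration, not a model the agencies cite, and the free-flow speed and jam density below are made-up round numbers:

```python
# Greenshields model: speed falls linearly from free-flow speed to zero
# as density rises toward jam density. Flow = density * speed, so
# throughput peaks at half of jam density, then declines.
V_FREE = 60.0   # free-flow speed, mph (illustrative assumption)
K_JAM = 200.0   # jam density, vehicles per lane-mile (illustrative)

def flow(density: float) -> float:
    """Vehicles per lane per hour at a given density (veh per lane-mile)."""
    speed = V_FREE * (1 - density / K_JAM)
    return density * speed

# Below the tipping point, adding cars raises throughput...
moderate = flow(50)    # 50 veh/mi at 45 mph -> 2,250 veh/hour
peak = flow(100)       # 100 veh/mi at 30 mph -> 3,000 veh/hour (the maximum)
# ...past it, adding cars cuts BOTH speed and throughput:
congested = flow(150)  # 150 veh/mi at 15 mph -> 2,250 veh/hour
```

The specific numbers are invented, but the shape is the point: past the critical density, each additional car means both slower speeds and fewer vehicles served per hour.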

Source: Washington State Department of Transportation

Data provided by the Clark County Washington Regional Transportation Council (RTC) seems to show that’s just what happened on I-5 after the 2010 widening project was completed. Prior to the completion of the project, the Northbound peak hour flows over the I-5 bridges were always somewhat greater than 5,000 vehicles per hour, fluctuating between 5,000 and 5,500 vehicles per hour on a typical peak hour on a Thursday afternoon (a data point selected as typical by RTC). The following chart shows peak hour traffic on I-5 in October 2006, with the blue line corresponding to Northbound traffic.

Notice that for the three lanes of I-5, 5,000 vehicles per hour works out to about 1,700 vehicles per lane per hour, squarely in the "yellow zone" where traffic speeds and capacity become unstable. These data also show that the morning and afternoon peak volumes are approximately equal (topping out at just over 5,000 vehicles).

Here's a similar chart for 2016 (the most recent year available). On this typical Thursday, northbound traffic never reached 5,000 vehicles per hour. Notice that the morning peak remains at its earlier level–it's only the afternoon volumes that have fallen. This 10 percent reduction in throughput over the I-5 bridges in the northbound direction in the afternoon peak is likely a result of funneling additional traffic onto I-5, thanks to the freeway widening and ramp "improvements" ODOT put in place since 2006.

The Regional Transportation Council has generated similar charts for selected years between 1983 and 2016.  (They’ve even got a clever animation of these charts showing how the pattern of traffic has changed over time). We’ve aggregated the data for all the years reported by RTC into a single chart showing the maximum PM peak hour volume traveling Northbound across the I-5 bridges.

After 2010, peak hour volumes on I-5 northbound have been consistently below 5,000 vehicles per hour, ranging between 4,400 and 4,600 between 2012 and 2016.  (Again, RTC selected data for even-numbered years.)  What these data show is that the hourly volume of cars crossing the I-5 bridges at the peak hour has fallen close to 10 percent since the bottleneck was removed.

Again, as we noted earlier, southbound traffic in the morning peak hour continues to flow at a rate of about 5,000 vehicles per hour. Our statistical analysis is admittedly a first pass, but we've seen nothing in ODOT's analysis of I-5 operations that suggests it's incorrect.

This analysis points up the futility of “bottleneck busting” incremental freeway expansion.  Widening a freeway at one point simply delivers more traffic, faster, to the next bottleneck in the system, causing it to be the new source of the problem.

Ironically, bottlenecks at one point in the system act as “meters” to control the flow of traffic on to subsequent sections of the roadway. Delaying traffic at one point–as we do intentionally with ramp meters–allows the downstream sections of the roadway to flow without exceeding capacity and moving into the backward bending part of the speed/volume relationship.

The beneficial effects of this metering process are apparent in Seattle's recent experience with the closure of the Alaskan Way Viaduct.  This limited access highway, which carried 90,000 vehicles per day through downtown Seattle (I-5 at the Rose Quarter carries about 121,000), was closed in mid-January 2019. Despite predictions of "Viadoom," based on the theory that traffic would spill over onto adjacent city streets and overwhelm the parallel segment of I-5 in Seattle, traffic in Seattle was, if anything, somewhat better than before the viaduct closed. The Seattle Times reported, "the cars just disappeared." By shutting down the flow of traffic from the viaduct to Seattle streets, the closure reduced the demand on those streets and enabled traffic to flow more smoothly.

The practical experience with widening I-5 shows that eliminating bottlenecks in one place simply leads to the more rapid congestion of the next downstream bottleneck, and ironically, lower throughput on the freeway system.  It might seem paradoxical that highway engineers would allow this to happen, but if you’re more interested in generating excuses to build things, rather than actually managing traffic flows, it makes some sense.  As we’ve argued before, it seems as if highway engineers treat the sisyphean aspects of perpetually chasing bottlenecks, not as a bug, but as a feature. To them, the fact that widening one stretch of freeway to eliminate one bottleneck simply creates another one is a guarantee of permanent employment, not a fundamental flaw in engineering practice.

Rose Quarter freeway widening won’t reduce congestion

Spending half a billion dollars to widen a mile of I-5 will have exactly zero effect on daily congestion.

The biggest transportation project moving forward in downtown Portland isn’t something related to transit, or cycling (or even bringing back shared electric scooters). It’s a proposal to spend half a billion dollars widening a mile long stretch of Interstate 5 adjacent to the city’s Rose Quarter.

Build it and they will drive. Wider freeways produce more traffic, not less congestion.

The project's been advertised as a congestion-busting, bottleneck removal project. But sadly, even if the state spends a half billion dollars here, daily traffic conditions won't improve. We know that's true because of the well-documented phenomenon of induced demand. And as it turns out, even state and local transportation experts have conceded that this will be the case.

Induced Demand:  The futility of widening freeways

Time and again, cities around the US and around the world have widened freeways with the avowed purpose of reducing congestion. And it’s never worked. One need look no further than the current US record holder for widest freeway, Houston’s 23-lane Katy Freeway. It was most recently expanded in 2010, at a cost of $2.8 billion, to reduce congestion, and was even touted by road-building advocates as a poster child for freeway widening projects. But, as we’ve reported at City Observatory, less than three years after it opened, peak hour travel times on the Katy Freeway were 30 to 55 percent longer than they had been before the freeway was widened. The added capacity was swamped by induced demand, and congestion–and pollution and sprawl–were worse than ever.

No matter how many lanes you add, it ends up like this.

If there were ever any doubt that this was the case, all one had to do was pay attention to what happened when Seattle abruptly closed its Alaskan Way Viaduct, a limited access highway carrying about 90,000 cars a day through the city’s downtown. (For reference, I-5 at the Rose Quarter handles 121,000.) City leaders warned of Carmageddon, gridlock and “Viadoom”–that downtown streets and freeways would be overwhelmed by the traffic usually carried on the viaduct. But in the two weeks following the viaduct’s closure, traffic was at or below normal; as the Seattle Times reported, “the cars just disappeared.” The reason is the inverse of induced demand: when you reduce the amount of urban freeway space, traffic doesn’t simply back up, it actually goes away (as people take other modes, change when they take their trips, substitute more local destinations for further away ones, and consolidate trips). Far from being a fixed quantity, traffic is like a gas that expands to fill the space available.

This phenomenon is now so well-documented that it is referred to in the published journals of economics as “The Fundamental Law of Road Congestion.” Adding more un-priced highway capacity in urban settings only generates more traffic and does nothing to lower congestion levels.
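The magnitude of the effect is what makes widening futile: the research behind the “fundamental law” (Duranton and Turner) estimates that vehicle miles traveled rise roughly one-for-one with lane miles. Here is a minimal sketch of that arithmetic; the elasticity value and the baseline figures are illustrative assumptions, not measured values:

```python
# Sketch of the "fundamental law of road congestion": if the elasticity of
# VMT (vehicle miles traveled) with respect to lane miles is about 1.0,
# added capacity is fully absorbed by new driving, so congestion
# (VMT per lane mile) stays roughly constant.
# The elasticity and baseline figures here are illustrative assumptions.

def vmt_after_expansion(base_vmt, base_lane_miles, new_lane_miles, elasticity=1.0):
    """Predict VMT after a capacity change under a constant-elasticity rule."""
    return base_vmt * (new_lane_miles / base_lane_miles) ** elasticity

base_vmt, base_lanes = 1_000_000.0, 100.0
wider_lanes = 110.0  # a 10 percent capacity expansion

new_vmt = vmt_after_expansion(base_vmt, base_lanes, wider_lanes)
congestion_before = base_vmt / base_lanes
congestion_after = new_vmt / wider_lanes

print(round(new_vmt))                                  # about 10% more driving
print(round(congestion_after / congestion_before, 6))  # 1.0: congestion unchanged
```

With an elasticity near one, a 10 percent capacity expansion induces about 10 percent more driving, leaving vehicles per lane mile, and thus congestion, essentially where it started.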

Even agency experts agree: It’s futile and won’t fix daily congestion

Even the staff of the two agencies most responsible for the project concede that this is the case. Mauricio LeClerc is a principal transportation expert for the Portland Bureau of Transportation. Here’s his testimony to the Portland Planning and Sustainability Commission.

When we did the analysis, the congestion benefit is on the elimination of crashes—non-recurring congestion. The congestion benefit of just adding more lanes was very limited.
Basically you’re fixing something.  Certainly, there’s an improvement, but it’s not very large.  If you are familiar with the freeway system, it’s congested to the north, it’s congested to the south, and if you’re going to I-84, it’s just going to be congested as you enter I-84.  So, it has limited utility, but it does have a very significant safety and non-recurring congestion benefit.  So, we’re not sure what the induced demand, if that gets modeled, it’s potential, but it’s not very large.
(Emphasis added)
Portland Planning and Sustainability Commission
February 28, 2017; at 37:00

This point was also confirmed by Travis Brouwer, a spokesman for the Oregon Department of Transportation, in response to questions posed by Jeff Mapes of Oregon Public Broadcasting.

Jeff Mapes:  It’s interesting, ODOT’s arguments—that’s the Oregon Department of Transportation—you know they’ve shifted a bit since the battleground has shifted now from the State Legislature to the City of Portland.  And they’re emphasizing now more the safety concerns—there are a lot of crashes there—but frankly the large majority of them are fender benders and that sort of thing, and secondly, but basically, they are saying if you take care of a lot of those fender benders, it’s going to reduce a lot of delays that frequently happen there.
Here’s Travis Brouwer, he’s the assistant director of ODOT. He makes sort of their subtle case for the project, I guess:
Travis Brouwer: We fully admit that this is not going to eliminate congestion at the Rose Quarter. But we do expect it will make traffic a lot better.
OPB Politics Now, October 12, 2017
(Emphasis added).

There’s a bit of nuance here that both LeClerc and Brouwer are alluding to:  The project won’t reduce congestion, except perhaps congestion related to crashes. You’ll notice that LeClerc makes reference to the congestion benefit of the elimination of crashes and “non-recurring congestion benefit.” Here’s the translation from engineering speak:  Roads get jammed up for two reasons:  first, the regular daily flood of traffic at the peak hour, and second when there’s a crash. What LeClerc and Brouwer are saying is this project will do nothing to reduce the regular daily traffic jams on I-5. As to that non-recurring component, lowering congestion by reducing crashes–we’ll take a close look at that in part II of this analysis.

A wider freeway won’t mean less daily traffic congestion.  Even though it seems like spending half a billion dollars ought to make a difference, it won’t.  The Rose Quarter freeway widening project is either a half-billion dollar ritual sacrifice to the freeway gods, or the world’s most expensive piece of performance art. But there is one thing it is surely not: any kind of solution to daily congestion on a freeway at the center of one of the nation’s most vibrant metropolitan areas.

More driving, more dying: Dangerous by Design, 2019

More driving and our car-oriented transportation system killed 50,000 pedestrians in the past decade

Smart Growth America has released the latest edition of its annual report, Dangerous by Design, looking at pedestrian deaths and injuries. Once again, this is grim reading, but the report, as always, is a vital public service that brings home just how serious this problem is.  It also shows that the problem has gotten steadily worse. This is no accident.  When we optimize urban environments for rapid vehicle travel, we incentivize behaviors that put pedestrians at risk. The good news, if there is any, is that some cities have much lower rates of pedestrian death and injury, and by studying what they do, and especially the way they’re built, we can see how we might reduce this devastating toll.

The Trend:  More Driving = More Dying

The major takeaway is that as driving has increased in the US over the past decade, more pedestrians have been killed, even though the number of car occupants dying in motor vehicle crashes has decreased over that same period. Our road safety problem is increasingly a problem of pedestrian deaths. That means that getting a reduction in fatalities–much less achieving vision zero–will require much stronger efforts to improve pedestrian safety.

Cross-sectional evidence: Sprawling, car-dependent cities kill pedestrians

Although pedestrian deaths are up almost everywhere, there are still enormous variations across cities in pedestrian death rates. It’s tempting to focus on a ranking of the “worst” and “best” cities for pedestrians, but the real value of this report comes not from blaming and shaming cities, but by providing a clear understanding of what features of cities and urban development are correlated with pedestrian safety.

In general, it’s sprawling, car-dependent cities that have the highest pedestrian death rates.  One of the big challenges in evaluating the rate and prevalence of pedestrian fatalities is coming up with a good denominator for pedestrian activity.  While we have copious data on car volumes–every city and state gathers data on ADT (“average daily traffic”) on major roadways–pedestrian counts are irregular, rare and incomplete. This particular data blindness is yet another sign of the car-centric nature of our “transportation” data.

Dangerous by Design proxies for this by looking at the share of the population that walks to work. Though work trips are only a fraction of all walk trips, it’s undoubtedly a good way of distinguishing places where walking is fairly common from places where it is not. The report combines its estimates of the number of pedestrian fatalities per 100,000 persons with the data on the share of the population that commutes to work on foot to compute a “pedestrian danger index,” which we think best represents the relative safety of different places for people who are walking.
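The arithmetic behind such an index can be sketched as follows. This is a minimal illustration assuming the index divides the per-capita fatality rate by the share of commuters who walk; the two metros and all of their figures below are made up for illustration, not the report’s actual inputs:

```python
def pedestrian_danger_index(annual_fatalities, population, walk_share_pct):
    """A Dangerous by Design-style index: pedestrian fatalities per 100,000
    residents, divided by the fraction of commuters who walk to work.
    A high value means many deaths relative to how much walking happens."""
    fatality_rate = annual_fatalities / population * 100_000
    return fatality_rate / (walk_share_pct / 100.0)

# Illustrative (made-up) inputs for two hypothetical metros:
sprawl_metro = pedestrian_danger_index(60, 2_000_000, 1.0)     # few walkers
compact_metro = pedestrian_danger_index(300, 20_000_000, 6.0)  # many walkers
print(round(sprawl_metro, 1), round(compact_metro, 1))
```

Note how normalizing by walking activity matters: the hypothetical compact metro has more total pedestrian deaths, but far more walking, so its index is an order of magnitude lower.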

What’s really striking about the statistic is that there’s a truly enormous variation among metropolitan areas. The pedestrian danger index in the worst cities is an order of magnitude greater than it is in the best cities.  For example, Orlando has a Pedestrian Danger Index of 313, which is more than ten times greater than in metropolitan New York City, where the index is 27.

In epidemiological terms, this wide variation is a strong signal of the importance of geography as a hazard.  If these were deaths from cancer, for example, there’s little question that we would regard cities with a high pedestrian death index as “hot spots” and look for ways to immediately reduce the risks from these environmental factors.

Multi-lane arterials are especially deadly for pedestrians

The persistent geographic pattern of high pedestrian death rates, concentrated in the sunbelt, signals that the very design of the road system is responsible for higher death rates. Dangerous by Design observes:

Part of the reason for this may be because much of the growth in these places occurred in the age, and the development scale of, the automobile. Previous research by Smart Growth America found that in general, the most sprawling metropolitan areas with wider roads and longer blocks typically cluster in the southern states. Furthermore, academic research has consistently linked these sprawling growth patterns to higher rates of both traffic-related deaths for people walking and traffic-related deaths overall.

One type of roadway seems to be especially dangerous for pedestrians:  multi-lane arterials. Unlike local streets with only one travel lane in each direction, multi-lane arterials pose numerous hazards.  They have higher speeds and encourage cars to pass one another. They complicate turning movements at intersections, and also increase the distance that pedestrians have to travel to cross streets. And in many cities, marked pedestrian crossings are so widely spaced that many will cross mid-block.

Detailed analysis of crash locations in Portland, Oregon shows that these arterials are the big killers. In its most recent State of Safety report, Portland’s Metro (the regional planning agency) says:

Arterial roadways comprise 73% of the region’s serious crashes, 77% of the serious pedestrian crashes, and 65% of the serious bicyclist crashes, while accounting for 12% of road miles.

Streets with more lanes have an especially high serious crash rate for pedestrians, producing higher crash rates per mile and per VMT as compared to other modes.

Too often, pedestrian safety gets treated as a communication or education problem:  if we just got people driving and walking to pay more attention, there’d be fewer fatalities. But the data tell a very different story:  we’ve designed a transportation system that, by prioritizing car travel and emphasizing speed, predictably kills people who walk and bike. It will take much more fundamental changes than a glitzy campaign or a catchy slogan to change that reality.

Economists & Scientists agree: To save the planet, we have to price carbon

One thing economists agree about: pricing carbon is essential to saving the planet; but if you don’t believe economists, you ought to believe Bill Nye, the Science Guy.

Economists are famous for disagreeing with one another. For every proposition, there is an equal and opposite economist. And even individual economists frequently have trouble settling on a single conclusion. Harry Truman famously lamented that he wanted a one-armed economist, so that he would never find his advice qualified with “on the other hand.”

Economists of all political stripes support carbon taxes

It’s striking, therefore, that the leading lights of the profession–conservative and liberal, freshwater and saltwater, Republican and Democrat–are all in broad agreement on what it will take to tackle our growing climate crisis.  Their collective statement on climate change was published by The Wall Street Journal.  It’s clear, direct and to the point.  We reprint it in its entirety here.

Global climate change is a serious problem calling for immediate national action. Guided by sound economic principles, we are united in the following policy recommendations.

I. A carbon tax offers the most cost-effective lever to reduce carbon emissions at the scale and speed that is necessary. By correcting a well-known market failure, a carbon tax will send a powerful price signal that harnesses the invisible hand of the marketplace to steer economic actors towards a low-carbon future.

II. A carbon tax should increase every year until emissions reductions goals are met and be revenue neutral to avoid debates over the size of government. A consistently rising carbon price will encourage technological innovation and large-scale infrastructure development. It will also accelerate the diffusion of carbon-efficient goods and services.

III. A sufficiently robust and gradually rising carbon tax will replace the need for various carbon regulations that are less efficient. Substituting a price signal for cumbersome regulations will promote economic growth and provide the regulatory certainty companies need for long-term investment in clean-energy alternatives.

IV. To prevent carbon leakage and to protect U.S. competitiveness, a border carbon adjustment system should be established. This system would enhance the competitiveness of American firms that are more energy-efficient than their global competitors. It would also create an incentive for other nations to adopt similar carbon pricing.

V. To maximize the fairness and political viability of a rising carbon tax, all the revenue should be returned directly to U.S. citizens through equal lump-sum rebates. The majority of American families, including the most vulnerable, will benefit financially by receiving more in “carbon dividends” than they pay in increased energy prices.

The letter is signed by a distinguished group of economists, including Nobel Prize winners, chairs of the Federal Reserve, Treasury secretaries, and chairs of the President’s Council of Economic Advisers.

George Akerlof, Robert Aumann, Angus Deaton, Peter Diamond, Robert Engle, Eugene Fama, Lars Peter Hansen, Oliver Hart, Bengt Holmström, Daniel Kahneman, Finn Kydland, Robert Lucas, Eric Maskin, Daniel McFadden, Robert Merton, Roger Myerson, Edmund Phelps, Alvin Roth, Thomas Sargent, Myron Scholes, Amartya Sen, William Sharpe, Robert Shiller, Christopher Sims, Robert Solow, Michael Spence and Richard Thaler are recipients of the Nobel Memorial Prize in Economic Sciences.

Paul Volcker is a former Federal Reserve chairman.

Martin Baily, Michael Boskin, Martin Feldstein, Jason Furman, Austan Goolsbee, Glenn Hubbard, Alan Krueger, Edward Lazear, N. Gregory Mankiw, Christina Romer, Harvey Rosen and Laura Tyson are former chairmen of the president’s Council of Economic Advisers.

Ben Bernanke, Alan Greenspan and Janet Yellen have chaired both the Fed and the Council of Economic Advisers.

George Shultz and Lawrence Summers are former Treasury secretaries.

Preparation of the letter was organized by the Climate Leadership Council.  More information is available at www.clcouncil.org.

And so does Bill Nye, the Science Guy.

One trope of the classic 1950s disaster movie was the coterie of scientists announcing to the world the nature of the new threat (Godzilla, aliens, etc).  In those movies, the world’s leaders believe what the scientists are telling them and act accordingly and even if the first solutions don’t work, they persist. Well, at least that’s how things work in the movie.  In reality, we seem to be very good at ignoring or denying credible warnings from smart people.

OK:  Appeals to authority may not work. Maybe we need simple dramatics for those of us who can only digest information in the simplest terms.  In that case, let’s turn the microphone over to Bill Nye, the Science Guy, with a media assist from Last Week Tonight’s John Oliver.

Nye explained how a carbon tax is essential to reducing greenhouse gas emissions and solving climate change:

The simple economics of carbon taxation.

Many believe that the problem of climate change is subject to a technical fix. Like engineer Montgomery Scott in an episode of Star Trek, some last-minute tweak to the ship’s engines (reversing the polarity of the dilithium crystals) will avert catastrophe in the nick of time (i.e., about minute 48 of this week’s show).

Solutions that are free and easy, and that someone else (i.e., Montgomery Scott) does all the work on, may seem preferable, but they’re hardly realistic.  In the face of a global catastrophe, we’ve got to expect to change our behavior. And frankly, Bill seems to be losing his patience with us.

Even if you believe technology is a critical part of the solution–perhaps especially if you believe technology can solve the problem–it’s important to recognize that pricing carbon is essential to providing the incentives and financial capital needed to develop that technology.  As long as fossil fuels are cheap, businesses and investors have no incentive to develop or buy such technologies; inventors will find it hard to raise capital for such technologies, and consumers will have little reason to buy them. Pricing carbon sends the right signals to everyone, both to stop using the dirtiest technologies, and to refine and improve the cleanest ones. Take it from economists, or from science guys.
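Point V of the economists’ statement, returning all revenue as equal lump-sum rebates, can be illustrated with a toy calculation. Every figure below (the tax rate and the household footprints) is assumed purely for illustration:

```python
# Toy illustration of a revenue-neutral carbon dividend (all numbers assumed):
# each household pays the tax in proportion to its emissions, and total
# revenue is returned as an equal per-household rebate. Households with
# below-average footprints come out ahead, as the statement's Point V claims.

tax_per_ton = 40.0  # assumed carbon tax, dollars per ton of CO2

# Hypothetical annual household emissions, in tons of CO2:
households = {"low_footprint": 10, "average": 20, "high_footprint": 30}

revenue = sum(tons * tax_per_ton for tons in households.values())
dividend = revenue / len(households)  # equal lump-sum rebate

for name, tons in households.items():
    net = dividend - tons * tax_per_ton  # positive means a net financial gain
    print(name, net)
```

The average-footprint household breaks even by construction; everyone below the average receives more in dividends than they pay in higher prices, which is the fairness argument for the lump-sum design.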

The high cost of low house prices

Low house prices signify problems, not affordability

There’s a presumption that low housing prices are a sign of affordability, and a related belief that if housing prices rise, it’s “a bad thing,” because it must mean that a neighborhood is becoming less affordable.

If only it were that simple.

To economists, prices are signals of the value that people attach to things.  In the case of housing, the price that someone is willing to pay for a house (or the rent one is willing to pay for an apartment) reflects collective judgments about the value or utility of that residence.  Some cities, some neighborhoods and some houses command very low prices (or rents) because no one wants to live there.  If a house is small, dingy, and poorly maintained, it will have a lower price than one that is roomy, clean, and in good shape. If a neighborhood is polluted, crime-ridden and has poor schools, its housing will have lower prices than one that is clean and green, safe, and has high-performing schools.

In effect, high or rising prices for real estate tend to reveal collective perceptions about the value and desirability of a house or neighborhood. This thinking is reflected in our analysis of what we call “the Dow of Cities.”  We’ve noted that centrally located housing has appreciated relative to suburban housing over the past two decades in the United States.  The greater appreciation of housing in close-in neighborhoods reflects in a tangible, monetary way, the growing relative value that Americans attach to being able to live in great urban environments.

Two recent articles explore what the ups and downs of real estate prices signal for housing. Both are drawn from rustbelt cities.  First, our friend and City Observatory commentator Jason Segedy dismantles the illusion that cheap houses are a sign of affordable neighborhoods. Everyone marvels at the affordability of housing in distressed Rustbelt cities.  Why, you can pick up a house for as little as $40,000 in some neighborhoods in Cleveland, Detroit, or Gary.  But low housing prices are a mirage and a delusion, according to Segedy.  He points out that the mortgage payment is just a fraction of the cost of ownership, especially for older homes. The reason they command such low purchase prices is that they typically have extensive deferred maintenance. He chronicles his own experience fixing up a relatively well-maintained older home in Akron.  After replacing the roof, a rotting deck, and a worn-out furnace, and attending to painting and other issues, he ended up spending as much on maintenance as on mortgage payments. In contrast, newer homes have higher mortgage payments, but–at least initially–much lower maintenance costs. And the kicker is that low home prices discourage maintenance, leading to the decay, and ultimately demolition, of housing.  It just doesn’t make sense to spend $10,000 on a new roof if it doesn’t increase the resale price of the house by at least that amount.  Consequently, the bargain prices of housing in some older, distressed cities are not a cause for affordability celebrations, but a sign of (and a contributor to) deep-seated malaise. As Segedy concludes, somewhat higher prices would actually be a good thing in many of these communities.

The second article comes from Pete Saunders, who relays a recent story from the Detroit News on rising home prices in the Motor City. According to the story, the median home sale price in the city has risen 41 percent, to $38,500.  Prices of less than $40,000 are still too cheap; as Saunders notes, prices this low “aren’t indicators of an affordable real estate market, but a broken one.” But the good news here is that prices are headed up.  That’s a sign of growing demand in Detroit, and it also means that homeowners who maintain or fix up their homes may have a reasonable expectation of seeing a financial return on their expenditures. The map of home price appreciation shows that gains have been greatest in neighborhoods close to the central business district.

As we’ve noted before, there’s always a strong geographic component to neighborhood revitalization. Values seldom move upward at the same rate in every neighborhood, even in a broad-based turnaround.  Some places will see larger gains, in large part because of the interdependency among different land uses. An area with new housing development is attractive for further housing development because it has achieved (or is achieving) critical mass. Similarly, such an area will be a favorable location for a new store, and in turn, the presence of a new store makes the neighborhood a more desirable place to live.

There’s a real bifurcation of US housing markets.  In some places, housing is expensive, with average home prices well above the cost of replacement, chiefly due to regulatory constraints on new construction. But according to analyses by economists Ed Glaeser and Joseph Gyourko, vast swaths of the nation have housing prices that are below replacement cost, meaning that owners have little incentive to invest in maintenance. They estimate that about a third of all housing in the US is in markets where the median price of housing is less than 75 percent of replacement cost.  That’s a good rough estimate of the share of the nation that finds itself in Akron’s or Detroit’s situation, where an increase in home prices would actually signal that things are getting better.
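The Glaeser-Gyourko threshold can be sketched as a simple classifier. The 75 percent cutoff comes from the text above; the replacement cost and sample prices are hypothetical illustrations:

```python
def market_condition(median_price, replacement_cost, threshold=0.75):
    """Classify a housing market by its price-to-replacement-cost ratio.
    Below the threshold, owners have little incentive to maintain homes,
    since repairs cost more than they add to resale value."""
    ratio = median_price / replacement_cost
    if ratio < threshold:
        return "below replacement: maintenance discouraged"
    return "at or above replacement: maintenance pays"

# Hypothetical markets, assuming a $200,000 replacement cost:
print(market_condition(38_500, 200_000))   # a Detroit-like median price
print(market_condition(450_000, 200_000))  # a coastal-like median price
```

The same $10,000 roof is rational in the second market and irrational in the first, which is the mechanism behind the decay-and-demolition spiral Segedy describes.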

Housing: Missing Middle or Missing Massive?

Gradually, more people and elected leaders are admitting that more housing density is needed if we’re to tackle housing affordability, and provide equitable opportunities to live in great cities and neighborhoods.

But like a swimmer cautiously dipping a toe in a fresh stream, we’re proceeding slowly.  It’s been (relatively) easy to talk about “missing middle” housing–duplexes, triplexes and other small-scale multi-family.  We can get on board with “gentle density.”

While missing middle reforms are a long-overdue step in the right direction, they don’t go far enough.  In many places, if we’re going to tackle this problem at scale, we need to be thinking much bigger.  At the 2024 Yimbytown Conference, Alex Armlovich tweeted out a provocative, only slightly tongue-in-cheek call for us to prioritize “Missing Massive” housing.

We don’t need massive housing everywhere, but in urban locations with great transit, amenities, solid infrastructure and economic opportunity, it makes sense to build more, not less.  One advantage of higher density is that, by definition, it requires less land, and therefore causes less disruption than adding an equivalent number of homes by replacing single family homes with duplexes or triplexes.

As we wrote a while back, like Roy Scheider’s character in Jaws, once we get a close look at the problem, we’ll realize we’re going to need a bigger boat.

In the words of Sheriff Brody (Roy Scheider), “You’re going to need a bigger boat.”

At City Observatory, we’re excited as anyone that there seems to be a growing groundswell of support around the nation for eliminating exclusively single-family zoning.  Minneapolis has garnered national headlines by legalizing duplexes and triplexes in zones once reserved for detached housing. Oregon has adopted a measure that bans exclusive single-family zoning in cities over 10,000 statewide.

These measures are an important step forward to facilitating some of the “missing middle” housing types that provide a range of affordable housing and greater income diversity, in what otherwise might be exclusive and expensive neighborhoods.  This is an important symbolic and rhetorical advance for the YIMBY movement, and a key breakthrough in changing the way we talk about neighborhoods and housing:  single family zoning is no longer sacrosanct and politically untouchable.

But, it must be said, this is only a first step.

Legalizing accessory dwelling units, duplexes, triplexes and fourplexes does hold the promise of adding to affordability while injecting some very “gentle density” into single family neighborhoods. But it is far from up to meeting the scale of our housing affordability problems. For that, we need to build larger multi-family buildings, including apartments.

Row houses vs. apartments

Consider two buildings, both built on similar sized lots, just three blocks apart on Northeast Seventh Avenue in Portland. Both were completed in the last two years.  Both are on relatively large corner lots that were purchased by developers in 2014.  At the corner of Seventh Avenue and Thompson Street, a developer tore down a small existing home on a 12,500 square foot lot and built four adjoining row houses called the “New Albina Rowhouses.”

Just down the street, a second developer demolished a derelict 1920s era gas station on an 18,250 square foot lot, and built a six story, 68 unit apartment building, called “The Russell.”

Both of these projects added more housing in a high-demand area.  It’s worth comparing their cost and their impacts on the supply and affordability of housing in the neighborhood.

The fourplex is a great example of “missing middle” development, and relatively low-impact in-fill, while the six-story apartment building is an exemplar of the kind of development that neighborhood associations love to hate–and indeed, neighbors protested strongly against this building when it was proposed. (As a sidelight, because the site was zoned for apartments, under Oregon’s land use laws, the developer was able to proceed with the project essentially by-right, in spite of neighbors’ objections.)

Apartments are more affordable due to lower land costs per unit

Both lots sold for almost the same price prior to development in 2014, according to property tax records: one sold for $35 per square foot, the other for $36. This has a straightforward implication for affordability.  Households who want to live in the fourplex have to buy about ten times as much land to accommodate their housing unit as households who want to live in the apartments. Each row house needs about 3,125 square feet of land (at $36 per square foot, about $113,000 of land), while each apartment needs just 268 square feet (at $35 per square foot, about $9,400 of land per unit).

From the standpoint of housing supply, there’s about a ten-fold difference between the two projects.  The fourplex provides four housing units (a net gain of three over the previous small, single family home on the site).  The six-story apartment building provides 68 housing units on a lot about 50 percent larger. (All of these were a net gain, as the site had no housing previously.) On a per square foot basis, the apartments provide about ten times as many housing units for the neighborhood.  That means the apartments soak up ten times as much demand, which would otherwise lead to households bidding up the price of housing in the neighborhood and elsewhere in the city. To accomplish the same increment to housing supply, you’d need to build about twenty fourplexes in place of existing single family homes, as each fourplex yields a net gain of three housing units.
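The land and supply arithmetic in the two paragraphs above can be checked directly. This quick sketch uses the lot sizes, unit counts, and per-square-foot sale prices reported here; assigning the $36 sale to the row house lot and the $35 sale to the apartment lot is an assumption, chosen because it matches the per-unit dollar figures:

```python
# Land cost per housing unit for the two Portland projects described above.
# Lot sizes and unit counts are from the text; which lot sold at $35 vs $36
# per square foot is an assumption consistent with the per-unit figures.

rowhouse_lot, rowhouse_units, rowhouse_price = 12_500, 4, 36    # sq ft, units, $/sq ft
apartment_lot, apartment_units, apartment_price = 18_250, 68, 35

rowhouse_land = rowhouse_lot / rowhouse_units    # land per unit, sq ft
apartment_land = apartment_lot / apartment_units

print(round(rowhouse_land), round(rowhouse_land * rowhouse_price))     # sq ft, $ per unit
print(round(apartment_land), round(apartment_land * apartment_price))  # sq ft, $ per unit

# Housing units per square foot of land: roughly an order-of-magnitude gap
density_ratio = (apartment_units / apartment_lot) / (rowhouse_units / rowhouse_lot)
print(round(density_ratio, 1))
```

The apartment building delivers more than eleven times as many homes per square foot of land, which is why its per-unit land cost is roughly $9,400 rather than roughly $113,000.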

Apartments mean less widespread disruption and demolition

The space efficiency of apartments is actually a boon if you’re looking to avoid widespread neighborhood disruption and minimize demolitions of existing homes.  In contrast to a fourplex-for-single-family infill process, every “Russell” that we build avoids about 20 demolitions of single family homes. The “Russell” has a bigger impact on adjacent properties than a single row house project, but you’d need 20 more row house projects, somewhere, to provide as much housing as the Russell.

The great thing about apartments is you don’t need a lot of land to build much, much more housing. There are plenty of gas stations, parking lots, under-utilized strip malls and similar properties to accommodate thousands of apartments in every city in the US–if the zoning and development approval processes allow it.

There are significant opportunity costs to underbuilding density in prime urban locations.  Both the Russell and the Albina row houses will likely still be standing in fifty or a hundred years. It probably won’t make economic sense to demolish the row houses and build apartments on that site any time soon.  So committing that site to a fourplex, rather than a Russell-sized building, forecloses that possibility for many decades–and means that either apartments (or many more fourplexes) will have to be built on less desirable sites in the meantime.

There’s a good argument to be made that we’re underbuilding density in many locations where it makes sense.  Neighborhoods with great transit, a mix of commercial uses, and high levels of walkability may justify more than just a duplex or triplex on a particular site.  We ought to be thinking about the long term–what demand for the neighborhood is likely to be in 2050 or even 2075–rather than in terms of the 20-year time horizon of most housing affordability analyses.

Given that the average lifetime of a building is 50 or 100 years, building a duplex now may foreclose building 12 or 20 or 50 units on the same site for several decades. And cities and neighborhoods don’t tend to grow by incremental changes to the density of buildings, i.e., with single family buildings in one decade giving way to duplexes in the next decade, fourplexes in the following decade, garden apartments in the next, and mid-rise apartments later.  It’s far more economical to replace one or two single family homes with several dozen apartments, and leave the rest of the neighborhood untouched. Not to mention more politically palatable.

It’s helpful to remember that most of the neighborhoods that exhibit a mix of housing types, including duplexes, triplexes and fourplexes, were originally built that way, because early zoning codes (especially prior to World War II) didn’t prescribe only single family buildings. It’s rare to see a neighborhood completely retrofitted with just slightly higher density. Allowing duplexes, triplexes and fourplexes is the sort of thing that is more likely to promote affordability and income diversity in peripheral, suburban, or greenfield locations than in built-up urban neighborhoods.

None of this should discourage us from celebrating the important political, rhetorical and practical victory of ending the long established bans on duplexes, triplexes and fourplexes in most residential areas. That’s a huge step forward. But dealing with housing supply and affordability will inevitably require a strategy to make it easier to build apartments in more places. We’re going to need a bigger boat.

You’re going to need a bigger boat

Eliminating exclusively single-family zones won’t provide enough density: Recognizing the limits of “missing middle” as a solution to urban affordability 

At City Observatory, we’re as excited as anyone that there seems to be a growing groundswell of support around the nation for eliminating exclusively single-family zoning.  Minneapolis has garnered national headlines by legalizing duplexes and triplexes in zones once reserved for detached housing. Oregon is considering a measure that would ban exclusive single-family zoning statewide in cities of more than 10,000 people.

These measures are an important step forward to facilitating some of the “missing middle” housing types that provide a range of affordable housing and greater income diversity, in what otherwise might be exclusive and expensive neighborhoods.  This is an important symbolic and rhetorical advance for the YIMBY movement, and a key breakthrough in changing the way we talk about neighborhoods and housing:  single family zoning is no longer sacrosanct and politically untouchable.

But, it must be said, this is only a first step.

Legalizing accessory dwelling units, duplexes, triplexes and fourplexes does hold the promise of adding to affordability while injecting some very “gentle density” into single family neighborhoods. But it falls well short of the scale of our housing affordability problems. For that, we need to build larger multi-family buildings, including apartments.

In the words of Sheriff Brody (Roy Scheider), “You’re going to need a bigger boat.”

Row houses vs. apartments

Consider two buildings, both built on similar-sized lots, just three blocks apart on Northeast Seventh Avenue in Portland. Both were completed in the last two years.  Both are on relatively large corner lots that were purchased by developers in 2014.  At the corner of Seventh Avenue and Thompson Street, a developer tore down a small existing home on a 12,500 square foot lot and built four adjoining row houses called the “New Albina Rowhouses.”

Just down the street, a second developer demolished a derelict 1920s era gas station on an 18,250 square foot lot, and built a six story, 68 unit apartment building, called “The Russell.”

Both of these projects added more housing in a high-demand area.  It’s worth comparing their cost and their impacts on the supply and affordability of housing in the neighborhood.

The fourplex is a great example of “missing middle” development, and relatively low-impact infill, while the six-story apartment building is an exemplar of the kind of development that neighborhood associations love to hate; indeed, neighbors protested strongly against this building when it was proposed. (As a sidelight: because the site was zoned for apartments, under Oregon’s land use laws the developer was able to proceed with the project essentially by right, in spite of neighbors’ objections.)

Apartments are more affordable due to lower land costs per unit

Both lots sold for almost the same price prior to development in 2014, according to property tax records: one sold for $35 per square foot, the other for $36. This has a straightforward implication for affordability.  Households who want to live in the fourplex have to buy about ten times as much land to accommodate their housing unit as households who want to live in the apartments. Each row house needs about 3,125 square feet of land (at $36 per square foot, about $113,000 of land), while each apartment needs just 268 square feet of land (or about $9,400 of land per unit).
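As a back-of-the-envelope check, the per-unit land figures follow directly from the lot sizes and per-square-foot prices (a sketch; we assume the row house lot is the $36-per-foot sale and the apartment lot the $35-per-foot sale):

```python
# Land per housing unit, and land cost per unit, for the two projects.
rowhouse_lot_sqft = 12_500
rowhouse_units = 4
rowhouse_price_sqft = 36   # dollars per square foot, per tax records

apartment_lot_sqft = 18_250
apartment_units = 68
apartment_price_sqft = 35

rowhouse_land_per_unit = rowhouse_lot_sqft / rowhouse_units            # 3,125 sq ft
apartment_land_per_unit = apartment_lot_sqft / apartment_units         # ~268 sq ft

rowhouse_land_cost = rowhouse_land_per_unit * rowhouse_price_sqft      # ~$113,000
apartment_land_cost = apartment_land_per_unit * apartment_price_sqft   # ~$9,400
```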

From the standpoint of housing supply, there’s about a ten-fold difference between the two projects.  The fourplex provides four housing units (a net gain of three from the previous small, single family home on the site).  The six-story apartment building provides 68 housing units on a lot about 50 percent larger. (All of these were a net gain, as the site had no housing previously.) On a per square foot basis, the apartments provide about ten times as many housing units for the neighborhood.  That means the apartments soak up ten times as much demand, which would otherwise lead to households bidding up the price of housing in the neighborhood and elsewhere in the city. To accomplish the same increment to housing supply, you’d need to build more than twenty fourplexes in place of existing single family homes, as each fourplex yields a net gain of just three housing units.
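The supply arithmetic can be sketched in a few lines, using the unit counts and lot sizes above:

```python
# Housing yield per square foot of land for each project, and how many
# fourplex redevelopments it would take to match one 68-unit building.
fourplex_units_per_sqft = 4 / 12_500      # fourplex on the 12,500 sq ft lot
apartment_units_per_sqft = 68 / 18_250    # apartments on the 18,250 sq ft lot

density_ratio = apartment_units_per_sqft / fourplex_units_per_sqft  # roughly 11-12x

# Each fourplex replaces one single family home: a net gain of 3 units.
net_gain_per_fourplex = 3
fourplexes_needed = 68 / net_gain_per_fourplex  # more than 20 redevelopments
```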

Apartments mean less widespread disruption and demolition

The space efficiency of apartments is actually a boon if you’re looking to avoid widespread neighborhood disruption and minimize demolitions of existing homes.  In contrast to a fourplex-for-single-family infill process, every “Russell” that we build avoids about 20 demolitions of single family homes. The “Russell” has a bigger impact on adjacent properties than a single row house project, but you’d need 20 more row house projects, somewhere, to provide as much housing as the Russell.

The great thing about apartments is you don’t need a lot of land to build much, much more housing. There are plenty of gas stations, parking lots, under-utilized strip malls and similar properties to accommodate thousands of apartments in every city in the US–if the zoning and development approval processes allow it.

There are significant opportunity costs to underbuilding density in prime urban locations.  Both the Russell and the Albina row houses will likely still be standing in 50 or 100 years. It probably won’t make economic sense to demolish the row houses and build apartments on that site any time soon.  So committing that site to a fourplex, rather than a Russell-sized building, forecloses that possibility for many decades–and means that either apartments (or many more fourplexes) will have to be built on less desirable sites in the meantime.

There’s a good argument to be made that we’re underbuilding density in many locations where it makes sense.  Neighborhoods with great transit, a mix of commercial uses, and high levels of walkability may justify more than just a duplex or triplex on a particular site.  We ought to think about the long term–what the demand for the neighborhood is likely to be in 2050 or even 2075–rather than in terms of the 20-year time horizon of most housing affordability analyses.

Given that the average lifetime of a building is 50 or 100 years, building a duplex now may foreclose building 12 or 20 or 50 units on the same site for several decades. And cities and neighborhoods don’t tend to grow by incremental changes to the density of buildings–with single family buildings in one decade giving way to duplexes in the next decade, fourplexes in the following decade, garden apartments in the next, and mid-rise apartments later.  It’s far more economical to replace one or two single family homes with several dozen apartments, and leave the rest of the neighborhood untouched. Not to mention more politically palatable.

It’s helpful to remember that most of the neighborhoods that exhibit a mix of housing types, including duplexes, triplexes and fourplexes, were originally built that way, because early zoning codes (especially prior to World War II) didn’t prescribe only single family buildings. It’s rare to see a neighborhood completely retrofitted with just slightly higher density. Allowing duplexes, triplexes and fourplexes is more likely to promote affordability and income mixing in peripheral, suburban or greenfield locations than in built-up urban neighborhoods.

None of this should discourage us from celebrating the important political, rhetorical and practical victory of ending the long-established bans on duplexes, triplexes and fourplexes in most residential areas. That’s a huge step forward. But dealing with housing supply and affordability will inevitably require a strategy to make it easier to build apartments in more places. We’re going to need a bigger boat.

No deal: Why a CRC revival is going nowhere

A revived Columbia River Crossing will never happen:  the two sides have incompatible aims

There are continued rumblings in the Portland-Vancouver metropolitan area about reviving the abandoned plan to spend $3 billion or more on a grand Columbia River Crossing to replace the existing 6-lane Interstate 5 freeway bridge with a 12-lane structure and a short light rail extension.  The project foundered almost five years ago over disagreements between Oregon and Washington about who would pay for the bridge.

The ill-conceived Columbia River Crossing is still dead, and will stay that way. (Graphic: BikePortland.org)

The bridge is the site of daily traffic jams, as thousands of people who work and shop in Oregon but live north of the Columbia River all crowd onto one of the two interstate freeway bridges that connect the two states. The sentiment in Washington State is that the capacity of the freeway should be expanded; Oregonians, who have built a 60-mile light rail system, have argued that any new bridge should connect Vancouver, Washington to that system. Oregonians, it must be said, are skeptical of the value of new freeways; many Washington residents are vehemently opposed to light rail (regularly labeled “crime rail”), and also aren’t interested in paying tolls to cross the river.

The original project was a compromise between the two states that on paper gave both sides some of what they wanted:  a light rail extension for Oregon, a doubling of freeway lanes for Washington.  No one ever bothered to ask why one would massively increase freeway capacity in a corridor in which one was also making a big investment in light rail: building one tends to undercut the rationale for spending money on the other. The project, led by the Oregon and Washington Departments of Transportation, bumbled and bungled through a planning process that cost nearly $200 million and took seven years.  Just one example: the project suffered a year of delay because the bridge had to be redesigned for a higher navigation clearance, something highway engineers had been told about for years, but had ignored. Ultimately the two state DOTs produced a project that had serious financial holes, and which would, as proposed, have been a major transportation disaster:  the project’s own studies showed that tolling the new bridge while leaving a parallel existing bridge untolled would produce gridlock while leaving the new bridge vastly under-used.

In the past couple of months, a few local groups have been agitating to revive the project in some form.  A special committee of the Washington Legislature held a meeting across the river in Portland in December to discuss the project. A number of Oregon legislators–all veterans of a difficult and ultimately unsuccessful attempt to build the bridge–were their guests. The upshot of the meeting:  no one is satisfied with the status quo, but there’s an utter dearth of agreement on what the two states might do.

Into that vacuum has stepped Washington Governor Jay Inslee, who broadly endorsed a possible revival of the project in his newly proposed budget. It sets aside $17.5 million (a pittance, actually) for some planning. But Inslee–who is burnishing his environmental credentials as part of a rumored Presidential run–is insisting that the project include light rail.

That’s a non-starter for a big share of the population in Vancouver, Washington, and especially the area’s Republican lawmakers, who immediately told Inslee that, as it was five years ago, the inclusion of light rail is a deal-killer today:  those who don’t learn from history are doomed to repeat it, they warned.

The city’s newspaper of record, The Columbian, which is generally favorable toward the idea of light rail, still sees it as some kind of one-sided gift to Portlanders and Oregonians. From Vancouver’s perspective, the paper argues, Washington should support light rail only if Oregon offers a generous set of expensive highway concessions to Washington:

If Oregon representatives insist on light rail, perhaps Washington negotiators can strike a deal. Bring light rail into Clark County in exchange for firm deals on a third and fourth Columbia River crossing, for vast improvements to the Rose Quarter corridor through Portland, and for an agreement that Oregon will drop plans for tolls on Interstate 5 and Interstate 205 near the state line.

Let’s tote that up for those of you who haven’t seen the price tags for all these ideas:  the “third and fourth” bridges across the Columbia would require expenditures of something on the order of $500 million each (without allowing for any actual roads to serve such bridges).  There are already $500 million in lane widenings proposed for the Rose Quarter freeways; apparently the Columbian wants even more. And tolling was the essential financial cornerstone of the original CRC project–and that was when the two states expected billions in support from a federal government that shows no signs of providing such funding for the project today.  With inflation and cost overruns, this probably works out to $5-6 billion in projects for which neither state currently has any budgeted resources. In short, the editors of the Columbian are delusional if they think that Oregon’s desire to see light rail go to Washington extends not just to taxing its own citizens to pay for this project, but also to subsidizing two more bridges, widening Oregon freeways, and giving Clark County residents a pass on paying any of the cost of these projects via tolls.

If anything, in the wake of the failure of the original CRC proposal, the two sides have become even more locked in to their mutually incompatible positions. There’s essentially zero basis for agreement between the two states on the proposal.  The original project was a tenuous and unstable compromise that tried to paper over fundamental disagreements. As it turns out, the real gridlock between Oregon and Washington isn’t on the I-5 freeway; it’s the irreconcilable differences within the region about the future of transportation. As a result, the revived Columbia River Crossing is going . . . nowhere.

Editor’s note:  For seven years, Joe Cortright worked as an advocate with community groups opposing the Columbia River Crossing.  For some of that time, he was compensated for his professional services.

Ten things more inequitable than road pricing

Don’t decry congestion pricing as inequitable until after you fix, or at least acknowledge, these ten other things that are even more inequitable about the way we pay for transportation.

There’s a growing interest in using congestion pricing to help tackle traffic issues in major cities. Putting a price on peak hour road capacity is the only thing that’s been shown to effectively reduce congestion, based on experience in London, Stockholm, Singapore and other cities, and with high occupancy toll (HOT) lanes in a growing number of US cities.  But proposals to put a price on something that’s widely–if inaccurately–perceived to be “free” invite all manner of arguments from those who might have to pay. And a favorite argument is that road pricing is somehow punitive to the poor, and inequitable.

Any time we charge a positive price for anything, paying that price is a heavier burden on the poor than on the rich. It takes a special combination of myopia and tunnel vision to look at the prospect of congestion pricing as anything other than a minor blip on a system of transportation finance that is systematically unfair to the poor and those who don’t own (or can’t afford) a car.

Here is our list of ten things that are more inequitable than road pricing.

1. Flat vehicle registration fees. Many states charge the same amount to register a used economy car as they charge to register a new full-sized SUV.  As we demonstrated in our commentary, the Suburban and the Subaru, whether measured against miles driven (a measure of value received from public roads), income or vehicle value (a measure of ability to pay), or weight (a measure of damage done to the roadway), a flat fee is simply unfair to lower income families.  On a per mile basis, the owner of a ten year-old Subaru can easily end up paying registration fees three times higher than the owner of a new Suburban.
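The regressivity of a flat fee is easy to illustrate. This sketch uses a hypothetical $120 annual fee and illustrative annual mileages–neither figure comes from the commentary cited above:

```python
# Why a flat registration fee is regressive per mile driven:
# the fee is identical, but annual mileage typically is not.
flat_fee = 120            # dollars per year, same for every vehicle (assumed)

suburban_miles = 15_000   # new full-sized SUV, driven heavily (assumed)
subaru_miles = 5_000      # ten-year-old economy car, driven lightly (assumed)

suburban_fee_per_mile = flat_fee / suburban_miles  # $0.008 per mile
subaru_fee_per_mile = flat_fee / subaru_miles      # $0.024 per mile: 3x higher
```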

2. Not pricing roads, which results in slower bus speeds. As we pointed out last year, those who depend on public buses are the victims of congested roadways.  Buses travel more slowly, are a less attractive alternative to car travel, and are less efficient (each bus and driver carry fewer passengers per hour or day). Not pricing roads makes bus travel worse for those who are dependent on it.

Not pricing roads means buses travel more slowly, which is unfair to low income households

3. The storm sewer subsidy. Some of the most expensive infrastructure out there is the massive stormwater systems cities are building to deal with runoff during storms. Impervious surfaces like roadways account for up to half of urban stormwater, and much, if not most, of the toxic material in stormwater comes from cars (leaking oil, tire residue, brake material, precipitated air pollutants). But roads (and therefore cars and car users) generally contribute nothing to the cost of collecting or treating stormwater:  the entire cost is usually added to city sewer and water bills.  The result is that city tax- and ratepayers pay the cost of dealing with pollution that comes from those who drive, many of whom are non-residents. It’s a huge burden for economically distressed cities, like Akron, which is spending over a billion dollars for giant new sewers to eliminate storm runoff. Akron city residents tend to be poorer and have low rates of car ownership, so they will pay for storm sewers; suburbanites who commute into Akron, use the roads and surface parking lots that create the runoff, and have higher incomes won’t pay.  It’s inequitable.

4. Insurance rates. Virtually all states require motorists to purchase liability insurance as a condition of owning or operating a motor vehicle.  While insurance is privately provided, the fact that it is legally mandated makes it much like a tax. And insurance rates are not discounted for the poor.  If anything, there is abundant evidence that both the poor and urban residents pay more for car insurance.

5. Gasoline and gas taxes. Nearly all vehicles are fueled by gasoline. Gas taxes, the principal user fee for roads, are not pro-rated by income. Low income households pay the same per gallon tax as high income households.  Gas taxes, as a result, tend to be much more regressive than other forms of taxation. That’s just as inequitable, on its face, as congestion pricing, yet we’ve never seen a serious argument that we ought to discount the price of gasoline for poor households.

6. Tax credits for the purchase of new electric vehicles.  The federal government and many states offer tax credits for the purchase of electric vehicles.  Poor households both have much lower rates of car ownership and are far less likely to purchase new vehicles; most can only afford used vehicles, for which tax credits are not available.  Giving $7,500 tax credits to households who are rich enough to afford a new Tesla (MSRP: $46,000) isn’t equitable. (Some 200,000 Tesla owners have already gotten the credit, which will now be dialed back to $3,750 per car.) It’s doubtful that any poor families qualified.

7.  Paid parking.  In many locations, particularly dense city centers, you practically can’t use a vehicle for transportation unless you are willing to pay for its parking space either at a metered space on the street, or at an off-street lot or parking structure. While cities do provide free parking for disabled citizens (a perk that is frequently abused), parking meters don’t charge different rates to users based on their income; you have to pay the same amount to park your used Jetta as you do your new Mercedes. Again:  the cost of parking bears more heavily on the poor than on the rich, both as a share of income, and in relation to the value of their vehicles. Plus, we haven’t even said anything about the provisions of the tax code that subsidize parking, chiefly for high income workers. That’s inequitable.

Is high priced parking fair to the poor?

8.  “Free” parking. As Donald Shoup has demonstrated time and again, there’s nothing free about free parking.  The effective requirement that people have to build new parking as a condition of getting a building permit for a store, office, home or apartment, drives up the cost of new construction and housing. In addition, those who don’t own cars, who walk, cycle and ride buses, end up subsidizing those who get the free parking. One study estimates that carless renters pay almost half a billion dollars a year for garage parking that’s bundled in their rent, but which they can’t use, because they don’t own cars. As we wrote in our commentary on the triumph of parking socialism, the law in its majesty provides free parking to everyone, whether they own a vehicle or not. As a practical matter, “free” parking, like free roads, benefits those with higher incomes who can afford and who use cars extensively.

And parking can be so cheap it undercuts transit, and makes cities less walkable.

9. The property tax exemption for cars. Unlike, say, houses and other forms of real property, cars are seldom charged property taxes. For example, Oregon completely exempts cars from state and local property taxes, a provision that costs local governments $989 million per biennium in revenue. And naturally, the exemption is a benefit only for those who own cars, and disproportionately rewards those who own expensive newer cars.  If we extended the property tax to cars–with, say, an exemption on the first $10,000 of value, so that someone driving a ten-year-old clunker would pay nothing–the system would be much more equitable.
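A minimal sketch of how such an exemption would work; the 1 percent rate is purely an illustrative assumption, not a proposed figure:

```python
# Tax only the vehicle value above a $10,000 exemption threshold.
def vehicle_property_tax(value, exemption=10_000, rate=0.01):
    """Annual property tax owed on a vehicle of the given market value."""
    return max(0, value - exemption) * rate

clunker_tax = vehicle_property_tax(8_000)    # below the exemption: pays nothing
luxury_tax = vehicle_property_tax(60_000)    # pays only on value above $10,000
```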

10. Unfair taxation of greener, safer, less congesting modes of transportation. Consider the fiscal conditions imposed on scooter operators as part of Portland’s experiment with fleets of shared electric scooters last year. The city required the scooter companies to pay $1 per scooter per day to cover the cost of streets. As we noted at City Observatory, that’s vastly more than the amount charged to cars, considering that cars take up dramatically more road space, cause more congestion and air pollution, and damage roads more. If cars were charged proportionately to scooters relative to their weight or value, cars should be paying $10 or $20 per day to drive in the city. Again:  a transportation finance system that’s not equitable.

Peak hour congestion pricing actually tends to affect higher income households more, because they are the ones who commute by car at the peak hour.  As we documented at City Observatory, peak hour car commuters in Portland earn roughly twice as much, on average, as those who commute by bus or who travel at off-peak hours. Similarly, unlike flat tolls, congestion pricing can have low or even zero charges during off-peak hours, creating a low-cost or free alternative for those with limited means and flexibility in travel schedules.

There are plenty of things we can do to ameliorate any of the perceived negative effects of congestion pricing. First, as we noted in number two (above), road pricing actually benefits the poor and transit-dependent by speeding buses.  But beyond that, there are good reasons to believe that we could rebate some of the funds from congestion pricing to offset the negative effect on low income households.  In addition, we can spend congestion pricing revenue on transit and other alternative forms of transportation.

Playing the “equity” card as an objection to pricing the roads actually turns out to be a way to advance the interests not of the poor, but of those who benefit already from the wealth of subsidies to car ownership. We seem to be perfectly fine with all kinds of inequity in our transportation finance system, so long as it benefits wealthier car owners.

If we’re going to talk about equity, let’s not apply it to one isolated part of the transportation system. Instead, let’s ask what it takes to create an overall system that is fair to all, considering all aspects of how the system is paid for, who benefits, and who bears the external costs (of things like crashes, air pollution and runoff). If we do, congestion pricing can be at the heart of a system that is both more efficient and fair.

Note: This commentary has been revised to correct a typo in the headline, h/t to @stevenspinello.


How tax evasion fuels traffic congestion in Portland

Tax free shopping in Oregon saves the typical Southwest Washington household $1,000 per year

Cross border shopping accounts for 10-20 percent of all trips across the I-5 and I-205 bridges

Tax avoidance means we’re essentially paying people to drive and create traffic congestion

Those who live in “the ‘Couv”–Vancouver, Washington–often like to poke at their larger neighbor on the south side of the Columbia River; in the heyday of “Portlandia,” for example, Vancouver wags produced their own video in reply: “The dream of the suburbs is alive in Vancouver.”  A favorite political slur is to describe someone in favor of light rail or higher density as supporting “Portland creep.” But much as they complain, Vancouverites actually really, really like their neighbor to the south, because it facilitates what must be the city’s favorite sport: tax evasion, specifically sales tax evasion.

Washington has a sales tax, Oregon doesn’t.  In fact, no two adjacent states have more starkly different tax systems than Oregon and Washington.  Washington is just one of seven states with no personal income tax, and consequently has a very high state sales tax rate (over 8 percent in most places). Conversely, Oregon is just one of five states with no general retail sales tax, and has one of the nation’s highest personal income tax rates.  (Business taxes are also different: Washington taxes gross business receipts; Oregon taxes net business income.)

A bit of geography:  Clark County Washington sits just across the Columbia River from Portland, Oregon.  The county is effectively a suburb of Portland, and has a population of about 475,000. Two Interstate freeways connect Portland to Vancouver.

Several of the region’s major shopping centers are located within a mile of the state border.  To the west, along Interstate 5, are the Jantzen Beach and Hayden Island centers; to the east, near I-205 are Cascade Station and a series of big boxes along Airport Way.  You’ll find in these areas, for example, a Lowes, two Home Depots, two Targets, two Staples, a Dick’s Sporting Goods, two Best Buys, a Walmart, a Costco, the region’s only Ikea, as well as a host of others: Petco, TJ Maxx, Ross Dress for Less, Pier One Imports.  Most of the merchandise in these stores would be taxable to Washington buyers, if they purchased it in Washington.

Estimating Clark County sales tax evasion

How much money does all this cross-border shopping save Clark County residents?  Our estimate is about $120 million annually. We used data from the Washington Department of Revenue and from the U.S. Bureau of Economic Analysis to develop this estimate.

The key to our analysis is the relationship between total personal income (the income of all the households living in an area) and taxable retail sales (the amount of spending subject to the retail sales tax). Washington State reports retail sales tax collections by county and for the state as a whole, while the U.S. Department of Commerce’s Bureau of Economic Analysis has comprehensive and comparable estimates of personal income earned by households in every county.

For Washington State as a whole, in 2017, taxable retail sales were $155.6 billion and total personal income was about $428.8 billion, meaning taxable sales were about 36.3 percent of personal income.  But in Clark County, taxable retail sales were just 30.3 percent of personal income–or about one-sixth lower per dollar of personal income than the statewide average. Clark County’s taxable retail sales were $7.2 billion and personal income was $23.8 billion.

The difference is a good estimate of how much sales tax Clark County residents avoid on an annual basis. The shortfall in taxable sales in Clark County, compared to the rest of the state, is equal to a little more than six percent of personal income, or about $1.5 billion annually.  At Clark County’s roughly 8.4 percent sales tax rate, that works out to tax avoidance of about $120 million annually.  Clark County’s population is about 475,000, which means per capita tax avoidance is roughly $250 per person per year.  A Clark County family of four on average saves roughly $1,000 per year in Washington sales taxes by shopping in Oregon.
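The estimate can be reproduced in a few lines from the figures above (a sketch, applying the roughly 8.4 percent combined sales tax rate levied in Clark County):

```python
# Reproduce the tax-avoidance estimate from the state and county figures.
state_taxable_sales = 155.6e9
state_personal_income = 428.8e9
clark_taxable_sales = 7.2e9
clark_personal_income = 23.8e9

state_ratio = state_taxable_sales / state_personal_income   # ~36.3% of income
clark_ratio = clark_taxable_sales / clark_personal_income   # ~30.3% of income

# Sales "missing" in Clark County relative to the statewide pattern.
shortfall = (state_ratio - clark_ratio) * clark_personal_income  # ~$1.4-1.5 billion

tax_avoided = shortfall * 0.084          # ~$120 million at an 8.4% rate
per_capita = tax_avoided / 475_000       # roughly $250 per person per year
```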

So while they may complain about Portland, they’re certainly well-compensated for the psychic strain that comes from living next to a city they apparently don’t like very much.

Technically, like most states’, Washington’s retail levy is a “sales and use tax,” meaning that residents are liable to pay the tax on goods regardless of where they are purchased. But as a practical matter, for most consumer goods, the law isn’t enforced. The state even seems to have some trouble getting Washington residents to pay sales tax on automobiles:  the Washington State Patrol has a special unit that surveils local areas to spot automobiles with Oregon license plates that show up frequently.

To be fair, the border isn’t all gravy for Washingtonians.  Those who live in Clark County but commute to jobs in Oregon have to pay–like Oregonians–income taxes on the income they earn in the state.  In 2016, Clark County residents earned about $3.3 billion in taxable income in Oregon and paid a total of about $200 million in Oregon income taxes.  About one-third of all Clark County households pay some Oregon income tax, which means that being on the border is an unalloyed good for about two-thirds of all Southwest Washington households.

How tax arbitrage creates traffic congestion

Yesterday, we told the story of how residents of Vancouver, Washington save $120 million annually, about $1,000 per household, by shopping in Oregon (which has no sales tax).  This loss is a drain on the State of Washington’s public finances, to be sure, but it also has another, little noticed impact:  it’s a major contributor to traffic congestion in the Portland metropolitan area.

Nearly all of those tax-avoiding shopping trips, we can be sure, are made by private automobile.  And all the traffic across the state border (the Columbia River) travels on two Interstate highway bridges (I-5 and I-205).

Most of our discussions of transportation focus, appropriately, on commuting trips–the weekday travel from home to work and back. Commuting is the single largest category of travel, and the biggest contributor to peak hour travel (with most work trips occurring in the early morning and late afternoon, giving rise to the double-humped nature of traffic congestion).

But according to the National Household Travel Survey conducted by the US Department of Transportation, shopping trips and related errands are actually the most numerous kind of automobile trips. In 2017, the survey estimated that the average American household took 580 shopping trips per year, compared to 546 work commute trips per year (Table 5c).  The survey also notes that shopping and errand vehicle trips are as numerous at the afternoon peak (5pm) as are work trips (Figure 15).

What this means is that a significant fraction of the travel on our roadways at the peak hour is not inflexible work trips, but instead shopping trips–the kind that can more easily be re-scheduled (your boss insists that you be at work at certain hours; stores are open at a wide range of hours for customer convenience).

Sales tax avoidance is a major motive for Washington households to shop in Oregon.  In general, they can only do so by driving, and by driving on the two Interstate bridges across the Columbia River.  That means that a not insignificant portion of the automobile traffic across the river, including at the peak hour, is fueled by tax avoidance.

Earlier, we estimated that Clark County residents save about $120 million per year in sales taxes by shopping in Oregon.  At the roughly 8.4 percent sales tax rate levied in Clark County, that works out to total retail sales of about $1.5 billion per year.  It’s hard to know exactly how much households spend on each shopping trip.  It’s likely that tax avoidance trips are for larger ticket items (clothes, appliances, electronics) where the sales tax savings would offset the added time and expense of driving to Oregon, compared to shopping in Washington. (Most groceries are exempt from Washington sales tax, so it’s unlikely that routine food shopping trips would cross the river).

How many shopping trips to Oregon would be required to spend $1.5 billion on items that would be subject to retail sales tax in Washington?

As a rough basis for estimation, we’ll assume that the average shopping trip in Oregon results in spending somewhere between $125 and $250 per trip (saving the Washington shopper between $10 and $20 per trip).  At that rate, $1.5 billion in spending works out to between 6 and 12 million shopping trips per year.  To put that figure in context, there are about 160,000 households in Clark County, which works out to between 38 and 76 shopping trips per household per year, or about 3 to 6 Oregon shopping trips per month.
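The arithmetic behind this estimate is simple enough to sketch in a few lines of Python. The per-trip spending figures are the assumptions stated above, not measured values:

```python
# Back-of-envelope estimate of Oregon shopping trips implied by Clark
# County's sales tax savings. All inputs are the article's assumptions.

tax_savings = 120e6        # annual sales tax avoided ($)
tax_rate = 0.084           # approximate Clark County sales tax rate
households = 160_000       # approximate Clark County households

# Implied untaxed retail spending: ~$1.43 billion, rounded to $1.5B
implied_sales = tax_savings / tax_rate

for spend_per_trip in (250, 125):          # assumed $ spent per trip
    trips = 1.5e9 / spend_per_trip
    per_household = trips / households
    print(f"${spend_per_trip}/trip: {trips/1e6:.0f}M trips/yr, "
          f"{per_household:.0f} trips/household/yr")
```

Running the sketch reproduces the range in the text: 6 to 12 million trips per year, or roughly 38 to 76 per household.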

All those shopping trips make a significant contribution to automobile traffic across the two Columbia River interstate bridges.  On a daily basis, 6 to 12 million annual shopping trips work out to between 16,600 and 33,200 trips per day. Each trip represents two crossings of the Columbia River (one going south from Washington to Oregon, and a second returning to Washington). This suggests that sales tax avoidance generates between 33,000 and 66,000 daily crossings of the two Columbia River bridges. The two bridges average about 300,000 trips per day (about 135,000 on the I-5 bridge and 165,000 on the I-205 bridge), which means that tax avoidance shopping accounts for roughly 10 to 20 percent of the total trips across the Columbia River.
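The step from annual trips to a share of bridge traffic can be sketched the same way (the bridge volumes are the figures quoted above):

```python
# Convert annual Oregon shopping trips into daily river crossings and a
# share of total bridge traffic. Bridge volumes are the article's figures.

bridge_volume = 300_000        # avg daily trips, I-5 + I-205 combined

for annual_trips in (6e6, 12e6):
    daily_trips = annual_trips / 365   # shopping trips per day
    crossings = daily_trips * 2        # each trip crosses the river twice
    share = crossings / bridge_volume
    print(f"{annual_trips/1e6:.0f}M trips/yr: "
          f"{crossings:,.0f} crossings/day, {share:.0%} of bridge traffic")
```

The low and high cases come out at roughly 33,000 and 66,000 daily crossings, or about 11 and 22 percent of the 300,000 daily bridge trips.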

A survey of parking lots in North and Northeast Portland

In December 2018, on a weekday afternoon, we did a windshield survey of parking lots in major retail shopping centers in north and northeast Portland, just south of the Columbia River.  We looked at parking lots in Hayden Meadows and Jantzen Beach (adjacent to I-5), and at Cascade Station and Airport Way (adjacent to I-205).  We counted the number of cars in parking lots at major stores and noted what fraction of the vehicles had Washington license plates. Here’s a summary of what we found. The share of Washington vehicles in these parking lots ranged from about 20 percent to 70 percent, and varied according to the type of store. Best Buy, which specializes in televisions, computers, and electronics, had almost 70 percent Washington customers.  Michael’s, a hobby and craft store, had just 20 percent Washington customers.

Home Depot, Jantzen Beach Shopping Center, Oregon

Four of the five cars shown here have Washington tags

Target, Cascade Station, Oregon

Four of the five cars shown here have Washington tags.

Best Buy, Jantzen Beach Center, Oregon

All seven of the vehicles shown here have Washington tags.

These calculations suggest that traffic congestion between Portland and Vancouver is materially affected by tax avoidance. Short of changing one or both of the two states’ tax structures, it may be difficult to remove this incentive. But there is another way. Congestion pricing, particularly variable peak hour tolls, could prompt sales-tax-conscious shoppers to make their Oregon trips at off-peak times. Off-peak shoppers could continue to get their Oregon tax break and also avoid paying a high toll for peak hour travel. The result would be better traffic flow during peak hours for those who had less flexibility in arranging their travel schedules. It’s also worth remembering that it doesn’t take a huge reduction in traffic volumes, particularly at the peak hour, to get traffic to move much more smoothly. Getting shoppers to re-arrange their trips would make a material difference to travel times between the two states.

There’s a coda to our earlier story about the video parody of Portlandia that caricatures Vancouver (“The Dream of Suburbia is alive in Vancouver”). At the end of the video, Melanie (the stand-in for Carrie Brownstein) has made it to the mall parking lot in Vancouver where the rest of the cast is singing. Asked what took her so long she says:  “I got stuck on the I-5 bridge.” Her ersatz Fred Armisen responds: “Yeah, they need to replace that thing with a bigger bridge–and make Portland pay for it.”

The dream of the suburbs is alive in Vancouver (Youtube)


A tool kit for value capture policies

Harnessing the value of public assets to support the civic commons

It’s widely recognized that public assets, like parks, libraries and community centers, generate important and tangible benefits for their neighborhoods. But it’s seldom the case that the value of these benefits is tapped to help generate revenue to enhance and maintain the public assets. A new toolkit developed by U3 for Reinvigorating the Civic Commons outlines programs and tactics that neighborhoods and cities can use to capture value from public assets.

Value capture strategies are based on the observation that well-functioning public facilities often contribute to local economic growth, business revenues and property appreciation.  Much or all of the economic value from these facilities often spills over to or benefits neighbors or adjacent properties, but doesn’t directly provide additional revenue to finance or maintain the facilities themselves.  Value capture mechanisms aim to close this loop by capturing a portion of the value created by a public investment and using it to help support the asset itself.

Value capture can play an important role, especially in rapidly changing neighborhoods. Public resources, like parks, often play a key role in stimulating additional investment and drawing new residents, with the perception that there’s little benefit for existing residents of the neighborhood. Value capture mechanisms, like tax increment financing, can create a pool of resources for maintaining or improving the public realm, or as some cities have done, subsidizing the construction of affordable housing that keeps the neighborhood accessible for households from a range of different incomes.

The toolkit sketches out a range of different value capture mechanisms, some of which have been employed widely around the country, and others, like land value taxation, that are relatively rare.  The report discusses at what scale (project, neighborhood or city) each mechanism is most appropriate, as well as their merits and drawbacks.

U3’s value capture toolkit was developed collaboratively by practitioners and philanthropic, economic and civic asset experts. (Full disclosure:  City Observatory’s Joe Cortright participated in this group.)  A beta version of the value capture toolkit was reviewed at a Civic Commons Studio in Philadelphia earlier this year.

You can’t feel ’em, if you can’t see ’em

We can’t have empathy for those we can’t encounter due to the way our cities are built

Editor’s Note: Last month, our friend Carol Coletta spoke to the Kinder Institute in Houston about the critical role that place plays in building a shared sense of community. We’re pleased to reprint her remarks here.

Monday night I headed out to what I expected to be a short political event, featuring blues artist Keb Mo and saxophonist Kirk Whalum – my idea of the perfect political event. I took a notebook along, thinking I might multitask since I hadn’t yet figured out what I wanted to say here tonight.   But the music was too good, and I found myself forgetting the need to write.

Late in the concert, though, Kirk Whalum said something that made me snap to attention. He was explaining empathy to the audience, and he said:

“You can’t feel ‘em, if you can’t see ‘em.”

And I thought, that’s it. That’s what I want to say to Houston.

Because today, we don’t “see” our neighbors, thus, we cannot “feel” our neighbors. And that is, I believe, at the heart of a developing crisis in our cities and in our nation.

At a moment when we are more connected than ever, how can we be so divided? Americans don’t trust government, we don’t trust the church, we don’t trust the media, we don’t trust each other. And Houstonians, it turns out, are even less trusting than the average American.  We didn’t get here overnight. And there is no single cause.

The decline in the commons

The problem starts close to home. We barely know our neighbors, because we spend less and less time with them. Only 20% of Americans report spending time regularly with neighbors. And a third of us say we spend no time at all with neighbors. (Greater Houston residents do a little better, but it is only small comfort: 20 percent of Houstonians report that they never talk to their neighbors.)

That’s not surprising, since we’ve made casual encounters with neighbors so difficult. Thank God for dogs, right? Because otherwise, many of us would never have an unplanned encounter with a neighbor. We’ve engineered walking out of our way of life. And if you have an unplanned encounter with a neighbor while driving, you are probably both in trouble. As the Houston Chronicle reported, Houston, along with Dallas and Phoenix, has the deadliest roads in the country.

The fact is, car-oriented cities reinforce distance between people. Especially in Houston with your outsize expressways. Not surprisingly, Houstonians are in their cars longer than other Americans. Your average commute distance is one-third greater than in the rest of the country. [12 miles v. 9 miles–one-way daily distance.]

Our increasing demographic diversity challenges our notions of community – who’s in and who’s out – fueled by the fact that our foreign-born population is the highest since 1914. You see this up close, thanks to your glorious diversity.

We know that the internet and social media world amp up our differences and mainstream the fringe – even without foreign intervention – and that results in isolation and greater entrenchment in our beliefs. But the most disturbing trend fueling the decline in trust… fueling our inability to “see” each other and thus our inability to “feel” each other… fueling our loss of empathy… is that our communities have become dramatically more economically segregated in the past twenty years. Almost a third of us now live in neighborhoods where either everyone is rich or everyone is poor.

The number of urban poor people living in neighborhoods where the poverty level is greater than 40% has doubled – doubled! – since 2000.

Far more neighborhoods are declining than are gentrifying. Yet, all the focus – and the fear — seems to be on gentrification. That’s out of whack, given the trends. Why doesn’t the much faster spread of poverty get equal attention?

Perhaps it’s because the flood of college-educated millennials [35% of millennials] into city centers is so visible. With their relative affluence and the expansion of consumption driven by their disposable incomes, college-educated millennials are easy to spot and easy to resent. (They are seen – but in caricature.) And that’s a shame, because our cities need these millennials for renewal. And we need them for their money.

Now, while millennials have repopulated downtowns, middle neighborhoods have disappeared. Cities have become more and more polarized between the well-to-do and the poor and struggling.

The Big Sort is on. It is rare for people of different incomes to live near each other, especially African-Americans of different incomes to live near each other.

This economic polarization has devastating consequences for children growing up in low-income families. Neighborhoods make a difference on generational mobility, and it turns out that the worst place to be poor is in a neighborhood where everyone else is poor. It’s hard, if not impossible, in poor neighborhoods to access opportunity.

And we not only are sorting ourselves by income. We are also sorting ourselves by political belief. Nearly two-thirds (63 percent) of consistent conservatives and about half (49 percent) of consistent liberals say most of their close friends share their political views.

Finally, racial disparities are growing. Since 2000, white/black economic disparities have widened. In 2000, a black family in America made 70% of what a white family made. Today, black families make barely 50% of what white families make. It’s hard not to notice. And if you’re black, it’s probably hard not to be resentful.

Clearly, these forces of division and polarization won’t sort themselves out naturally. The belief among Americans that “most people can be trusted” has plunged from a majority agreeing with that statement in the ‘70s to only one-third today.

Remember: You can’t feel what you can’t see. And what – and who — you can’t see can too easily be defined by others – inflaming divisions and mistrust even further.

Given we are in the last two weeks of a political cycle, examples of that are all around us.

Restoring trust

How then do communities succeed with such deep fault lines? With such deep distrust? When we cannot feel others?

Now, this is not the way we frame the crisis. Heal the division! Rebuild trust! Make a city people will share so they can see each other! I know that.

These are not the imperatives that make headlines – except maybe in a David Brooks column. They don’t sound like a winning political platform.

But think of it as the crisis before the crisis.

The free fall of trust we are experiencing is deeply dangerous because:

  • Trust is fundamental to a functioning democracy.
  • Trust is fundamental to community problem-solving.
  • And Trust is fundamental to creating equitable cities. Without empathy, why would anyone with privilege care about equity?

We need that loose web of social connections and the trust it enables to combat rising rates of isolation, political polarization, and increasing economic segregation in our cities.

If we want to tackle the big challenges our communities face – resilience, complete communities, equity, poverty – pick an issue! – we have to begin with simple acts of bringing strangers together, not online, not digitally, but in place.

The importance of the public realm

This is why the way we make and manage place – particularly public place – is so critical.

We desperately need institutions and public space that bridge our deep divides – that bring people of different incomes and backgrounds together — that make it convenient and pleasant for us to be in the company of strangers – that do not require us as individuals to be well-intentioned or “progressive.”

Unless we are content to let Houston and other U.S. cities become like those in the developing world – armed camps of the wealthy surrounded by poor people – we must make communities where people enjoy mixing it up with others… where they will live a portion of their lives in public, not because they are forced to do so, but because it is delightful to do so. It’s an opportunity not to be missed.

This is a lot harder than it sounds. But there are some bright spots, and we can learn from them.

At very small scale, the 2015 Pop-up Pool in Philadelphia’s Francisville neighborhood became a terrific vehicle for people coming together across income and race.

You surely know that public pools have tortured racial histories, and as private pools proliferated, support for and investment in public pools waned, leaving a customer base consisting of those with no other options. In Philadelphia, that meant low-income, African-American and mostly under age 18.

A young planner named Ben Bryant took a look at Philly’s pools and saw the potential for a much more dynamic neighborhood asset. Strategically located between a neighborhood of concentrated poverty and a neighborhood that had a lot of new investment, the Francisville pool had the opportunity to become a welcoming place for everyone.

It wasn’t expensive or complicated: Ben added a few beautiful seating options where there were none, brought in a few palm trees, and made the space inviting and aesthetically pleasing. The pool staff added some water Zumba classes to the schedule. Then, Ben promoted it on social media and had an immediate hit. The pool didn’t lose its existing patrons, but it gained new popularity with residents who discovered the pool for the first time. People of different economic status, different ages, and parents and children happily filled up the pool together.

It took less than a month for the City of Philadelphia to announce that it planned to convert more of its pools to the pop-up pool model. And now, New York has followed suit.

At a broader scale, we’ve adopted some of the lessons of the pop-up pool to reimagine the civic infrastructure that exists in neighborhoods all across our communities – the neighborhood parks, pools, libraries, rec centers, community centers, even police stations. Like the pop-up pool, a lot of these assets have been poorly maintained and underinvested, as people who could afford to buy their services elsewhere abandoned the public option. Think about it: If you could afford to join a gym, you stopped going to the community rec center. If you could afford to buy your books and buy your internet access, you stopped going to your local library.

But that only reinforces our divisions and further polarizes us.

For the past three years, the Kresge Foundation, along with partners at Knight, JPB and Rockefeller, has been funding demonstrations by collaborations in five cities to Reimagine the Civic Commons, bringing new design, new management, new programming and new intention about outcomes to these neighborhood-based civic assets. Once reimagined, we expect these assets to produce socio-economic mixing, increased civic engagement, environmental sustainability and increased value in surrounding neighborhoods. Those are the ways we are measuring success.

And it is increasingly the way we will need to measure all future investments. We can no longer afford to build single-use or single-outcome infrastructure. We just don’t have that kind of money.

Sometimes, it’s one of a city’s glamour assets that can play the role of common ground: our riverfront in Memphis, the riverfront in Detroit, your amazing Bayou [By-you] Greenways. I had a chance to ride the Greenways today, and it’s so easy to see the vision for a connected, resilient, civic Houston embodied in this infrastructure. Wow! Its potential to re-knit Houstonians across the city and across demographic and economic divides is breathtaking.

Portland, Oregon tackled the challenge of polarization at city-scale.

A few years ago, I had the opportunity to interview the founders of modern-day Portland, to discover how that city had set itself on a path from sleepy, third-tier military town to a model of robust public life.

They decided if they were to engage Portlanders in the civic life of their community, they had to be convinced to “live life in public.”

In other words, people had to be lured from the comfort and privacy of their living rooms and backyards to share public life in the company of strangers.

At the time there were a lot of impediments. There was a prohibition against playing music in the park. Sidewalk cafes were illegal. So they set out to eliminate as many as possible of the rules and barriers that discouraged public life. And today, Portland has a wonderfully rich public realm and many signs of robust civic life.

What can cities do?

Can Houston make “encouraging public life” one of the tests of all of its plans for the future? It’s a question you should be asking.

By most measures today, Houston is a success. A lot of things are going right here.

But the future of any city is not inevitable. Today’s success will not last forever.

Remember that Detroit not too long ago was our nation’s equivalent of today’s Silicon Valley. Its innovation and prosperity made it a mecca for people seeking good jobs and a better life for their families. But its fall, when it came, was dramatic, and only now is Detroit clawing its way back into significance, with a lot of help from foundations like Kresge, and entrepreneurs like Dan Gilbert. Even with their efforts, Detroit’s population is still not growing.

And the reverse is also true. A city’s decline does not necessarily define its future.

Consider Seattle. Not quite 50 years ago, there was the famous billboard that read: Will the last person out of Seattle turn out the lights? That’s when Seattle was a Boeing town and lost 65,000 jobs. Now, it’s an Amazon town, and the worry is too many jobs and too much gentrification.

In the 1980s, Jimmy Carter’s Commission for a National Agenda wrote off cities. That’s how desperate the condition of cities was then. The outlook was so bleak for cities that the commission could not imagine a comeback and recommended ending federal place-based funding.

And yet, cities have come back with a vengeance, driven by the emergence of a new urban economy fueled by eds and meds and the desire of highly-educated young people to live in cities. So much for inevitability.

It’s a warning sign for successful cities like Houston:

Tom Bacon, chairman of the Houston Parks Board, recently said: “This is the moment that citizens have to insist that we catapult ourselves to absolute thought leadership in how to develop a city in the 21st century.”

One of the things we know about the 21st century city is this:

It must be developed in ways that allow us – no, compel us — to feel our neighbors.   Because the successful 21st century city will be built with trust, empathy and equity.

I look forward to seeing the 21st century city Houston becomes.


The long tail of the housing bust

Adjusted for inflation, US home prices are still lower than in 2006

For most US households, the home they own is their biggest financial asset. After the housing bust of 2007, when collectively about $7 trillion in home value was wiped out by declining house prices, many people have looked to subsequent increases in home prices as an indication that things are getting back to normal, at least in the housing market.

For example, the US Census Bureau reported that in the third quarter of 2018, the average asking price for a home for sale ($206,400) exceeded the previous peak price recorded in 2007 ($201,500).  Surely this means that the effects of the housing bust are now behind us, right?

But there’s a problem in making these comparisons over long periods of time:  as the footnote in this chart concedes, they don’t account for the effects of inflation. If we’re going to appraise whether housing is really a good investment, we need to know whether its price is increasing faster than inflation. And as we’ve long argued at City Observatory, there’s a flip side to this debate:  if house prices are rising faster than inflation, it suggests that housing is becoming relatively more expensive and unaffordable. Housing can’t be both a great investment and affordable.

To get a more accurate picture of the affordability and return on investment from housing, we’ve adjusted a major home price index for inflation. Our nominal home price data comes from Zillow, which reports monthly its estimate of the Zillow home price index for the nation and for every metropolitan area.  While people love to pick nits with Zillow’s estimates of the value of their houses, its model of regional housing prices is one of the best out there–it was just adopted by the Federal Reserve Board’s economists as their preferred data series for estimating American households’ housing wealth. To adjust for inflation, we use the consumer price index for urban consumers, excluding shelter. We’ve computed the average price of single family homes, per Zillow’s data, in inflation-adjusted 2018 dollars.
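The adjustment itself is a simple deflation: multiply each nominal price by the ratio of the base-period CPI to the CPI in the price’s own period. Here’s a minimal sketch; the CPI values are illustrative assumptions, not the actual CPI-less-shelter series used in the analysis:

```python
# Deflate a nominal dollar figure to constant base-period dollars
# using a consumer price index. CPI values below are illustrative only.

def to_real(nominal: float, cpi: float, base_cpi: float) -> float:
    """Express a nominal price in base-period (e.g., 2018) dollars."""
    return nominal * base_cpi / cpi

# Hypothetical example: a $201,500 price observed when the CPI stood at
# 202, restated in dollars of a later period when the CPI stood at 239.
real_price = to_real(201_500, cpi=202.0, base_cpi=239.0)
```

The same one-line transformation, applied month by month to the Zillow series with the CPI-less-shelter index, produces the inflation-adjusted (red) line in the chart below.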

Here’s the national picture.

The blue line shows home prices in nominal dollars, while the red line shows the price of homes in 2018 dollars for the entire two-decade long period. According to Zillow, while the nominal price of housing has recovered the level recorded at the peak of housing market bubble, the picture is considerably different if we look at real, inflation-adjusted prices. Real home prices peaked at about $238,000 in the fourth quarter of 2006, and today are still about $20,000 less than that amount, $218,000.

Timing matters. On average, the typical person who bought a home in the United States between 2004 and 2008 would find that today, in inflation-adjusted terms, their house was still worth less than it was when they bought it. If you had the foresight (and the credit score and down payment) to buy into the market in 2012, when inflation-adjusted prices hit rock bottom, you’ve seen a healthy appreciation over the past six years. But as we’ve pointed out at City Observatory, swings in credit availability work to systematically disadvantage lower income and lower wealth buyers, who tend disproportionately to be people of color. When it makes sense to buy (when prices are low), mortgage lending is typically difficult to obtain. During the bubble, as we recall, when it was a bad idea to buy, lending standards were relaxed, and many households that couldn’t weather the downturn bought homes that soon plunged them into default or bankruptcy, which is a key reason for the disproportionate decline in home equity wealth and home ownership for people of color.

Regional Variation

The national averages mask considerable variation across the different regions of the country.  In some places (like San Francisco), the housing market is booming and is hitting new highs.  In others, home values languish well below the peaks established during the housing bubble.  Adjusting for inflation shows which places have made a real comeback in home values, and which ones are still, in real terms, less valued than they were in 2006.

This map shows the percentage change in real home prices for the nation’s fifty-three largest metropolitan areas.  Metros shaded in green have housing prices that are higher in inflation-adjusted terms than in 2006; metros shaded in red have housing prices that are lower now than in 2006. Clicking on individual metro areas shows the 2006 and 2018 home price values for that metropolitan area.

Of these, 21 metropolitan areas have higher real home prices now than at the market’s peak in 2006.  Most metropolitan areas–32 of the 53 largest–have real home prices that are lower now than they were in 2006.  By a large margin, most of the residents of these large metropolitan areas live in a housing market where values are still lower now, adjusted for inflation, than twelve years ago.  Some 124 million people live in the 32 metropolitan areas where real housing prices are still below 2006 levels; just 50 million live in the 21 metropolitan areas where real housing prices are now higher than in 2006.

The takeaway:  Housing isn’t always and everywhere a good investment

It’s tempting to just look at the nominal dollar figures on average home prices and assume that the housing bubble and subsequent bust are a receding memory. But the damage from that cycle is still apparent in a majority of large American metropolitan areas. Between 2006 and 2018, housing values haven’t kept pace with inflation, meaning the real value of housing has declined. In the face of those who are telling you that “now is a good time to buy,” it’s good to remember that the housing market is actually a risky one.

Cities, talent and prosperity

America’s economy is increasingly driven by the concentration of talent in cities

The Economic Innovation Group (aka EIG, a DC-based think tank) has been compiling some interesting data on the relative economic performance of different parts of the US, in the form of their “Distressed Communities Index.”  The recently enacted federal opportunity zones program is a brain-child of EIG, so identifying which places are distressed is a key aspect of their worldview.

They’ve released a new report, “From Great Recession to Great Re-shuffling,” offering a highly detailed view of change in the US economy over the past decade. Their report is based on a set of indicators measuring the relative economic performance of more than 25,000 zip codes. EIG has assembled data on the job characteristics and demographics of these zip codes from federal employment and Census records. The result is an interesting pointillist portrait of where the economy is doing well (and where it isn’t). They’ve got an on-line map showing relative economic health.  Best-scoring zip codes are in blue, lowest-scoring in orange and red.

This is a lot of data to make sense of, so the EIG report aggregates data for these 25,000 zip codes into five groups (with roughly 5,000 zip codes in each group), based on the relative performance of zip codes on a composite index of seven factors including income, poverty rates, education, business formation and job growth.  They’ve labeled the five groups “prosperous, comfortable, mid-tier, at-risk, distressed.”

EIG’s own analysis highlights a number of interesting results from this aggregation: a large swath of the country still has fewer business establishments now than it did before the Great Recession, most job creation and business formation has been concentrated in more populous areas, and there’s been a net loss of population from the bottom two-fifths of zip codes since 2007.

The importance of education

Well down the EIG list of findings was the one we found the most telling:  the increasing concentration of talent in the most prosperous places. Here’s their chart showing the composition of the population of each of their five zip-code quintiles, based on educational attainment.


What’s striking is that each of these five groups has almost exactly the same number of people with less than a BA degree (about 30 million).  The pronounced difference among places is in the number of adults with a BA (light blue) and the number with a graduate, professional or doctoral degree (dark blue).  The most prosperous quintile of zip codes has 27.7 million adults with a bachelor’s degree or higher, nearly six times more than the 4.8 million that lived in distressed zip codes.

Even more striking is the pattern of change over the past decade or so.  The EIG report compares American Community Survey data from two time periods (2007-11 and 2012-16).  They report that the number of advanced degree holders in the US increased by 3.7 million in that time. The increase in advanced degree holders was heavily concentrated in the most prosperous zip codes. Half of the increase in those with an advanced degree occurred in the top quintile of zip codes; only 5 percent of the increase was recorded in the lowest, “distressed” zip codes.

On one level, this data confirms the clustering of talent in the most prosperous places. Zip codes that have and attract well-educated workers are performing much better economically than those that don’t.  As we’ve stressed at City Observatory, much of what’s going on here is the clustering of talent in the largest metropolitan areas, and within those areas, in close-in urban neighborhoods. The EIG data are consistent with this analysis. Their report observes a strong correlation between the educational attainment of a metropolitan area and the share of its population that lives in top-tier “prosperous” zip codes:

Metro area findings reinforce the education advantage. Tellingly, seven of the 10 major metro areas in which the largest share of the population resided in a prosperous zip code also ranked in the top 10 for bachelor’s and advanced degree attainment. . . . Conversely, six of the 10 major metro areas with the largest shares of their population living in distressed zip codes ranked in the bottom 10 on college degree attainment nationally.

The limits of zip codes

This is rich and detailed data, but zip codes are far from ideal as a geography for characterizing economic performance. They don’t represent complete economies in any sense. Some zip codes (a downtown business district or an industrial area) are job centers, typically with far more employed workers than residents. Other zip codes are primarily residential, with few jobs (other than some local retail and service jobs). Few people live and work in the same zip code; almost no one, especially in a large metropolitan area, has their economic prospects defined by the number or kind of jobs that happen to be in their zip code.  For most Americans, one’s zip code reflects the kind of housing one can afford, rather than one’s place of employment. In important respects, zip code measures are really more revealing about sorting and economic segregation than about localized patterns of economic opportunity.

Let’s take a closer look at a couple of different metropolitan areas (Detroit and Orlando) to see the differences in geographic patterns of distress. First, here’s a map of the distress index for Southeast Michigan.  It shows high levels of distress in the City of Detroit proper (dark red), surrounded by areas of blue (with a fair amount of dark blue); illustrating a well-known disparity in economic conditions between the city and its suburbs. (At this level of detail, the maps shift to a more cubist appearance).

Distress Index, SE Michigan (EIG)

And in contrast, here’s the same map for the Central Florida region centered on Orlando.  Here the pattern is much more of a random patchwork, with few concentrations of extremely high performing or low performing zip codes in any particular area.  In contrast to the stark city/suburb divide in Detroit, high performing and low performing zip codes are found all over Central Florida.

Distress Index, Central FL (EIG)

The challenge in interpreting the data from the EIG report is that national-level aggregations of zip codes don’t tell us anything about the spatial pattern of economic performance within metropolitan areas. One’s diagnosis of the economic problems of a “distressed” zip code in Detroit (surrounded by dozens of other distressed zip codes) should probably be very different from that of one in Orlando (where distressed zip codes are few and tend to be adjacent to healthy zip codes).

Get out! Why economic mobility might mean leaving home

Part of the disparity in intergenerational economic mobility may stem from a willingness to leave home

Raj Chetty, Nate Hendren, and their colleagues at the Equality of Opportunity Project have crafted a rich picture of the role that community plays in long-term economic opportunity. We’ve highlighted some of their findings in the past couple of weeks.  For example, proximity to jobs and nearby job creation doesn’t seem to have any positive impact on the rate of intergenerational economic mobility.  It also seems to be the case that for black children, the characteristics of the neighborhood they grow up in are even more important than they are for white children.

 

Daniel Kaluuya stars in Jordan Peele’s 2017 “Get Out”

The latest product of the Equality of Opportunity Project is their Opportunity Atlas, which provides a rich array of data on the characteristics and outcomes of neighborhoods throughout the nation. One feature of their new Atlas caught our eye.  They include a datapoint that measures the fraction of children who grew up in a city who continue to live there as adults. For clarity, “city” in these terms means something called a “commuting zone” or CZ, an area defined by the federal government to include a metropolitan area and its surrounding counties. Every part of the nation falls within one of more than 700 commuting zones, which makes them a useful sub-state unit for characterizing economic activity. Specifically, this variable measures the percent of kids who grew up in a commuting zone who live in that same commuting zone as adults.

One of the frequently cited motivations behind economic development programs, especially in small towns and rural areas, is the hope that by creating more jobs, or the right kind of jobs, the community will be able to hold onto its kids as they grow up and move into the workforce.

The Washington Post profiled Las Animas, a town in rural Colorado that has seen a struggling economy and steady out-migration. While parents may be stuck there, their children aren’t.

Most of their kids have already left town, for good reason.

“I’m just as guilty as the next,” says Frazier, a big guy in a Green Bay t-shirt. “I encouraged them — get out of here. There’s nothing here.”

The Opportunity Atlas bears out this observation:  Only about 52 percent of the kids who were raised in the Las Animas area still live there, well below the national median of 68 percent for remaining in one’s commuting zone.

One of the interesting sidelights of the Opportunity Atlas is a rich new picture of where kids ended up living. Because they tracked children from the time they grew up until they were adults, Chetty, Hendren and colleagues have an unusually large longitudinal picture of migration trends. Their Opportunity Atlas shows what fraction of kids who grew up in an area still live there as adults.  Here’s the map for the United States.

There are some very strong regional patterns. In the Great Plains and Inter-Mountain West (Iowa, Nebraska, the Dakotas, Montana, Wyoming), a relatively small fraction of kids remains in town (the dark red areas).  In contrast, in much of the South, particularly in Appalachia and the Mississippi Delta, a high fraction of those who grew up locally still live in the same commuting zone (dark blues/greens).  Some big metropolitan areas (Los Angeles, Chicago, Houston, etc.) also retain a high fraction of kids who grow up locally, but for different reasons than in rural areas.

Contrast this map with another one that should be very familiar to those who know the Chetty/Hendren work.  This map shows intergenerational economic mobility: the probability that kids who grew up in low-income households achieved higher earnings as adults.  There’s a strikingly similar regional contrast, with relatively high economic mobility in the Great Plains states and relatively low economic mobility in Appalachia.  The following map shows children’s average income as adults, with darker blue and green colors representing relatively higher incomes and reds and oranges lower adult incomes.

This suggests to us a hypothesis:  One of the chief ways that an area can contribute to the economic prospects of its children may be to prepare them to go somewhere else to pursue their dreams. One of the reasons that children growing up in rural areas of the Great Plains do better as adults may be that they are more likely to move away as adults. Conversely, those in much of the rural South may find it difficult to improve their life prospects because they tend not to move far from where they grew up. At this point, our observation is just a conjecture, but with great tools like the Opportunity Atlas, it’s something we can get a much better handle on.

Finally, speaking of “Get Out,” be sure to get out and vote next Tuesday, November 6.  Your nation needs you.

The limits of job creation

Whether at the neighborhood or metropolitan level, more job growth doesn’t seem to improve economic mobility

There’s a seemingly unquestioned (and unquestionable) truth among economic development practitioners that more job creation is the universal answer to problems of economic opportunity. If our neighborhood (or city or region) could just grow more jobs, or grow them at a faster rate, there’d be lots more opportunity for the poor and disadvantaged to lead a better life. We support job creation efforts because we believe that the benefits will flow especially to those at the bottom of the income spectrum.

The “proximity to jobs and job growth” argument is more than just a widely repeated shibboleth of economic development practitioners. For decades it’s been encapsulated in the “spatial mismatch” hypothesis, propounded by economist John Kain in 1968; he argued that the suburbanization of jobs meant that low-skilled urban workers left in segregated low-income urban neighborhoods were progressively further from jobs. As jobs moved away from cities, these workers suffered accordingly. The implication of this hypothesis is that if we could just get more jobs closer to these disadvantaged populations, they’d have more opportunity.

In the past few years, we’ve gotten a much clearer and more detailed picture of who’s flourishing (and where) thanks to the research of Raj Chetty, Nate Hendren, and their colleagues at the Equality of Opportunity Project. Using de-identified tax records, they’ve calculated the intergenerational economic mobility of Americans, identifying who has moved up. A core indicator is the extent to which children born to the poorest families move up in the income distribution and have higher relative incomes than their parents. It turns out that some cities and some neighborhoods do a much better job of enabling children to succeed as adults.

Their latest– The Opportunity Atlas: Mapping the Childhood Roots of Social Mobility–examines data at the neighborhood level to judge the correlates of economic mobility. We highlighted the on-line Atlas they’ve created, which enables you to see which places do the best job of giving kids a chance to succeed economically. The accompanying research paper steps back and asks what we can learn from these patterns about the characteristics of local communities that seem to underlie economic  mobility.

One specific point of inquiry uses data from the Atlas to test the connection between job growth and intergenerational economic mobility. Does living in a neighborhood with a wealth of jobs nearby improve the lifetime economic prospects of kids growing up there? The short answer is “no.”

The authors use data from the Census Bureau’s Longitudinal Employer-Household Dynamics (LEHD) program to estimate the number of jobs within five miles of every census tract (neighborhood) in the US, and then compare variations in job proximity to their measures of intergenerational economic mobility. They find that the density and growth of jobs bear no statistically significant relationship to the lifetime earnings prospects of kids growing up nearby.

[The number of jobs within five miles] is slightly negatively associated with upward mobility, with a correlation of -0.175 (s.e. = 0.004). The number of “high-paying” (annual pre-tax wages above $40,000) jobs exhibits a similar pattern. We also find small correlations with the rate of job growth between 2004-2013, the period when children in our sample were entering the labor market. In short, there is little evidence of a positive association between local availability of jobs and upward mobility, challenging spatial mismatch theories of economic opportunity (Kain 1968).
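The tract-level analysis described above can be sketched in miniature. The actual study works from confidential LEHD microdata; the functions below are an illustrative reconstruction that assumes tract and job-site coordinates already projected into miles, not the authors' code:

```python
import numpy as np

def jobs_within_radius(tract_xy, job_xy, job_counts, radius_miles=5.0):
    """For each tract centroid, total the jobs at sites within radius_miles.
    Coordinates are assumed to be pre-projected into miles (a simplification)."""
    tract_xy = np.asarray(tract_xy, dtype=float)
    job_xy = np.asarray(job_xy, dtype=float)
    counts = np.asarray(job_counts, dtype=float)
    totals = []
    for t in tract_xy:
        dists = np.hypot(*(job_xy - t).T)      # Euclidean distance to each site
        totals.append(counts[dists <= radius_miles].sum())
    return np.array(totals)

def pearson_r(x, y):
    """Pearson correlation between two tract-level measures,
    e.g. nearby-job counts and upward-mobility estimates."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))
```

Run against the real data, a correlation like `pearson_r(nearby_jobs, mobility)` would come out slightly negative (the paper reports -0.175), which is the basis for the authors' rejection of the spatial mismatch story.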

The same finding holds at the metropolitan level.  Kids who grow up in regions or metropolitan areas where job growth is strong don’t tend to exhibit greater intergenerational economic mobility as adults than kids growing up in weaker labor markets.  Here’s a scatter diagram of data for the 50 largest commuting areas (major metropolitan areas and their surrounding commuting zones), showing intergenerational economic mobility on the vertical axis and job growth on the horizontal axis.  As the report notes, there is essentially no correlation between the two measures.

 
As the authors conclude:

The upshot of these findings is that a booming labor market does not automatically translate into greater upward mobility for local residents. Hence, policies targeted based on job growth rates would reach quite different areas from the places where upward mobility is lowest. More broadly, the factors that lead to highly productive labor markets with high rates of job, wage, and productivity growth apparently differ from the factors that promote human capital development and result in high levels of upward income mobility across generations.

What does matter?  While the number of local jobs doesn’t seem to make much difference to lifetime employment prospects, the share of neighbors who are employed does. Chetty, Hendren and co-authors look at the correlation between the rate of adult employment and the intergenerational mobility of children.  The bigger the fraction of local adults who have jobs, the more likely it is that kids will be successful as adults:

. . . we find a strong positive correlation of 0.349 (s.e. 0.004) between the employment rates of the local residents in a neighborhood and the outcomes of children who grow up there. Evidently, what predicts upward mobility is not proximity to jobs, but growing up around people who have jobs.

They also find a positive relationship between economic mobility and neighborhood-level measures of social capital, proxied by the fraction of the local population that returns its census forms by mail (i.e., without requiring a separate personal contact to elicit a response).  Children who grow up in neighborhoods with higher levels of social capital also have greater economic success as adults.

The non-existent relationship between economic mobility and local job growth poses a major intellectual challenge for many economic development programs and policies, particularly if they’re promoted as ways of redressing poverty and promoting greater equity. Take for example the recently enacted Opportunity Zone program, which confers tax breaks on investments in low income neighborhoods. Even if such measures do succeed in creating more nearby jobs, it may be that this does little to promote the lifetime prospects of kids growing up in these neighborhoods.

Instead, if we care about opportunity, we need to be looking for ways to improve these kids’ human capital (give them greater access to education), strengthen their personal networks (as exemplified by greater contact with peers, neighbors and friends who are employed), and build social capital. These are not the usual targets of many job creation strategies.

Raj Chetty (Harvard University and NBER), John N. Friedman (Brown University and NBER), Nathaniel Hendren (Harvard University and NBER), Maggie R. Jones (U.S. Census Bureau), and Sonya R. Porter (U.S. Census Bureau), The Opportunity Atlas: Mapping the Childhood Roots of Social Mobility, October 2018.

Exit, Hope and Loyalty: The fate of neighborhoods

How neighborhood stability hinges on expectations:  If people don’t believe things are going to get better, many will leave

One of the most perplexing urban problems is neighborhood decline. Once-healthy middle-class or working-class places seem to gradually (and then abruptly) fall from grace.

As we documented in our report Lost in Place, the number of urban high-poverty neighborhoods in the United States almost tripled from 1970 through 2010. Most of the growth was in places we called “fallen stars”: places that had lower-than-average rates of poverty (under 15 percent) in 1970, but had become places of concentrated poverty (with poverty rates of 30 percent or more) by 2010. What had been the solidly blue-collar neighborhoods of many cities had, over four decades, become among the nation’s poorest–the kinds of places that the research of Raj Chetty and his colleagues shows permanently reduce the lifetime economic prospects of kids raised locally.

The mental picture we draw of neighborhoods and their residents is often a highly idealized and simplified one. Particularly in describing high-poverty areas and gentrification, we often focus on impacts on “long-time residents.” The idea is that people tend to be very tied to a particular neighborhood and are either scarred by its deficiencies (high-poverty neighborhoods) or disadvantaged by change (higher rents due to gentrification). What this simplified picture misses is that not everyone who lives in a neighborhood has (or necessarily wants) long-term tenure.

Some people may have deep ties to a place. Their family may have lived there for generations, they may own a home (or a business), and they may be deeply involved in community institutions and activities. They may have enduring attachments that lead them to stay, regardless of whether the neighborhood improves or declines.

But many residents, and arguably most renters, have far weaker ties to particular neighborhoods. The average tenure of a renter in the United States in his or her apartment is less than two years. For these residents, a particular neighborhood may be simply one stop on a journey that ultimately takes them elsewhere. And for many, moving to a different neighborhood may be aspirational: getting a bigger yard, a shorter commute, better schools or safer streets may matter more than staying put. There’s abundant evidence that moving to a different neighborhood is one key way that families better their living conditions and prospects.

In his famous book, Exit, Voice, and Loyalty, Albert Hirschman boiled the choices a citizen has for influencing a larger organization down to three broad categories:  Exit (one could leave if one wasn’t satisfied), Voice (one could try to change the organization by speaking out), and Loyalty (one could go along with the organization as it was, and expect some reciprocity for one’s commitment).

In urban neighborhoods, we would adapt that trio as “Exit, Hope, and Loyalty.” In the face of neighborhood decline, it seems that residents have a couple of choices. They can stick around (loyalty) or, if they want, they can leave.  Whether they do so or not depends critically on that middle factor “hope”–if people lose hope, if they have no reasonable expectation that things are going to get better, they may be well advised to cut their losses and leave before things get worse (for example, selling before their home has lost even more value).  Some will be extremely loyal to a neighborhood.  They may have deep ties, personal, social, and financial, and be willing to expend considerable personal energy to help their neighborhood survive, thrive and improve.

But for many, the choice of exit may be highly pragmatic. Some may not have deep roots in a community, and leaving may imply little personal or social loss. Even those with roots or loyalty however, may make the carefully calculated decision that they can’t afford to wait for the neighborhood to get better in order to improve their own lives (and the lives of their families and children). Even those who are loyal are unlikely to remain, if they’ve lost hope.

One of the paradoxes of civic engagement is that it is essentially uncorrelated with measures of community satisfaction.  People tend not to speak out, attend meetings, write letters, and so on, when they’re satisfied that things are going well in their neighborhoods.  A 2010 survey of residents of 29 cities by the Gallup Organization for the Knight Foundation found that civic engagement was the factor least correlated with community satisfaction.

The challenge, of course, is that once the dynamic of “exit” takes hold, it is self-reinforcing. When the people with means and choices start leaving a neighborhood, it can lose its economic and social vitality.  Fewer residents and fewer middle-class families mean less money to support local businesses, and more limited social capital to support schools and other institutions and to maintain the community’s collective sense of identity and well-being. As Alan Mallach describes, neighborhood decline starts a cascade of events, a decline in property values that saps community wealth, leading to a downward spiral:

As the number of vacant properties increases, the value of the remaining properties declines further, and the confidence of the remaining homeowners begins to disappear.  Signs of disorder begin to appear, from litter in the gutters to graffiti on the walls of vacant houses or storefronts. Decline gradually undermines a neighborhood’s ability to maintain its stability in the face of problems.

Writ large, this process is fueling urban decline and concentrated poverty in many of the nation’s metropolitan areas. A recent report from the University of Minnesota’s Institute for Metropolitan Opportunity shows that the most common pattern of neighborhood change in the US is for low income residents to find themselves in neighborhoods of even more concentrated poverty.  As Jason Segedy writes:

Instead of displacement by gentrification, what we are seeing in most cities could be described as displacement by decline – as black middle class residents, in particular, frustrated by the continued social and economic disintegration of their neighborhoods, are moving to safer and more attractive neighborhoods in the suburbs.

Whether residents of struggling urban neighborhoods choose to remain loyal to the place they live, or instead decide to exit, depends directly on hope:  do they have a reasonable expectation that their neighborhood is going to get better, or at least not deteriorate further? Consequently, one of the critical factors in stabilizing or revitalizing these neighborhoods is addressing these expectations. Our colleague Carol Coletta once asked then-Mayor, now Senator, Cory Booker what the toughest challenge was that he faced as Mayor of Newark.  He replied, “Convincing people that things can be different.”  If the residents of a community no longer believe it is going to get better, not only will they be less inclined to invest their time and energy in supporting it, but their attitudes will likely fuel a more widespread perception of neighborhood decline; and this stigma, once established, may be impossible to reverse.

A key challenge for fighting neighborhood decline, and overcoming concentrated poverty is building a credible expectation on the part of neighborhood residents that things are going to change, and change for the better. Building hope is needed to reinforce loyalty and minimize exit. That’s a major task for urban leaders.

 

Does your neighborhood help kids succeed?

The Opportunity Atlas: Stunning neighborhood maps of economic opportunity

Some of the most important research findings of the past decade have come from the work of Raj Chetty and his colleagues at the Equality of Opportunity Project. They’ve convincingly shown the effect of community attributes on the life chances of kids. Their latest work, The Opportunity Atlas, is a masterpiece of data analysis, presentation, and accessibility.  It’s a detailed set of nationwide maps (with embedded data) that shows how individual neighborhoods influence the life chances of children. If you really want to understand how your neighborhood is (or isn’t) working for the kids growing up there, you’ll want to go to this site immediately and spend a lot of time looking at what it shows.

One of the limits of most of our socioeconomic data is that it is based essentially on a series of disconnected, one-time snapshots of city or neighborhood conditions. But Chetty’s work is unique in that it is a longitudinal look at change over time, for example, looking at adult earnings and other outcomes (employment, marriage, incarceration) based on where children grew up. Another limitation of most analyses is that even our biggest data sets, like the American Community Survey, are based on sample data, so there’s a pretty noticeable margin of error, especially in estimates for small geographies.  Chetty and his researchers had access to the complete tax return data of Americans (anonymized, of course) to construct their Opportunity Atlas.  The Atlas is based on records for 20 million children born between 1978 and 1982, tracking their progress through their thirties. The data was assembled with the support and assistance of the Census Bureau. Taken together, this data gives us a unique and detailed view of the ways in which community characteristics influence intergenerational economic mobility and a range of social outcomes.

The Opportunity Atlas takes this extraordinary analysis of lifetime outcomes to a whole new level:  the neighborhood level. While previous work used the geography of commuting zones (areas slightly larger than metropolitan areas) or counties to analyze variations in lifetime outcomes, this new work mines and presents data down to the neighborhood level, using census tracts (areas with an average population of about 4,000).

In addition to the geographic detail, the Atlas offers a wide range of measures of success. The data includes adult earnings, but also reports the fraction of persons incarcerated as adults–again based on the neighborhood in which they grew up.  There’s also data on marriage, college attendance and fertility.

And all their work is delivered transparently in an on-line resource that lets you look at neighborhood by neighborhood variation in outcomes by the income, race and gender of different households.  For example, you can look at within city variations in the adult earnings of girls from low income Hispanic families based on the neighborhood in which they were raised. There’s even data reporting what fraction of kids who grew up in a particular neighborhood remained in the same city as adults.

For example, here are a couple of detailed maps for the Detroit metropolitan area that compare the adult earnings of black kids and white kids based on the neighborhoods they grew up in during the 1980s.  Reds and oranges correspond to relatively low adult earnings; blues and greens correspond to higher adult earnings.

Detroit:  Income of black kids who grew up in these neighborhoods

This map illustrates the relatively low adult earnings for black children who grew up in the City of Detroit.  Except for a handful of neighborhoods, adult incomes were in the bottom half of the distribution. Meanwhile, black children growing up in suburban neighborhoods, especially to the northwest and southwest of Detroit proper, earned more as adults.

Income of white kids who grew up in these neighborhoods

The second map shows the adult earnings of white children based on the neighborhoods they grew up in. Earnings tended to be lowest for those who grew up nearest to the city of Detroit (the city’s boundary is shown as a black line).  Those who grew up in more distant suburbs tended to have the highest adult earnings.

The negative space on these maps also clearly shows one other feature of the Detroit area: the profound racial segregation of the city and its suburbs.  The gray areas on each map represent neighborhoods where there were too few observations (fewer than 20 children) to make statistically valid estimates of outcomes, i.e. too few black children (upper panel) or too few white children (lower panel). A huge swath of the suburbs to the north and west of Detroit had too few black children to produce estimates; a majority of neighborhoods in Detroit had too few white children to produce estimates. (A hat tip to New York Times Upshot reporter Emily Badger for pointing out how the Chetty work illustrates the neighborhood segregation.)

What we’ve shown here–adult incomes for kids growing up in just one city–is just a tiny fragment of what’s available and possible using this website. There’s so much more here than we can possibly describe in a single commentary. The website allows you to download data for individual neighborhoods, and to upload your own data to examine how other factors interrelate with the opportunity measurements presented here. There are a series of case-study stories that describe different aspects of the data as they play out in different communities around the country.

This is truly one of the most extraordinary resources available for understanding cities.

Does new construction lead to displacement?

A careful study of evictions in San Francisco says “No.”

There’s a widespread belief among some neighborhood activists that building new housing triggers displacement. We, and most economists, are highly skeptical of that argument at the metropolitan level, but it’s at least theoretically possible that there could be some neighborhood effects: for example, building a nice new building could change the perception of a neighborhood and make the existing housing nearby more attractive and more valuable. (There’s the further unstated assumption that this spillover effect more than offsets the downward pressure on prices from additional supply.)

If there were any place where one would expect to find this localized displacement, it would be San Francisco. The city is famous for its extraordinarily expensive housing and the great difficulty with which new housing is approved. There are also many neighborhoods on the cusp of gentrification.

A recent study by the University of California’s Kate Pennington takes a close look at data on eviction notices in San Francisco to see whether new development leads to an increase in displacement nearby. Her overall finding is that new construction has no discernible statistical effect on the rate of evictions. Describing her findings on the effects of new market-rate housing on nearby eviction rates, Pennington reports:

Each of these point estimates is a precisely estimated zero, no larger than 0.05 percentage points and always statistically indistinguishable from zero. This means that the monthly probability of an eviction notice being issued does not change due to the completion of new housing nearby.

Pennington’s study is remarkably detailed and meticulous.  She gathers project-by-project and block-by-block data on new construction and inventories all of the eviction notices issued in San Francisco over the course of a decade.  She looks at eviction rates before, during, and after construction of new housing, and disaggregates her analysis for market-rate and affordable housing. She even separately analyzes more than 1,000 different housing projects to see if particular ones were associated with upticks in eviction activity.
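The core before/after comparison in this kind of design can be sketched simply. The twelve-month window and the shape of the input data below are our own simplifying assumptions for illustration, not Pennington's exact specification:

```python
import numpy as np

def before_after_effect(notice_counts, completion_month, window=12):
    """Compare average monthly eviction-notice counts on blocks near one
    project in the `window` months before vs. after its completion.
    Returns the after-minus-before difference in mean monthly notices."""
    notices = np.asarray(notice_counts, dtype=float)
    before = notices[max(0, completion_month - window):completion_month]
    after = notices[completion_month:completion_month + window]
    return float(after.mean() - before.mean())
```

A "precisely estimated zero" in the study corresponds to differences like this clustering tightly around 0 across more than a thousand projects (the full paper estimates this with regression controls rather than raw differences).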

Not surprisingly, the study has generated considerable controversy. Some anti-gentrification activists dispute the study’s findings, claiming that the data is incomplete and runs counter to their experiences.  Mission Local (a blog covering the city’s Mission neighborhood) worries that:

. . . Pennington’s report could serve as a cudgel for the city’s YIMBY (Yes In My Back Yard) faction, which fervently advocates for a glut of new construction at all levels of affordability.

A key issue is whether legal eviction notices are a good proxy for displacement activity. Not all displacement is associated with eviction; some tenants are pressured or harassed by their landlords. But as Pennington argues, there’s little reason to believe that landlords prefer or are more likely to use these tactics rather than legal eviction procedures when new development occurs nearby.

If you were going to find a localized displacement effect as a result of new construction changing the character of a neighborhood and generating an upgrading spillover in adjacent properties, you would most likely find it in a city like San Francisco, where prices are high, supply is tightly constrained, and new development is occurring in dense urban neighborhoods. The fact that there’s almost no discernible impact of new construction on the rate of evictions is a strong signal that we ought to be focused much more on the price-ameliorating benefits of additional supply, and worrying a lot less about whether new construction causes displacement, even locally.

Kate Pennington, “The Impact of Housing Production on Legal Eviction in San Francisco,”  June 8, 2018

Why inclusive is so elusive, Part 4: Metropolitan context

Part 4. Are racially and economically homogeneous cities and suburbs in a segregated metro “inclusive?”

Looking only at disparities within cities misses the often far larger disparities across cities within a single metropolitan area.

(Editor’s note: This is the fourth in a five-part series examining a recent Urban Institute report measuring and ranking city-level inclusiveness. Please read part 1 for an introduction to the report, and an overview of the issues it raises.)

The Urban Institute’s “Measuring Inclusion” report looks at both racial inclusion and economic inclusion, and ranks US cities based on a series of factors. As we’ve suggested, cities are far from ideal units for measuring inclusion; city boundaries are varied, and in many cases were constructed to divide metropolitan areas by race; often, city policies themselves produce exclusionary results.

In the Urban Institute report, racial inclusion is defined as a composite of a segregation index (confined to looking at segregation within city limits), and the disparity between people of color and whites in homeownership, educational attainment and poverty. These measures ignore the level of the indicators (whether education is high or low) and look only at differences between groups. So if whites and people of color have similarly low levels of, say, educational attainment, a community is regarded as more inclusive.

Let’s just zero in on the Detroit metropolitan area. Keep in mind: Metro Detroit is, by one common measure–the black/white segregation index–the most segregated large metro area in the US. Here’s the 2010 ranking of US metro areas from Brown University’s US2010 project:

Source: Brown University, US2010.

According to the Urban Institute analysis, the City of Detroit is a paragon of racial inclusion (ranking 11th of 274 cities), and so too are its suburban cities of Dearborn (49th) and Sterling Heights (43rd). Detroit is 90.9 percent persons of color, while Dearborn is 12.4 percent persons of color (271st) and Sterling Heights is 17.3 percent persons of color (264th). To look at these cities in isolation is to pretend that they are different worlds, not connected to one another in any meaningful way. But anyone with even a casual familiarity with the politics of local government and racial history in the US will know that this parsing willfully blinds us to the underlying problem.

The broader region–Detroit and all its surrounding suburbs–is profoundly segregated, more so than any other major metropolitan area in the US. You cannot find a central city whose share of persons of color differs more from that of its suburbs than Detroit’s. But to read the UI report, one would believe that both the City of Detroit and its surrounding suburban cities are exemplars of racial inclusion among American cities, all ranking in the top 20 percent of American cities for racial inclusion.
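The mechanics here are worth making concrete. The black/white segregation measure referenced above is the index of dissimilarity. A toy calculation (with made-up tract counts, not Detroit’s actual data) shows how a metro divided into internally homogeneous cities can look perfectly “integrated” when each city is measured in isolation:

```python
# Sketch of the index of dissimilarity (D), the standard black/white
# segregation measure. Toy data: a metro with two cities of four tracts
# each -- each city internally homogeneous, but the two cities have very
# different racial compositions.

def dissimilarity(tracts):
    """tracts: list of (black_pop, white_pop) tuples. Returns D in [0, 1]."""
    B = sum(b for b, w in tracts)
    W = sum(w for b, w in tracts)
    return 0.5 * sum(abs(b / B - w / W) for b, w in tracts)

city_a = [(900, 100)] * 4   # 90% Black in every tract
city_b = [(100, 900)] * 4   # 10% Black in every tract

# Measured within each city alone, segregation appears to be zero...
print(dissimilarity(city_a))                      # 0.0
print(dissimilarity(city_b))                      # 0.0
# ...but across the metro as a whole it is high.
print(round(dissimilarity(city_a + city_b), 2))   # 0.8
```

Each city, measured alone, has a dissimilarity index of zero, while the metro as a whole scores 0.8 (where 1.0 is complete segregation)–precisely the pattern that a city-limits-only segregation measure conceals.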

What’s driving the high racial inclusion ranking for Detroit and many of its suburbs is the fact that there are very small racial gaps in poverty rates, educational attainment and homeownership. The City of Detroit ranks in the lowest 10 percent of US cities for the poverty gap, education gap, and homeownership gap. But this is only because the city’s population, regardless of race, is so poor, and because so many people with choices, regardless of race, have moved away, leaving a poor, but equal population. Detroit is inclusive in its immiseration.

Nor is this peculiar to Detroit: other industrial cities experiencing population losses, such as Gary, Indiana, and Camden, New Jersey, also score highly on the Urban Institute’s measure of racial inclusion–16th and 13th respectively, again out of 274.

The disparity between metropolitan and principal city scores for segregation extends to economic segregation as well. Again, let’s focus on Detroit. The City of Detroit, according to the Urban Institute report, has the lowest level of income segregation of any of the principal cities in the 50 largest US metro areas. But the Detroit metropolitan area is one of the five most economically segregated large metropolitan areas, according to the data compiled by Reardon and Bischoff.
The city of Detroit scores well on economic segregation because it is uniformly poor. The Detroit metro area scores poorly on economic segregation because the region’s poverty is concentrated in the principal city, and its higher income households are concentrated in suburbs. Looking just at segregation within city limits conceals the region’s high level of economic segregation.

Overall, there’s actually very little correlation between principal city economic segregation and metropolitan economic segregation. As a result, city level data don’t serve as a proxy or parallel for regional economic polarization. Looking just at city level segregation data misses big variations in regional income segregation. The following chart shows principal city economic segregation from the Urban Institute report on the vertical axis and metropolitan economic segregation on the horizontal axis.  Larger values on both axes correspond to higher levels of economic segregation.

In our view the high racial and economic inclusion rankings that the Urban Institute report assigns to Detroit (and similar cities) are a clear sign that its methodology is not well-aligned with our understanding of what really constitutes inclusivity. Looking only within individual city boundaries, and ignoring the larger metropolitan context, especially the patterns of segregation among cities within a metropolitan area, produces a highly distorted view of which places are inclusive and which aren’t.

Why inclusive is so elusive, Part 3: Annexing growth

Part 3.  Do annexations and mergers constitute economic growth?

Not adjusting city job growth estimates for changes in city boundaries produces misleading estimates, especially when used for comparing and ranking cities.

(Editor’s note: This is the third in a five-part series examining a recent Urban Institute report measuring and ranking city-level inclusiveness. Please read part 1 for an introduction to the report, and an overview of the issues it raises. In part 2, we argued that city boundaries are a poor basis for comparing different regions, and that city limits seldom capture the imprint of exclusion. Today, we point out that changes to city boundaries over time make them a poor basis for gauging economic success, one pillar of the Urban Institute report.)

The Urban Institute’s recent report, Measuring Inclusion, focuses heavily on the relationship between economic growth and inclusiveness. Does economic growth promote inclusion? Is inclusiveness helpful for stimulating growth? These are clearly interesting and important questions.

Setting aside for a moment whether city boundaries correspond to functional economies (they mostly don’t), a critical question is whether data reported for cities is comparable over time. The problem is that city boundaries often change from year to year. Some cities, especially suburban ones, grow through the steady addition of land via annexation. More rarely, but more sizably, some cities merge with their surrounding counties to form a new, and much larger jurisdiction. And importantly for time-series analysis, there are wide variations among cities in the contribution of mergers and annexations to growth: Many cities are land-locked (surrounded by other jurisdictions), or are in states where annexation is difficult and rare, while other cities–edge suburbs–are continually in the process of growing via annexation.

Calculating employment growth rates without making allowance for the effects of boundary changes makes raw city growth data a very poor basis for comparing the economic health of cities over time. But, unfortunately, that’s exactly what the Urban Institute report does. Measuring Inclusion highlights four cities for case studies, based in part on their strong economic growth, measured using city boundaries.  But three of these four case study cities–Louisville, Columbus, Ohio, and Midland, Texas, have all expanded significantly via annexation in the years covered by this study. As a result, we believe that it’s problematic to draw any lessons from these cities about the connections, or lack thereof, between economic success and inclusion, which is at the heart of the Urban Institute report.

Louisville was a job growth laggard, not a world-beater (once you correct for the merger)

To get a sense of how this can affect the results, take a close look at the City of Louisville, which merged with surrounding Jefferson County in 2003.  The UI report indicates that employment in the City of Louisville chalked up an extraordinary 150 percent job growth between 2000 and 2013.  As Urban Institute reports, this was the fastest increase in employment in any city in the US, which resulted in Louisville having the fourth highest ranking on economic performance of any of the 274 cities UI examined.

But this growth was driven entirely by the merger. Comparing employment totals for the old city limits in 2000 with the expanded county-wide totals in 2013 recasts the merger as economic growth. It’s actually quite easy to correct for this by comparing growth rates for a standard geography–Jefferson County–in both 2000 and 2013. Below, we show the wage and salary employment totals reported by the Bureau of Economic Analysis for Jefferson County from 2000 to 2013. Rather than increasing by 151 percent, employment in the county actually decreased by 2.1 percent (falling by about 10,000 jobs, from 470,000 in 2000 to 460,000 in 2013). Once we correct for this boundary change, Louisville’s supposed employment growth goes from 1st fastest to 241st (out of 274).

 

This also has a material effect on the city’s overall economic prosperity ranking. Rather than a positive z-score of 5.467, the corrected employment growth of -2.1 percent translates into a z-score of -0.788, which in turn lowers the city’s composite economic health score. The Urban Institute report ranks Louisville fourth overall based on a combined z-score for job growth, unemployment rate, vacancy rates and median family income. If we adjust just the job growth number to accurately reflect the decline in employment in Jefferson County, the overall economic health score falls from +1.29 to -0.273; this lowers Louisville’s economic health ranking from 4th highest to a sub-par 185th (out of 274 cities studied). This is important because the study classifies Louisville as having experienced a recovery between 2000 and 2013 (it actually declined), and then makes the city one of four success stories selected for further analysis in the report.
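As a rough consistency check on these figures, the mean and standard deviation of city job-growth rates implied by the report’s scoring can be back-solved from the two z-scores quoted above. (This is our inference from the published numbers, not a parameter the Urban Institute reports.)

```python
# Back-solving the implied distribution of city job-growth rates from the
# two (growth rate, z-score) pairs for Louisville quoted in the text.

raw = (1.51, 5.467)      # unadjusted: +151% growth, z = 5.467
adj = (-0.021, -0.788)   # adjusted:   -2.1% growth, z = -0.788

# z = (x - mean) / sd  =>  two linear equations in mean and sd
sd = (raw[0] - adj[0]) / (raw[1] - adj[1])
mean = raw[0] - raw[1] * sd

def z_score(x):
    return (x - mean) / sd

print(round(mean, 3), round(sd, 3))   # implied mean ~0.172, sd ~0.245
print(round(z_score(-0.021), 3))      # recovers -0.788
```

In other words, the quoted z-scores imply average city job growth of roughly 17 percent over the period, with a standard deviation of about 24 percentage points, against which Louisville’s corrected -2.1 percent is clearly sub-par.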

Annexations helped drive job growth in other case study cities

While Louisville’s city-county merger is the most dramatic example of how boundary changes can distort the meaningfulness and utility of raw employment growth rates, it’s clear that this affects a number of other cities as well. Two of the other three case study cities selected as examples of successful economic growth–Midland, Texas and Columbus, Ohio–have had growth fueled by annexation. Midland, Texas has regularly annexed growing areas adjacent to its city boundaries. The city’s annexation map shows a patchwork of annexations.

Columbus, Ohio, is another city that has expanded substantially by annexations. Since 2000, the city has added more than 8,500 acres through 400 annexations, expanding the total size of the city by about 6 percent. It’s much more difficult to tease out the contribution of annexations to job growth, but importantly, because some cities are annexing aggressively, while others aren’t, un-adjusted job data provide a poor basis for making cross-sectional comparisons.

In sum, municipal boundaries are a poor choice for assessing economic performance, both because cities are incommensurable units and because, as this brief examination shows, annexations and mergers can easily distort growth statistics. As a result, the economic performance statistics reported in the Urban Institute report Measuring Inclusion don’t provide a reliable or accurate indication of which cities are economically successful, and consequently don’t provide any reasonable basis for drawing conclusions about the connections between racial or economic inclusivity and economic performance.

Why inclusive is so elusive, Part 2: The limits of city limits

Part 2. Are city boundaries the right way to measure inclusion?

Municipal boundaries produce a myopic and distorted view of inclusion; the boundaries themselves were often drawn to create exclusion

(Editor’s note: This is the second in a five-part series examining a recent Urban Institute report that attempts to measure and rank the inclusiveness of U.S. cities. Be sure to read Part I, here, for a description of that report, and an overview of the issues it raises).

The Urban Institute report “Measuring Inclusion” uses data for the nation’s cities, i.e. following municipal boundaries, to define and measure economic and racial inclusion and economic performance. In our view this is a critical, and problematic, choice, especially if we want to compare and rank different cities on these measures, which is the central premise of the Urban Institute report. For very good reason, the standard for most scholarship looking at issues of segregation, for example, is to look at entire metropolitan areas, because they are defined in a relatively consistent fashion.

Cities are incommensurable units, especially when it comes to describing their inclusiveness and economic performance. Some cities are the core of an urban area, while others are largely or entirely suburban. So for example, Atlanta and Austin are both cities, but so, too are Overland Park, Kansas, Scottsdale, Arizona, and Bellevue, Washington (all much wealthier and whiter suburban cities than the central cities in their metro areas). Some cities are isolated and rural while others are a small part of a much larger metropolitan area. Older central cities tend to be landlocked on all sides by surrounding suburbs.

A big part of the problem is that city boundaries are just the wrong units for assessing whether a region is achieving inclusive outcomes. We know that many cities–especially suburban ones–were formed and had their boundaries drawn specifically to achieve exclusionary ends. They include some people and exclude others. Looking only within city boundaries to assess inclusion means that you are effectively blinded to the way these boundaries themselves are a cause or contributor to racial and economic exclusion.

While much of the race and class gerrymandering of municipal boundaries is a settled historic fact, it remains a current topic. For example, consider the drama currently being played out in suburban Atlanta, where the wealthy neighborhood of Eagle’s Landing is looking to hive off into a separate city from Stockbridge, of which it is currently a part. Incomes in Eagle’s Landing average $126,000, almost double the rest of the current city of Stockbridge.

The Urban Institute’s measures look only at the racial and economic composition of a city, and not at all at the larger metro area in which it may be situated. Segregation indexes are constructed so that they aren’t influenced by variations in the racial/ethnic composition of different jurisdictions.

Metro areas, particularly when confined to specific size ranges, make better (though not perfect) units for comparison.  Their boundaries are set using a common rubric (tied to the commuting patterns of workers and the size of the labor market), and so they encompass entire functional regional economic units, rather than varied fragments of a regional economy.

A key part of the trouble with cities is that people are sorted among cities within a metropolitan area, not randomly, but by a combination of economic and political factors and personal choices. If rich people opt out of distressed central city neighborhoods, and only poor people remain, the measured inclusion of the central city increases, according to the Urban Institute methodology, because the remaining residents are statistically more equal.

The bottom line is that using city boundaries as a lens for measuring inclusion will almost certainly produce a distorted picture. The pattern of city boundaries is an intrinsic aspect of the way exclusion is achieved and maintained in the US, and it is the sorting of people of different races, ethnicities and incomes into a region’s different cities that is a principal manifestation of exclusion and segregation. Municipal measures blind us to these fundamental facts.

The Urban Institute gets inclusion backwards, again

The Urban Institute has released an updated set of estimates that purport to measure which US cities are the most inclusive.  The report is conceptually flawed, and actually gets its conclusions backwards, classifying some of the nation’s most exclusive places as “inclusive.” 

We all want our cities to be more inclusive. While it’s an agreed upon goal, measuring inclusiveness turns out to be difficult and complicated. A set of measures developed by the Urban Institute in 2018, and updated late last year, unfortunately paints a picture of inclusion that is misleading and incorrectly portrays wealthy suburban enclaves as inclusive. In this commentary, we present our analysis of the problems with the Urban Institute’s measures of inclusion, published at its website “Measuring Inclusion in America’s Cities.”

If we’re going to make real progress in addressing community inequities in the US, we need a system for defining and measuring “inclusiveness”: so we can identify those places that are doing well, and which most need to improve; so we can learn from more successful places about what policies and practices support inclusion; and so everyone can measure their progress toward greater inclusion over time. Absent such definitions and metrics, it’s difficult to figure out what works, and what doesn’t. While well intended, the metrics offered by the Urban Institute are actually a step in the wrong direction, obscuring our understanding of which places are most inclusive, and what policies it would take to expand inclusion.

Two years ago, when this ranking system was first published, we released a five-part review and critique of its approach, definitions and conclusions.  We found:

  • The Urban Institute’s definitions rest on the conceptually flawed idea that cities that are more homogenous are more inclusive; in fact, homogeneity is usually a sign of exclusion, not inclusion. This leads the report to get inclusion backwards and classify some of the nation’s most exclusive suburban enclaves as “inclusive.”
  • The Urban Institute methodology ignores metropolitan context, treating racially homogenous cities in segregated metro areas as “inclusive.” For example, both the city of Detroit and several of its suburbs are rated as highly inclusive, even though the metropolitan area is among the nation’s most racially and economically segregated.
  • The Urban Institute methodology classifies a series of high income suburban cities with restrictive zoning policies as inclusive. Urban Institute claims that the nation’s most “inclusive cities” are a series of high income enclaves in the nation’s large metro areas, such as Naperville, IL, Bellevue, WA, Plano, TX and Sunnyvale, CA; all places with median family incomes of over $100,000.
  • City boundaries are an inappropriate and misleading basis for estimating inclusiveness; because of their widely varying size and composition, municipalities are a poor choice for computing and comparing equitable outcomes.
  • City boundaries change over time; The Urban Institute did not adjust its estimates over time to account for annexations and mergers, treating growth through annexation as an indicator of economic prosperity.

If you take the Urban Institute report literally—which you shouldn’t—you will believe that the nation’s most inclusive communities are mostly a bunch of the toniest suburbs in the nation’s large metro areas—Sunnyvale, CA, Plano, TX, Bellevue, WA, Torrance, CA, Naperville, IL and Carlsbad, CA—all places with median family incomes in excess of $100,000 per year.  Here we’ve taken the Urban Institute’s latest data, which ranks cities from most inclusive to least inclusive.  Median family incomes are shown for each city.

The problem is that instead of measuring inclusivity, UI’s metrics actually measure homogeneity. They look only at the demographics within a city’s boundaries, and rate a place highly if all its residents have similar levels of income, poverty, employment and homeownership. It’s true that there are very small differences in homeownership, employment, and poverty by race in many expensive suburbs, but that’s because their high home prices exclude low income people generally, and assure that if people of color live there, they too have high incomes, high rates of homeownership, and low rates of poverty and unemployment.

Similarly, that’s also why some very economically distressed places get rated highly for inclusion—if all or nearly all residents are equally poor, their measured disparities are small. But while these places are homogenous, they’re not inclusive.  In fact, just the opposite.  High income suburbs exclude by having extremely high housing costs, and those high housing costs are created and enforced by local land use policies, especially single family zoning. Low income cities exclude by having too few amenities or opportunities to retain households with high incomes:  When middle and upper income households move away, a neighborhood of concentrated poverty is more homogenous, and by the Urban Institute formulation, “more inclusive.”  We think that’s just backwards.

To its credit, the Urban Institute makes all of the data and formulas it used for ranking cities easily available. They present detailed city level data on population, income, employment, poverty, racial and economic segregation, and other key demographic indicators. But this is a case of negative synergy: the end product (the composite inclusiveness measure computed by the Urban Institute) is less informative than the sum of its parts (all of the individual data points). The ranking system actually conceals and misrepresents the inclusiveness (or exclusiveness) of cities, rather than providing useful and accurate information.

As a consequence, the report is a poor guide, both to who’s doing well in achieving inclusion, and what cities (and policies) might be thought of as means to advance inclusion.

As we’ve stressed at City Observatory, building more inclusive metropolitan areas, and in particular, reducing the amount of racial/ethnic and economic segregation is critical to building a more equitable nation.  Unfortunately, the metrics and conclusions offered in this report about which places are inclusive are simply wrong, and consequently provide no useful guidance to local or national leaders about the way forward.

Just because you stick that sign on it doesn’t mean it’s inclusive.

The Urban Institute Report:  Measuring Inclusion

In 2018, the Urban Institute released its report, “Measuring Inclusion in America’s Cities.” The study presented a series of metrics of the racial disparities, income disparities and economic performance of the nation’s 274 largest cities, and examined how these have changed over time. It aims to benchmark where cities stand on inclusion, understand how inclusion relates to economic growth, and provide lessons for policy makers.

We’ve taken a close look at the report, and while well-documented and certainly well-intended, we’re concerned that some of the metrics it offers and some of the findings it presents make an already tortuously difficult policy area even more confusing.

The paradoxical relationship between inclusion and equality 

Paradoxically, at the very local level equality is fundamentally at odds with inclusion

On its face, it seems like defining inclusion would be simple. It ought to be the absence of disparities in a community. But the reality is much more complicated. Measured economic disparities in a city can be very small if, for example, everyone is rich or everyone is poor.

Urban Institute’s report measures two different dimensions of inclusion, racial inclusion and economic inclusion. In the case of racial inclusion, they look at segregation (whether the white and non-white residents of a particular city live in different or similar neighborhoods), and whether there are disparities in educational attainment and home-ownership between whites and non-whites. They also look at the total fraction of the population of a city that is non-white.

In almost every case, the Urban Institute report defines inter-group disparities as an indicator of a lack of inclusion. If, for example, homeownership rates differ greatly between persons of color and whites, that suggests an area isn’t inclusive.

With each of these measures, cities score highest if they have very little inter-group variation. If white and non-white incomes, unemployment rates, and homeownership rates are very similar, the Urban Institute defines a city as “inclusive.” For example, if everyone in a community is high income, regardless of race or ethnicity, then it is “inclusive.” Similarly, if everyone in a community is low income, regardless of race or ethnicity, then it is also “inclusive.”

There’s some merit to this idea, but it’s really focusing on “equality” rather than “inclusion.”  It’s entirely possible for a community to be “equal” without at all being inclusive.  In fact, communities that are exclusive tend to be highly equal.  If, for example, you have a community where all housing costs at least $500,000 and there are no apartments, it’s unlikely that you will have many poor, or unemployed, or renters. And it may be that there are very limited economic disparities between residents of different racial and ethnic groups (everyone who can afford to live in such a community, regardless of race or ethnicity, almost certainly has a high income).

Conversely, very inclusive places, where people of widely varying incomes reside, almost by definition have high levels of inequality. A community composed of equal measures rich and poor, homeowners and renters, and with a wide variety of housing sizes and types, including mansions and public housing, will have larger measured disparities in incomes, in homeownership rates, in poverty and so on. It’s a seeming paradox: a place that is truly diverse and inclusive, whether measured by income or race and ethnicity, by definition needs to be unequal.

This is not the first time this issue has arisen: We’ve pointed out that applying a broad standard of equality to small geographic areas produces a kind of weird parallax effect: places that have low levels of measured income disparity tend to be homogenous (usually all rich, or all poor), meaning that they are either exclusive or failing. The geography of inequality is anti-fractal: high levels of measured inequality at small geographies mean exactly the opposite of what they mean at large geographies.

The bottom line is that a truly inclusive place may in fact have high levels of measured disparity. As a result, there’s an important conceptual flaw in metrics that focus solely on the localized presence of disparities to discuss inclusion.

To summarize:

  • Cities don’t generate income distributions among their populations, so much as they include (or exclude) different income groups. City inequality is not a linear microcosm of national income inequality.
  • Highly equal cities are almost always either exclusive suburban enclaves (that achieve homogeneity by rigid zoning limits that exclude the poor) or impoverished cities that have been abandoned by upper and middle income households, leaving them homogeneous but poor. (For example, many of the Urban Institute’s “most inclusive cities” are suburbs like Bellevue, Naperville and Santa Clara–among the wealthiest 20 cities in the US–while Detroit and Cleveland are also highly ranked for inclusiveness.)
  • At small geographies, neighborhoods and cities that have high levels of measured income inequality (90/10 ratio, Gini index) are generally much more inclusive than comparable geographies with lower levels of measured inequality. Rich and poor living close together produces more measured inequality, but also means greater inclusion.
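A minimal numeric sketch of the first and third bullets, using entirely hypothetical incomes, makes the paradox concrete:

```python
# Three hypothetical four-household communities: a uniformly rich enclave,
# a uniformly poor city, and an income-diverse ("inclusive") city.

enclave = [150_000, 160_000, 170_000, 180_000]   # exclusive suburb
poor_city = [15_000, 16_000, 17_000, 18_000]     # disinvested city
mixed = [15_000, 60_000, 110_000, 180_000]       # income-diverse city

def ratio_high_low(incomes):
    """Crude dispersion measure: highest income divided by lowest."""
    return max(incomes) / min(incomes)

for name, pop in [("enclave", enclave), ("poor city", poor_city),
                  ("mixed", mixed)]:
    print(name, round(ratio_high_low(pop), 1))
```

The two homogeneous places each score about 1.2–statistically very “equal”–while the mixed-income place scores 12. A metric that rewards small measured disparities ranks the enclave and the disinvested city ahead of the only community that actually houses people across the income spectrum.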

We want to stress that we have enormous respect for the researchers at the Urban Institute: over the years we’ve learned tremendously from their work. We and everyone who cares about cities owe a huge debt for their scholarship and advocacy. They’ve provided powerful evidence of the huge economic and human toll of continuing racial and economic disparities in their report “The Cost of Segregation.” Our own report on concentrated poverty, Lost in Place, draws liberally from the canonical work by Urban Institute scholars. The critique presented here is meant to advance our shared understanding of how we can build more inclusive cities. We conferred with the authors of the report in 2018 after it was released and shared these concerns with them. The analysis presented here remains solely the responsibility of City Observatory.

Why inclusive is so elusive, Part I

Inclusiveness is a worthy policy goal, but in practice turns out to be devilishly hard to measure. A recent report from the Urban Institute shows some of the pitfalls: looking just within city boundaries ignores metropolitan context and gives a distorted picture of which places are inclusive.

(Editor’s note: Over the next several days, City Observatory will be taking an in-depth look at a recent report from the Urban Institute that attempts to measure and rank the inclusiveness of US cities. “Inclusion” has become one of the most popular buzzwords in the urban realm right now. While the term seems simple, and there is broad agreement over its merit–the idea of reducing unwarranted inter-group disparities–the practical business of defining and measuring it is not at all simple.)

Achieving a more inclusive nation, more inclusive metropolitan areas, more inclusive cities, and more inclusive neighborhoods is a critical national priority. Fifty years after the Fair Housing Act, America remains deeply divided by race, and there’s strong evidence that we’ve become more segregated by income. As we’ve stressed at City Observatory, concentrated poverty makes all of the pathologies of poverty worse. But this is not a simple problem, and it is difficult and potentially misleading to characterize with seemingly obvious statistics relating inter-group disparities within cities. We think a recent Urban Institute report, Measuring Inclusion in America’s Cities, while well-intentioned, falls prey to some of the complexities of describing the nature of segregation and separation.

We want to stress that we have enormous respect for the researchers at the Urban Institute: over the years we’ve learned tremendously from their work. We and everyone who cares about cities owe a huge debt for their scholarship and advocacy. They’ve provided powerful evidence of the huge economic and human toll of continuing racial and economic disparities in their report “The Cost of Segregation.” Our own report on concentrated poverty, Lost in Place, draws liberally from the canonical work by the Urban Institute. The critique presented here is meant to advance our shared understanding of how we can build more inclusive cities. We conferred with the authors of the report earlier this year and shared these concerns with them. The analysis presented here remains solely the responsibility of City Observatory.

Our analysis is divided into five parts. First, we address some of the broad conceptual issues in defining inclusiveness, and explore why disparities and inequality, particularly when measured for small geographies, defy easy interpretation. Part one is presented in this commentary; future commentaries will address the other four issues. Second, we take a close look at the limits of city boundaries as a frame for measuring inclusion, arguing that the varied and fragmentary geography of cities is a poor lens for understanding actual disparities. Third, we explore a particular problem of using cities to measure change over time, showing that annexations and boundary changes easily confound measurement and comparisons. Fourth, we question whether racially homogenous cities in a deeply segregated metropolitan area can be regarded as meaningfully inclusive. Fifth, we ask whether it’s appropriate to rank some of the nation’s wealthiest suburbs as its “most inclusive” communities.

This post introduces the Urban Institute report, and discusses the difficult and often paradoxical connections between city inequality, diversity and inclusion.

 

If there’s a mantra in urban economic development, it’s all about inclusivity.

The Urban Institute Report:  Measuring Inclusion

Earlier this year, the Urban Institute released a new report, “Measuring Inclusion in America’s Cities.” The study develops a series of metrics of racial disparities, income disparities and economic performance for the nation’s 274 largest cities, and examines how these have changed over time. It aims to benchmark where cities stand on inclusion, understand how inclusion relates to economic growth, and provide lessons for policy makers.

The report attracted predictable attention.  CityLab‘s article “America’s Most Inclusive Cities, Mapped,” called the report a “roadmap for a deliberate effort to mitigate the forces that have created unequal communities.”  Next City’s story, “Why it matters who gets to shape a city’s economy,” reported that Urban Institute’s researchers had identified four cities–Louisville, Lowell, Midland and Columbus, OH–that had experienced “inclusive recoveries,” i.e., grown their economies and made progress on measures of racial and economic inclusion. As CityLab related:

 The ten cities faring the best on the inclusion metrics in 2013 were also flourishing economically. “There is a strong relationship between the economic health of a city and a city’s ability to support inclusion for its residents,” the authors write in the report.

We’ve taken a close look at the report, and while well-documented and certainly well-intended, we’re concerned that some of the metrics it offers and some of the findings it presents make an already tortuously difficult policy area even more confusing.

Part 1. What constitutes “inclusion?”

Paradoxically, at the very local level equality is fundamentally at odds with inclusion

On its face, it seems like defining inclusion would be simple. It ought to be the absence of disparities in a community. But the reality is much more complicated. Measured economic disparities in a city can be very small if, for example, everyone is rich or everyone is poor.

Urban Institute’s report measures two different dimensions of inclusion, racial inclusion and economic inclusion. In the case of racial inclusion, they look at segregation (whether the white and non-white residents of a particular city live in different or similar neighborhoods), and whether there are disparities in educational attainment and home-ownership between whites and non-whites. They also look at the total fraction of the population of a city that is non-white.

In almost every case, the Urban Institute report defines inter-group disparities as an indicator of a lack of inclusion. If, for example, homeownership rates differ greatly between persons of color and whites, that suggests an area isn’t inclusive.

With each of these measures, cities score highest if they have very little inter-group variation.  If white and non-white incomes, unemployment rates, and homeownership rates are very similar, the Urban Institute defines a city as “inclusive.” For example, if everyone in a community is high income, regardless of race or ethnicity, then it is “inclusive.”  Similarly, if everyone in a community is low income, regardless of race or ethnicity, then it is also “inclusive.”

Equality is not the same as inclusivity

There’s some merit to this idea, but it’s really focusing on “equality” rather than “inclusion.”  It’s entirely possible for a community to be “equal” without at all being inclusive.  In fact, communities that are exclusive tend to be highly equal.  If, for example, you have a community where all housing costs at least $500,000 and there are no apartments, it’s unlikely that you will have many poor, or unemployed, or renters. And it may be that there are very limited economic disparities between residents of different racial and ethnic groups (everyone who can afford to live in such a community, regardless of race or ethnicity, almost certainly has a high income).

Conversely, very inclusive places, where people of widely varying incomes reside, almost by definition have high levels of inequality. A community composed of equal measures rich and poor, homeowners and renters, and with a wide variety of housing sizes and types, including mansions and public housing, will have larger measured disparities in incomes, in homeownership rates, in poverty and so on. It’s a seeming paradox: a place that is truly diverse and inclusive, whether measured by income or race and ethnicity, by definition needs to be unequal.
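This paradox is easy to see with a little arithmetic. The following sketch uses purely hypothetical household incomes (not data from the Urban Institute report) and a simple 80th-to-20th-percentile ratio as a stand-in for any disparity metric:

```python
# Hypothetical, illustrative incomes (in thousands of dollars) for three
# stylized communities; these figures are invented for the example.
all_rich = [250, 260, 240, 255, 245]   # an exclusive enclave
all_poor = [22, 25, 20, 24, 23]        # a uniformly poor area
mixed    = [22, 250, 25, 260, 24, 240] # a genuinely mixed-income place

def disparity_ratio(incomes):
    """Ratio of the 80th- to the 20th-percentile income:
    a crude inequality measure (higher = more measured disparity)."""
    s = sorted(incomes)
    p20 = s[int(0.2 * (len(s) - 1))]
    p80 = s[int(0.8 * (len(s) - 1))]
    return p80 / p20

for name, incomes in [("all rich", all_rich),
                      ("all poor", all_poor),
                      ("mixed", mixed)]:
    print(name, round(disparity_ratio(incomes), 2))
```

The two homogenous communities score as nearly perfectly “equal” (ratios close to 1), while the mixed community, the only one anyone would call inclusive, shows by far the largest measured disparity.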

This is not the first time this issue has arisen: We’ve pointed out that applying a broad standard of equality to small geographic areas produces a kind of weird parallax effect: places that have low levels of measured income disparity tend to be homogenous (usually all rich, or all poor), meaning that they are either exclusive or failing. We wrestled with this issue in our own recent report, America’s Most Diverse, Mixed Income Neighborhoods: rather than labeling neighborhoods racially or economically “inclusive,” we opted for neutral, statistical language–“diverse” in the case of racial/ethnic characteristics, and “mixed income” in the case of economic characteristics.

The bottom line is that a truly inclusive place may in fact have high levels of measured disparity. As a result, there’s an important conceptual flaw in metrics that focus solely on the localized presence of disparities to discuss inclusion.  We think that’s a key reason to question the usefulness of the Urban Institute’s metrics. We’ll explore how this conceptual problem plays out across metropolitan areas, in central cities, and in suburbs, in the next posts in this series.

Does increased housing supply improve affordability?

The study seems like ready-made evidence for the housing supply skeptics who maintain that building more market rate housing will do little or nothing to solve our affordability problems. 

Is St. Louis Gentrifying?

Gentrification Debates Without Gentrification?

By Todd Swanstrom

Editor’s note: We’re pleased to offer a guest commentary from Todd Swanstrom. Todd is the Des Lee Professor of Community Collaboration and Public Policy Administration at the University of Missouri – St. Louis.  He is also co-author of Place Matters:  Metropolitics for the Twenty-First Century (http://www.kansaspress.ku.edu/drepl3.html).

Recent research has provided evidence for the legend that Eskimos have more than fifty words for snow. Unfortunately, we seem to have only one word for economically improving neighborhoods: gentrification. Our profound lack of linguistic nuance is crippling our ability to talk about what kind of future we want to see for urban neighborhoods.

In the early 2000s, I taught courses in urban planning at Saint Louis University. The City of St. Louis is the #1 shrinking city in the world – having lost more than half a million people in the last half of the Twentieth Century (from 856,796 in 1950 to 348,189 in 2000).   For the first time, however, I began to notice that a few neighborhoods were coming back and actually gaining population and investment. I would bring these up in class. Invariably, a student would chime in, “But professor, that’s gentrification!” – bursting my bubble.

I now have a position at the University of Missouri, St. Louis where I spend a great deal of my time working with a coalition of community development nonprofits. I continue to hear anxious cries of “gentrification!” This time, they are directed not at rebounding neighborhoods but at poor African American neighborhoods north of the infamous “Delmar Divide”.

Historically, the biggest divide in the St. Louis region has been east-west, between the central city and its suburbs – with communities becoming progressively wealthier and whiter as you move west out of downtown. Increasingly, however, the biggest divide in the region runs north and south. In the era of Jim Crow, a tangle of public laws and private practices confined African Americans to neighborhoods north of Delmar Boulevard (see map). To this day, the areas north of Delmar tend to be African American and lower income. This now includes North County suburbs like Ferguson. All of the neighborhoods identified as racially and ethnically diverse (yellow) or mixed income (blue) in City Observatory’s recent report on the subject are South of Delmar.

Despite a weak housing market, many activists have warned about gentrification around the new $1.75 billion National Geospatial Agency (NGA) headquarters that is under construction in North St. Louis. When completed it will employ over 3,000 highly skilled professionals with incomes averaging over $90,000 a year. Even though there is little evidence yet that NGA scientists will choose to live in the area, many people are warning about gentrification.  In most of the neighborhoods around the new NGA headquarters, the housing market has collapsed. If you go to Zillow.com, you will find that there are almost no houses for sale in the immediate area around NGA, and the few that are often sell for less than $50,000. If young professionals began moving in, there would be plenty of room for them in the vast tracts of vacant land around NGA (see Google Earth image of the area around NGA).

Google Earth: New NGA Headquarters (in yellow)

By contrast, the Central Corridor is booming with growth in medical, biotech, and various tech start-ups. My research on neighborhood change in St. Louis documents that there are, indeed, what I call “gentrification-like” processes going on. Young professionals who work in the Central Corridor are moving into the Central Corridor and nearby neighborhoods to the south.

This trend, however, does not fit the classic definition of gentrification — understood as high-income households moving into poor, minority neighborhoods and rapidly displacing long-time residents. First, the neighborhoods undergoing “gentrification-like” processes are not poor minority neighborhoods. Gentrification is almost completely absent north of Delmar. “Gentrification-like processes” are concentrated in neighborhoods in and just south of the Central Corridor that have always done relatively well.

Second, there is little evidence that “gentrification” is creating economic pressures that are rapidly pushing out large numbers of low-income residents. Indeed, our data shows that if you take out the citywide increases in rents, the rent burden actually fell in many of the “gentrifying” neighborhoods between 2000 and 2016. In 2016, the median rent in these communities was generally affordable to households making considerably less than the median income for the metropolitan area.

These neighborhoods have not become enclaves of rich white people. As the map shows, many of the most racially and economically diverse neighborhoods in the city are “gentrifying” neighborhoods in or just south of the Central Corridor. Indeed, according to a recent City Observatory report, the only census tract in the entire region that is both economically and racially diverse is Tower Grove East (seen in green on the map), a neighborhood just east of Grand Avenue that has witnessed an influx of young professionals.

Clearly, if I presented these findings to the activists in St. Louis who warn about gentrification, it would have absolutely no effect – and for good reason. Those who are worked up about gentrification in St. Louis don’t have some kind of conceptual or empirical confusion that social science research needs to straighten out. People are worried about gentrification for a reason. It is based on their lived experience.

In effect, “gentrification” has become the “g-word”–a kind of default term to express people’s anxieties about powerlessness and widening economic and spatial inequalities. Affordable housing is an issue in St. Louis, but the cause is not so much gentrification as stagnating wages and a private rental market that is not capable of providing decent housing for those at the bottom of the income scale. This is true everywhere – not just in St. Louis.

The forces behind these growing rifts are mostly impersonal and invisible, but neighborhood change, when it happens, is not. Not surprisingly, people focus on the things that they can see with their own eyes. Like many places, St. Louis is becoming a more divided city. Wages are stagnating and rents are rising across the region. Between 2000 and 2016, the median income of renters in St. Louis fell by more than 8 percent after controlling for inflation. The median income of homeowners rose slightly. (Calculated by Alan Mallach using data from U.S. Census, 1999 income, and American Community Survey data, 2012-2016). The Central Corridor is booming while the North Side continues to decline. People project these broader processes of division onto specific neighborhoods.

For the black community, concerns about displacement have a real basis in history. In the 1950s and 1960s, urban renewal and highway building forcibly displaced tens of thousands of African Americans. “Gentrification” is a shout out by people who feel they have little control over their lives and their neighborhoods.

Research shows that in cities across the nation, many more people live in neighborhoods that are going from low poverty to high poverty than from high poverty to low poverty (what we normally think of as gentrification). Especially in older industrial cities like St. Louis, the most pressing problem is not that higher income households are moving toward the poor and pushing them out – but that they continue to move away from the poor, leaving behind communities bereft of opportunities and resources.

We need a more nuanced vocabulary to discuss economically ascending neighborhoods. Right now, we have many neighborhoods experiencing “gentrification-like processes” but not exhibiting the massive displacement pressures roiling cities like Seattle, San Francisco, and New York. In St. Louis, over 90 percent of those who work in the Central Corridor live outside the Central Corridor. I wish many more of them lived in the Central Corridor or in other city neighborhoods. Of course, at some point in the future, this could happen and rents would then rise exponentially causing massive displacement. For this reason, we should act now, while land prices are still low, to preserve an adequate supply of affordable housing through land trusts, nonprofit-owned housing, and other methods.

Today, however, the big disruptive challenge facing older industrial cities like St. Louis is not gentrification but depopulation and disinvestment – not re-urbanization but de-urbanization. Contagious abandonment and the decline of solid working and middle-class neighborhoods are the most pressing issues facing St. Louis – not gentrification. At the same time, we cannot dismiss fears of gentrification. Ultimately, the only way to dispel those fears is to help city residents acquire political power, access to jobs in the new economy, and control over their own communities.

Editor’s note: This post has been revised to correct a reference to the home locations of those who work in the Central Corridor. (August 16).


Whither small towns? Wither small towns?

Rural and small town America faces some tough odds

In an article entitled “How to save the Troubled American Heartland,” Bloomberg’s very smart Noah Smith shares his thoughts on how to revive the smaller towns of rural America. For the most part, he’s in agreement with the ideas expressed by James and Deborah Fallows in their neo-Tocquevillean travelogue, Our Towns: A 100,000-Mile Journey into the Heart of America. The Fallowses flew their small plane hither and yon across the nation, landing in the places that the pundits fly over, and trying to glean signs of hope and lessons for success.

They visit towns such as Greenville, SC, and Allentown, PA and report that there are signs of life. Local leaders are valiantly working to revive their economies. And at least a few places are generating jobs, income and economic revival.

The key lessons, as suggested by Smith:  local universities can train kids to succeed in the new economy and help create new jobs. In addition, communities ought to encourage immigration and look to form public-private partnerships.  These are all plausible suggestions as to what small towns and rural areas ought to try. The problem with extracting “best practice” lessons from these relatively successful places is that it doesn’t answer the question of whether the strategy is replicable in different places or scales to all of rural America.

While essays like Smith’s and books like the Fallowses’ treat the issue with a broad brush, the question is not what happens to the entire “heartland.” Some places, even rural, isolated ones, will flourish, even if rural America as a whole is challenged or in outright decline. The bigger question is whether this is an exercise in sweeping rejuvenation, or something more like battlefield triage. The unstated objective of most rural economic development efforts is to restore some real (or imagined) status quo ante, recapitulating a pattern of population and economic activity that existed in some prior year (usually an historic peak).

You can point to some rural and smaller metro places that are doing well. Smith presents this optimistic-looking chart showing five smaller communities that have shown real growth since 2000.

But when we look at the entire nation, grouped by population size, it’s apparent that big cities are powering national growth, and that as a group, small town and rural America is getting left further behind. As we’ve argued at City Observatory, much of this has to do with the strong connection between cities, talent, productivity and knowledge based industries. Smart people and the industries that employ them are increasingly gravitating toward urban locations, and are more successful there than in rural ones.

As a group, the nation’s non-metropolitan areas had not fully recovered from the Great Recession as of September 2016, according to data compiled by the Oregon Office of Economic Analysis. Although larger metros experienced proportionately larger job losses in the recession, they’ve collectively grown faster in the recovery, with metros of a million or more population growing almost twice as fast as smaller ones.

There are some stellar examples of successful rural places. Usually they’ve parlayed a signature asset, like an outstanding university or great quality of life into the nucleus of a new economic engine. But these tend to be exceptions.

While this has worked and will continue to work in some rural communities, it’s far from a general prescription for success. The fact that a few rural communities succeed doesn’t mean that these tactics will work everywhere.  Noah Smith is reminded of the key role that Pike Powers played in organizing such a partnership in Austin.  But Austin was also the capital of a large state, an established tech center and home to a huge university. (By the way: every economic development effort I’ve ever seen, good, bad or wildly unsuccessful, bills itself as a public-private partnership.)

Much of the challenge in rural America is about a re-shaping of economic activity.  Key service functions are becoming more centralized as the somewhat larger towns emerge as the medical center, the retail center or the university town for a rural region. The only thing worse than having the WalMart in your town is having it in a neighboring town. There are winners and losers in the rural landscape as this process unfolds.

Having a university is a valuable asset, but essentially we’re not building any new ones, especially in small rural communities.  The same holds for having the region’s principal medical center. So if you’ve got one of these, great. If not, expect to see economic activity, kids, opportunity and entrepreneurship gravitate to the places that do.

What’s odd is that Smith’s telling overlooks some of the principal forces underpinning the economies of rural and small town America:  cheap housing, social security and farm subsidies. Even as urban areas become more congested and expensive, housing in rural places is affordable. The foundation of a growing number of rural economies is the steady flow of funds from Social Security and Medicare, both to a rural region’s retirees, and to other retirees they might attract from expensive cities. Medicare (and Medicaid) revenues come disproportionately from cities and are spent disproportionately in relatively rural areas (which are older and poorer). And while the federal government has nominally eschewed most industrial policies, it continues to prop up agriculture through a combination of farm subsidies and support for infrastructure projects, like irrigation and low cost power.  In effect, we do have a federal policy for rural America–even if we don’t acknowledge it as such. The likely cuts to these and other federal programs necessitated by the recent tax cuts and ballooning federal deficit are likely to have a disproportionate effect on rural America.

There will continue to be occasional bright spots in the rural economy, but in most places, it’s beyond their means to think about job creation on a scale that’s going to reverse the slow process of decline.

The limited allure of small towns

A few knowledge workers decamp to rural America as they age, but cities are the key

It’s an oft-told tale: talented professionals grow weary of the stress and high cost of city living, and decamp with their spouses, children and knowledge-based businesses to some rural hamlet. It’s a harbinger of the end of cities as we know them, because all these smart people can easily run their lives and enjoy greater elbow room and lower costs somewhere in small-town or rural America.

Green Acres is the place for you! (Obscure boomer reference)

The latest installment in this series is entitled “The Allure of Small Towns for Big City Freelancers: It’s harder for creative professionals to make a living in big cities. Many are looking elsewhere.” It comes from Slate‘s Rebecca Gale, who relates the tale of filmmaker Joel Levinson who moved his independent production business–responsible for the indie film “Boy Band”–from Los Angeles to Yellow Springs, Ohio (pop. 3,500).

According to the article, it’s an example of how the staggering cost of housing in large cities is leading more and more millennials to move away.

The story apparently caught a lot of people’s eyes, as it ranked as most read on Jeff Wood’s daily Overhead Wire news service (which we subscribe to, and you should, too).

So is it true? Are cities being emptied out as talented knowledge workers depart for cheaper locales?

Not so much.  Statistically, smart young people are even more concentrated in cities now than they have ever been. Even though the college attainment rate of young adults continues to increase (i.e. relatively more 25 to 34 year olds have BA degrees or higher than at any time previously), these folks are more, not less likely to live in cities.

There are lots of reasons why, but a critical one has to do with what economists call “human capital formation.”  If you want to find an interesting or challenging job, develop some skills and work experience, build a network of colleagues and contacts, build and burnish a reputation as a skilled and productive worker, not to mention find out what kind of work best suits your interests and aptitudes, there’s no better place to be than in a large city.

And that’s exactly what a wide range of economic literature shows:  highly educated young workers earn more, and see their pay grow faster when they live in cities.  They’re also more productive. So when you’re starting out, in your twenties and early thirties, you’re well advised to be in a city.

That was obviously true of filmmaker Levinson. According to Wikipedia, after attending George Washington University in 2002, he moved to Los Angeles to work as a comedy writer and film maker. He honed his skills and no doubt had more than a few lunches in Hollywood, building contacts. Joel was a big hit in producing original video content for YouTube (he bills himself as “the world’s first professional online video contest winner”), for which he was profiled on the front page of the NY Times and was a guest on the Tonight Show with Jay Leno. That success in hand, Joel was in a position to think about taking that accumulated capital and moving elsewhere. And because he had some successes, he could continue to draw on a network he’d established.

Take nothing away from Levinson:  He’s a creative filmmaker and writer, a successful businessman, and is genuinely original and funny. But his case, though striking, is a reminder of a couple of things. First, generating success and a reputation often requires paying your dues and building your skills and network in a big city (in film and entertainment, that usually means Los Angeles or New York). Second, you’ve got to maintain those networks. Though Joel is in Yellow Springs, Ohio, his brother and collaborator Stephen Levinson is in Brooklyn, New York, where he’s been a writer for, among other things, The Tonight Show.

As one gets older, and stops accumulating human capital at such a rapid rate, the relative advantage of being in a city may decline. That’s inevitably going to lead some people to put greater weight on space and costs than on being part of the intense buzz of an industry, like film-making. So some will leave cities for the suburbs, or in rarer cases, depart to the countryside.

But the big question is:  How many?

It’s always possible to come up with a single anecdote. Indeed, that appears to be all you need to write a story like this one.  This isn’t the first time that Levinson has been portrayed as the avatar of urban-to-rural emigration: His story was featured in a Newsweek article with a similar theme in 2016.

The fact that a single anecdote drives two different stories of the urban talent diaspora probably says as much about the herd-instinct of some journalists as it does about the extent of urban to rural migration. In the end, however, we ought to look closely at the details and the data. What they show is that young knowledge workers are gravitating toward urban centers, because that’s where they can quickly gain valuable skills. There’s little news in the fact that as some of these folks age that they move away.

More evidence of declining rents in Portland

Zillow’s data show Portland rents have dropped 3.5 percent in the past year

A couple of weeks ago, we published the latest data from ApartmentList.com on the decline in rents in the Portland metropolitan area.  Their benchmark series for one bedroom apartments showed a year-over-year decline of about 3 percent in Portland.

While we have a high regard for ApartmentList.com’s methodology (unlike some other sources), it’s always useful to look at a wide range of indicators to verify what’s happening in the market. Accordingly, we’ve taken a look at the Zillow Rent Index for multi-family housing for the City of Portland. Zillow tracks rents and home values across the nation, and produces monthly estimates of current market prices. Their methodology is specifically designed to avoid the composition effects that can bias other estimates of current rental rates. Here’s what their latest data show:

Year-over-year (from June 2017 to June 2018) apartment rents in the city of Portland, measured by the Zillow Rent Index declined 3.5 percent. That’s very close to ApartmentList.com’s estimate of a 3.1 percent decline for one-bedroom apartments.

 

According to Zillow’s historical time series, Portland apartment rents peaked in April 2016, and have declined a cumulative 9 percent since that time. The median rent peaked at about $1,660 but has since declined about $150 to $1,509. Zillow’s data confirms that Portland experienced double-digit rent increases in 2016, but the decline since then is consistent with economic theory (a growing supply of new apartments has blunted rent increases) and our observations (a flowering of “For Rent” signs in Portland neighborhoods).
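The arithmetic behind these figures is straightforward to check:

```python
# Check the rent figures reported above: peak vs. current median rent,
# using the dollar values from the Zillow series cited in the text.
peak_rent, current_rent = 1660, 1509  # dollars per month

dollar_drop = peak_rent - current_rent
pct_drop = dollar_drop / peak_rent

print(dollar_drop)               # 151 -> "about $150"
print(round(pct_drop * 100, 1))  # 9.1 -> "a cumulative 9 percent"
```

A $151 drop on a $1,660 peak works out to roughly 9 percent, matching both figures in the paragraph above.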

If you really care about housing affordability, the Zillow data (and the remarkably similar ApartmentList.com data) are what you ought to be paying attention to:  they show the rents Portlanders are paying in the current marketplace. These two independent data sources show that Portland’s housing is becoming more affordable. As we’ve explained at City Observatory, the declines we’re now seeing have a lot to do with the supply of new apartments finally starting to catch up with demand. If we want to see further improvement in housing affordability in Portland, we really ought to be looking for ways to encourage further increases in supply.

 

We disagree with the Washington Post about housing economics

Contrary to what you think you may have read in last week’s Washington Post, rental housing markets at all levels still conform to the laws of supply and demand

Monday’s Washington Post ran a provocative headline: “In expensive cities, rents fall for the rich–but rise for the poor.”  Citing data from one of the nation’s most authoritative real estate analytics firms, Zillow, it claims not only that rents are falling for the rich while rising for the poor, but also implies that building new apartments (chiefly at the high end) hasn’t done anything to help affordability.

It’s a classic “rich get richer, poor get poorer” story.

But in our view, it’s not quite right.

We follow these same data sources and housing markets closely, and what the article claims doesn’t square with our observations. That’s particularly true for the Portland market, which the Washington Post uses as a signal example of its claims.

As we explain below, in our view the story’s key claims can’t be supported by the data presented, and it presents a distorted and incomplete view of what’s happening in rental markets around the country.  When examined closely, it’s clear that increased supply–even at the high end of the market–is bringing down rental inflation in all segments of the rental marketplace.

For technical reasons, the Zillow data series presented in this story really can’t be used to convincingly answer the questions posed.  For example, the “three tier” data relied upon in the Washington Post story don’t include any of the nation’s apartments; in fact, they’re made up primarily of estimates of how much single family homes would rent for, if they were rented (which most of them aren’t). In addition, the three tier data is constructed in a way that is subject to composition effects (the composition of each tier changes over time as new housing is built, and this can bias upward estimates of inflation in the lowest tier if housing is added primarily in upper tiers.)
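This composition effect can be illustrated with a toy example (hypothetical rents, not Zillow's actual tiering method): if tiers are re-computed each period as thirds of the sorted stock, adding new high-end units shifts the tier boundaries, so the bottom tier's average can rise even when no individual unit's rent changes at all.

```python
# Toy illustration of composition bias in a tiered rent index.
def tier_means(rents):
    """Split sorted rents into three equal-size tiers (bottom, middle, top)
    and return each tier's mean rent."""
    s = sorted(rents)
    k = len(s) // 3
    tiers = (s[:k], s[k:2 * k], s[2 * k:])
    return [sum(t) / len(t) for t in tiers]

# Period 1: six existing units.
period1 = [1000, 1100, 1200, 1300, 1400, 1500]
# Period 2: every existing unit's rent is UNCHANGED, but three new
# high-end units come on line.
period2 = period1 + [1600, 1700, 1800]

bottom1 = tier_means(period1)[0]  # bottom-tier mean: 1050
bottom2 = tier_means(period2)[0]  # bottom-tier mean: 1100

# Apparent bottom-tier "inflation" despite zero actual rent changes:
print(round((bottom2 - bottom1) / bottom1 * 100, 1))  # 4.8
```

The bottom tier appears to show nearly 5 percent rent inflation purely because new construction at the top redrew the tier boundaries, which is the bias described above.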

Looking more closely at Portland, the city profiled in the Washington Post story, we find that there’s actually a very strong correlation between changes in rents in the high, medium and low tiers of the market. (Conversations with Zillow staff suggest that the connections between the high and low end of the market are actually especially strong in Portland). What that means is that rent inflation has closely followed the same path in all market segments. The Post story makes much of the fact that rents for the least expensive housing tier rose 42 percent from 2011 to 2018; but it doesn’t tell readers that in the same time period rents for the middle tier rose 40 percent, and rents for the highest tier rose 33 percent.  Rents have gone up for everyone.

More importantly, as new housing supply has come on line in the past two years, rental inflation has declined sharply in every segment of the Portland housing market, according to Zillow’s data (again, the series that leaves out apartments).  As we reported earlier at City Observatory, Zillow’s separate apartment data (not stratified by price tier) show a 3.5 percent decline in apartment rents in the City of Portland since June 2017.

Taken as a whole, Zillow’s data shows housing markets, particularly Portland’s, working in much the way that standard economic theory would suggest.  As new supply comes on line, rental inflation abates. And while increases may initially be stronger and more pronounced in one segment (like high priced dwellings), the prices and rents in different segments of the market tend to move in parallel to one another.  And the observation that rental inflation falls first and fastest in the market segment in which new units are being completed is fully consistent with economic theory.

In our view, the Washington Post’s story would have been much more accurate if it had said the following:

  • Estimated rents for housing (excluding apartments) are up across the board in Portland since 2011, and may have risen somewhat more for low and middle price tiers than for the highest price tier.
  • Monthly year-over-year price changes in the high, middle and low tiers of the Portland housing market are strongly correlated, suggesting that prices changes in one segment of the market affect other segments and/or that all segments are affected by common factors.
  • Over the past two years, rental price inflation in Portland has decelerated sharply in all segments of the housing market as large amounts of new housing, especially apartments, have been completed.
  • City of Portland apartment rents have now declined 3.5 percent over the past 12 months.

We’re grateful to Washington Post reporter Jeffrey Stein for answering our questions about his story, and to Zillow’s Matt Kreamer and Aaron Terrazas for engaging in a detailed discussion of the construction and interpretation of Zillow’s data (which we regard as some of the best available). The opinions expressed here are ours, not theirs.

The limits of Zillow data:  “All Homes/Three Tiers” omits apartments

Zillow gathers and publishes lots of data about housing, but the particular data series that the Washington Post used for its analysis leaves out estimates of the rents for apartments. As a result, it may be invalid to make claims based on this data about whether rents have increased, decreased or stayed the same for the poor.  Here are the wonky details.

The Washington Post article relies on a Zillow data series called the “ZRI All Homes 3 Tier” estimates. A little explanation is in order. If you’ve visited the Zillow website, you know that the company publishes, and regularly updates, its estimates of the current market value of nearly all the single family homes and condominiums in the US.  It calls these its “Zestimates.” Based on an analysis of the home’s characteristics and the sales prices of similar properties in the area, Zillow’s model estimates how much that house would sell for this month. The critical input into the Zillow model is data on publicly recorded sales of single family homes, condominiums and cooperatives.

To track different levels of the “for sale” market in each geographic area it covers, Zillow divides these properties up into three equal groups, called “Tiers”.  They are ranked by price, with the most expensive third in the top tier, the middle third in the second tier, and the lowest priced homes in the bottom tier.  The critical thing about the three tiers is that they are based on those properties (single family homes, condos and coops) for which individual unit sales transaction data are available.  This data series, somewhat misleadingly labeled “All Homes,” does not include apartments. See the methodology section of the Zillow website.
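Mechanically, the tier split is just a matter of ranking properties by price and cutting the list into thirds. Here’s a minimal Python sketch of that idea (the function name and home values are our own invention for illustration, not Zillow’s actual code or methodology):

```python
# Illustrative three-tier split: rank homes by estimated value and cut
# the ranked list into equal thirds.  Only properties with recorded
# sales transactions would enter this list -- apartments never appear.

def assign_tiers(values):
    """Return a dict mapping 'bottom'/'middle'/'top' to sorted value lists."""
    ordered = sorted(values)
    third = len(ordered) // 3
    return {
        "bottom": ordered[:third],
        "middle": ordered[third:2 * third],
        "top": ordered[2 * third:],
    }

homes = [250_000, 180_000, 420_000, 310_000, 520_000, 275_000,
         600_000, 195_000, 350_000]
tiers = assign_tiers(homes)
print(tiers["bottom"])  # the three least expensive homes
```

The point to notice is that the tiers are defined by which properties are in the database, not by fixed price cutoffs; that detail matters for the composition effect discussed later.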

In addition to the home value Zestimate, Zillow has estimated how much these same houses would rent for if, instead of being offered for sale, they were offered for rent.  These “Rent Zestimates” are prepared for all houses, including owner-occupied ones that aren’t listed for sale.  So the “All Homes, 3 Tier” ZRI estimates reported in this series do not include the nation’s for-rent apartments.   What the three tier rental data show is variation only within the single family, condo and coop portion of the market, and not the rents of multifamily apartments.  In addition, a majority of the homes in these three tiers are owner-occupied, not rented, so the three tier data don’t reflect what renters are actually paying.

This is problematic because it is exactly this three-tier All Homes data that Jeff Stein is relying on to draw conclusions about the variation in trends by income groups.

Poorer city residents have experienced significant rent increases over the past several years. In Portland, average rents for the poor have risen from about $1,100 to $1,600 — or by more than 40 percent — since 2011.

As we’ve noted, the cited data don’t support this claim because they leave out all multifamily apartments. Also: it’s misleading to cite this 40 percent increase without saying what the estimates were for the other tiers for the same time period.  Between 2011 and 2018, Portland’s “All Homes” ZRI in the lower tier rose 44 percent, in the middle tier rose 42 percent and in the highest tier rose 33 percent.  If you take that longer term view, the data (even ignoring its limitations) don’t support the headline claim that rents went up for the poor and down for the rich.  They went up for everyone.  (In addition, as we explain below, there’s one other technical problem that confounds using the three tier data for making claims about relative price inflation: they are subject to composition effects as the homes in each group change over time.)

Rents move in tandem across all tiers

A central premise of the article is that cities like Portland have futilely pursued building higher-income apartments, which have essentially no effect on the rents paid by low income renters.  Despite its limitations, let’s take a closer look at the ZRI index for the three tiers to see what it shows.  The following chart shows the “All Homes” data for the Portland Metropolitan area.

This chart shows the year-over-year monthly inflation rate based on ZRI data for each of the three tiers for the period 2011 through June 2018.  As you can see, these lines move very closely together, with similar seasonal and cyclical patterns. The high, middle and low tiers all record both peaks and troughs at very nearly the same time.  The highest tier appears to be somewhat more volatile than the others, with higher inflation at the peaks and lower inflation in the troughs.  Statistically, the correlation between the monthly year-over-year price changes in the highest and lowest tiers is .86, meaning that they are either closely inter-related or determined by common factors. These data do not square with the article’s headline claim that rents are increasing for low tier houses and decreasing for high tier houses. From mid-2015 onward, rental inflation is heading down across the board.  (It’s possible to cherry-pick one or two months of data at the point where low tier inflation dips below zero to claim that in that time period rents went down in the low tier and up in the other tiers, but that ignores the strong parallel downward trend in all three tiers.)
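For readers who want the mechanics behind the .86 figure: it is a Pearson correlation between the monthly year-over-year changes of two rent series. A minimal sketch, using invented rent series rather than the actual ZRI data:

```python
# Sketch of the calculation described above: compute year-over-year
# percent changes for two monthly rent series, then the Pearson
# correlation between those change series.  All rents are invented.

def yoy_changes(series):
    """Year-over-year percent change for a monthly series (12-month lag)."""
    return [(cur - prev) / prev * 100
            for prev, cur in zip(series, series[12:])]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two invented 24-month rent series that rise together:
low_tier = [1000 + 8 * m for m in range(24)]
high_tier = [2400 + 20 * m for m in range(24)]
r = pearson(yoy_changes(low_tier), yoy_changes(high_tier))
print(round(r, 3))  # close to 1: the two tiers inflate in tandem
```

A correlation of .86 between real-world tiers is, by this standard, very strong: the two ends of the market are moving together, not in opposite directions.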

The key point, though, for the purposes of this article is whether a decline in the estimated rents of higher end homes was associated with a decline in the estimated rents of lower end homes.  And that’s exactly what the data show: since the peak in 2015, the year-over-year rental increase for the highest tier has fallen from 15 percent to -5 percent, and for the lowest tier it has fallen from 10 percent to 2 percent.

Finally, although we believe these data are too incomplete to make such a determination, the greater volatility of higher tier properties (especially their steeper decline) is actually consistent with economic theory:  If we’re adding units at the high end of the market, they will be closer substitutes for other high end properties than they are for lower end properties, and so the price effects of adding supply will be felt first and most in the highest tier.  But as the strong parallel trajectory of these three tiers shows, adding supply in the high tier drives down rent inflation in the highest tier, and also in the middle and lowest tiers.  Far from disproving the economic prediction, these data (though limited) are consistent with it.

More Supply = Lower Rent Inflation

There’s a very simple way to make sense of what’s going on in Portland’s housing market.  Look at how many new apartments are built and what subsequently happens to rents.  In the wake of the Great Recession, apartment building in Portland fell off a cliff. Metro Portland had been adding 4,000 to 5,000 apartments per year, and that fell to just 1,000 annually in 2009 and 2010.

(Barry Apartment Report, Spring 2018)

Portland’s economy rebounded quickly, but apartment construction did not.  As the building permit data show, it took several years for things to get going (and typically there’s a year or more between the time a permit is issued and a project is ready to rent).

In 2015, Portland experienced very low vacancy rates and double digit rent increases.  But since then, as more units have been built, that pressure has eased. And it’s apparent that rents are declining.  Zillow’s data on apartment rents have shown negative year over year changes for more than a year.

From 2011 through 2014, as supply dried up, rents went up, peaking at double digit levels in 2015.  When supply finally caught up, rent inflation declined, and over the past year, rents went down. As we’ve frequently stressed, these things play out with a lag.  During the recession, when new apartment construction slowed to a trickle, it took a while for vacant apartments to fill up and put pressure on rents. Similarly, once rents spiked, it took a while for new units to come on-line and exert downward pressure on rents.  (Editor’s note: the data in this chart are for the City of Portland).

A technical note:  The composition effect

Interpreting the Zillow All-Homes 3-tier rental data is complicated by a subtle problem called the composition effect. Zillow creates its three tiers by dividing its all homes database into three equal parts, and then computing the average prices for each part. The problem arises because the number of units in the database, and in each tier, changes over time. If we add units primarily in one tier and not the others, some of the shift we observe in prices among tiers is due to the change in the composition of the tiers, rather than a change in the prices of individual houses.  This is a particular problem given the observation made in the Washington Post story that much new housing has been added in the top tier of the market. Adding more homes to the top tier causes a shift in the composition of the lower tiers, which, independent of any price changes, tends to push the average of the lower tiers upward.  The problem for the Washington Post story is that we don’t have a good way to know how much the composition effect is driving the observed changes in prices among different tiers.
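A toy example makes the composition effect concrete. In the sketch below (all rents invented for illustration), no individual home’s rent changes at all, yet the bottom tier’s average rises simply because new high-end units enlarge the database and push an existing home across the tier boundary:

```python
# Toy illustration of the composition effect: every individual rent is
# unchanged, but adding expensive new units moves the tier boundary,
# which raises the bottom tier's average.  All numbers are invented.

def bottom_tier_mean(rents):
    """Average rent of the cheapest third of the database."""
    ordered = sorted(rents)
    third = len(ordered) // 3
    bottom = ordered[:third]
    return sum(bottom) / len(bottom)

before = [900, 1000, 1100, 1200, 1300, 1400, 1500, 1600, 1700]
# Add three expensive new units; every existing rent stays the same.
after = before + [2000, 2100, 2200]

print(bottom_tier_mean(before))  # 1000.0  (bottom tier: 900, 1000, 1100)
print(bottom_tier_mean(after))   # 1050.0  (now includes the 1200 unit)
```

The 5 percent “increase” in the bottom tier’s average here is pure composition; this is exactly the bias that makes the three-tier data unreliable for claims about relative rent inflation.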

Editor’s note: The original version of this post was changed to correct a word transposition in the first paragraph.

 

Portland rents are going down

More supply is driving down rents in the Rose City

According to ApartmentList.com, rents for one bedroom apartments in Portland have declined 3 percent in the past year. It’s a solid vindication of the standard predictions of economic theory: adding more supply (building more apartments) helps drive down prices.

Just a couple of years ago, Portland experienced some of the fastest rental increases in the nation, with average rents, according to some indices, rising at double-digit rates. Alarm about the rent crisis prompted the City Council–unwisely, in our view–to adopt one of the nation’s most stringent mandatory inclusionary zoning ordinances.

As we’ve chronicled at City Observatory, that ordinance prompted a land rush of building permit filings in January of 2017, as developers looked to lock in building permission before the new ordinance took effect. That surge of building permits came on top of several strong years of new apartment construction.  The result has been a flurry of “For Rent” signs, the like of which Portland hasn’t seen in years.

Landlords are notoriously loath to cut advertised rents, but in addition to tolerating longer vacancy periods, there’s evidence that there’s considerable discounting going on in the local marketplace, with landlords offering one or two months’ free rent, waiving fees and deposits and even offering pre-loaded debit cards.

An apartment building on NE Martin Luther King Blvd, Portland.

But the latest report from Apartment List shows that the average level of rents being paid in Portland has actually declined.  In the past year, the benchmark rent for one-bedroom apartments has fallen from $1,155 to $1,120, a decline of 3 percent. (While we’re skeptical of the accuracy of many published rental price indices, ApartmentList.com’s is one of the good ones, using a “repeat sales” method that provides robust estimates).
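The virtue of a repeat-observation approach is that it compares the same unit with itself over time, so the index isn’t skewed by which units happen to be listed in a given month. Here’s a stripped-down sketch of the idea (ApartmentList’s actual methodology is regression-based and more sophisticated; the unit rents below are invented):

```python
# Simplified "repeat" rent index: for units observed in two periods,
# take the median ratio of new rent to old rent.  Because each ratio
# compares a unit to itself, changes in which units are listed don't
# distort the index.  All rents below are invented for illustration.

def repeat_index(pairs):
    """pairs: list of (rent_then, rent_now) for identical units.
    Returns the median rent ratio as a period-over-period index."""
    ratios = sorted(now / then for then, now in pairs)
    mid = len(ratios) // 2
    if len(ratios) % 2:
        return ratios[mid]
    return (ratios[mid - 1] + ratios[mid]) / 2

observations = [(1200, 1164), (950, 930), (1500, 1440), (1100, 1078)]
change = (repeat_index(observations) - 1) * 100
print(f"{change:.1f}%")  # prints "-2.6%": a small across-the-board decline
```

A simple average of listed rents, by contrast, would jump around as cheap or expensive units entered and left the market, which is one reason we distrust many published rent indices.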

With lots more apartments under construction, it’s likely that the supply will continue to increase over the next year or so, driving up vacancy rates, and putting further pressure on rents.

As we’ve argued at City Observatory, the market works, but with a lag.  There’s a temporal disconnect in the housing market. Demand can change quickly (growing as incomes rise and new residents move into the region), but supply changes only very slowly (thanks to the time it takes to round up financing, navigate the permit process, and then actually build something). After lagging well behind demand in the early post-recession years, housing supply has finally caught up, with the predictable effect that rents have flattened out, and now clearly started to decline.

The big question going forward is whether Portland’s inclusionary zoning requirements will stifle the surge in new apartment construction. While the pipeline is, for the moment, full of projects that were pushed forward to beat the IZ deadline, those will mostly be built in the next year or two. Since the inclusionary zoning ordinance went into effect nearly a year and a half ago, new apartment proposals have slowed to a trickle.

There’s now abundant evidence that the market is working.  Policy makers would be wise to take a close look at these data, and aim to continue and accentuate this trend. Putting the city’s inclusionary zoning requirements on hold would be a good first step. In addition, it’s widely assumed that the 2019 Oregon Legislature will be asked to legalize rent control:  that would be a step in the wrong direction.

Philadelphia’s urban policy harmonic convergence

Philly’s University City: The urban challenge in a nutshell

The knowledge economy . . . tax breaks . . . NIMBYism . . . gentrification . . . Amazon’s HQ2 . . . high speed rail . . . university economic development?  All this in one location.

Many conversations about the nation’s urban challenges address individual issues as if they were separate and distinct from one another (zoning, transportation, housing affordability, economic development, etc). But in reality, a careful analysis will show that these things are often inextricably interrelated.  There’s one neighborhood in Philadelphia that brings all these strands together. This particular story has almost all of our favorite urban themes wrapped in a single bundle, from urban revitalization and anchor institutions, to gentrifying neighborhoods, to NIMBYism, to town-gown conflicts, to convoluted local development approval processes and tax breaks with just a hint of Amazon HQ2 thrown in for good measure.

PlanPhilly’s Jim Saksa has a long, but worthwhile read exploring the redevelopment of University City, the area near the city’s 30th Street Station and adjacent to Drexel University and the University of Pennsylvania.  His article “As Drexel transforms University City, communities nearby prepare for gentrification” is available both in text and as a public radio podcast, courtesy of NPR station WHYY.

Big plans are afoot for redeveloping the area, and they raise, in a geographic nutshell, many of the current controversies in urban policy. Brandywine Realty is proposing “Schuylkill Yards,” a $3.5 billion (yes, with a “b”) redevelopment of 14 acres, to include millions of square feet of office space. The project has immediate access to the Northeast Corridor, and is directly across the river from Center City Philadelphia. The site is short-listed as a potential location for Amazon’s HQ2, should it come to Philadelphia. However you like to characterize urban development strategies (tied to great transit, redeveloping brownfields, eds and meds, innovation districts), you’ll find it here.

You’ll want to read the entire article, but here’s a short litany:

Growing enrollments, especially at Drexel, are bringing lots more young adults to the area, with housing demand spilling over into adjacent neighborhoods. The universities have built more student housing, but it’s expensive, and students looking for more affordable options are bidding for apartments in nearby neighborhoods.

The neighbors, concerned about rising rents, home prices and displacement, have persuaded the city (with the university’s support) to downzone many of these neighborhoods. While perhaps good for incumbent homeowners, this step further constrains the supply of housing and is actually likely to further push up land prices for those parcels eligible for greater density.

Philadelphia, with its famously arcane system for building and development approvals, is likely partly responsible for driving up the cost of development. Local builders complain, but acknowledge that the opaqueness of Philadelphia’s process is a profitable competitive advantage over outsiders. One of the developers explains:

In fact, Sweeney argues that Philadelphia’s difficult building regime even makes Brandywine money, because it’s both an investor and a developer, and the city’s idiosyncrasies keep competitors away.

“So, while you might be frustrated as a developer, as an investor, you look for markets where it’s harder to build.”

Tax policy also figures prominently in area development plans. The entire site in question has been designated one of Pennsylvania’s “Keystone Opportunity Zones” which grant generous state and local tax breaks to new job-creating investment.

Urban redevelopment is never simple, and there are lots of moving and interacting parts to consider. This close look at how these different forces are playing out in Philadelphia’s University City provides some keen insight into these relationships.

(Hat tip to Jon Geeting for flagging this article on twitter, and pointing out the salient bits).

This post has been revised to correctly spell Jim Saksa’s name, and credit WHYY’s PlanPhilly for the original story and broadcast.

Where we embrace socialism in the US: Parking Lots

How we embrace socialism for car storage in the public right of way

Florida Senator Marco Rubio has denounced President Biden’s $3.5 trillion spending program as un-American socialism.  Rubio claims:

In the end, Americans will reject socialism because it fundamentally runs counter to our way of life.

That’s not accurate, of course.  Socialism is well-established in the US, at least for car storage, something that is near and dear, certainly, to Republicans.  You think otherwise?  Before you denounce socialism, Senator Rubio, consider this perspective.

Comrades, rejoice:  In the face of the counter-revolutionary neo-liberal onslaught, there’s at least one arena where the people’s inalienable rights reign supreme:  parking.

Fear not, comrade sister: you will not have to search for a parking space in our socialist utopia!

We may not be able to make health care a right or make housing a right, but the one place the revolution has plainly succeeded in usurping the market is in the case of parking.  Every worker’s council (though they may still brand themselves in the pre-revolutionary nomenclature of “city councils” or “townships” or “planning commissions”) has established the right of every citizen to abundant, free parking.

To everyone, we can point to parking as one place where private property and the intrinsically inequitable forces of capitalist distribution don’t disadvantage the working classes and the poorest among us. There may be massive inequities in other aspects of life, but each citizen is guaranteed equal access to adequate parking spaces. To paraphrase Anatole France, the law in its majesty protects equally the right of the rich and the poor to park their massive sport utility vehicles pretty much wherever they would like without having to pay a penny for doing so.

True, we may face public opposition from reactionaries in the media, like New York Times columnist Tim Egan, who has decried the people’s efforts in Seattle to secure greater access to housing as a conspiracy between socialists and developers. Tosh!  As we have shown with our parking requirements, we will bend developers to the will of the people.

Throughout the nation, workers’ councils (still operating under the name “city councils”) have decreed that the people’s right to parking is supreme. No bourgeois developer may build so much as a small shop or an apartment without adequately providing for the needs of the automobiles that may travel to or from these destinations. We may still struggle to require inclusionary zoning for people, but we have long since achieved inclusionary zoning for cars.

Together, comrades, we embrace the timeless historical wisdom encapsulated in the ITE parking handbook, which assures that each citizen is allocated sufficient parking spaces at each of the places he or she may wish to store a vehicle.

To those neo-liberal apologists and enablers who call themselves “economists” and claim that socialism is flawed and unworkable, we can proudly point to the successes of the parking supply diktats established in every community, large and small.

The production of parking spaces has exceeded the quotas established in the five year community plans. Comrade Scharnhorst has produced a report showing that in Seattle, there are 1,596,289 parking stalls, more than 5 parking spaces for every household: indeed, a triumph of the planned economy!  (Now if we could just figure out how to get one house per household?)

The production of parking spaces continues to set glorious new highs

Scharnhorst cleverly infiltrated the Mortgage Bankers Association to assemble the data for his report; just as V. I. Lenin foretold, we will hang the capitalists with the rope they sell us; surely had the great teacher been writing in this century, he would have said we will disrupt capitalism with the big data downloaded from their servers.

Take heart comrades: whatever our challenges in other domains, we can proudly tell the masses that we’ve succeeded in establishing a socialist utopia for car storage.  Forward!


IoT: The Irrelevance of Thingies

People and social interaction, not technology, are the key to the future of cities

Smart city aficionados are agog at the prospect that the Internet of Things will create vast new markets for technology that will disrupt and displace cities. Color us skeptical; our experience with technology so far–and it’s been rapid and sweeping–is that it has accentuated the advantages of urban living and made cities more vital and important. From the standpoint of urban living, one should regard “IoT” as “the Irrelevance of Thingies.”

Smart cities will be bathed in beams of light! (TechCrunch)

It’s been more than two decades since Frances Cairncross published her book “The Death of Distance,” which prophesied that the advance of computing and communication technologies would eliminate the importance of “being there” and erase the need to live in expensive, congested cities. (It goes down, along with Francis Fukuyama’s “End of History” and Kevin Hassett’s “Dow 36,000,” as one of the demonstrably least accurate book titles of that decade.)

Back in the 1990s, when the Internet was new, there was a widely repeated and widely accepted view of the effect of technology on cities and residential location. The idea was “the death of distance”–that thanks to the Internet and overnight shipping services and mobile communications, we could all simply decamp to our preferred bucolic hamlets or scenic mountaintops or beaches, and virtually phone it in.

And these predictions were made in an era of dial-up modems and analog cell-phones (the smartphone hadn’t been invented, and Amazon was still making most of its money cannibalizing bookstore sales). We were all going to become “lone eagles,” tipping the balance of power away from cities and heralding a new age of rural economic development. Here’s a typical take from 1996, courtesy of the Spokane Spokesman Review:

Freed from urban office buildings by faxes, modems and express mail, lone eagles are seen by economic development experts as a new key to bolstering local economies, including those of rural areas that have been stagnant for much of the century.

Faxes?  How quaint.  Since then, of course we’ve added gigabit Internet and essentially free web conferencing and a wealth of disruptive apps. But despite steady improvements in technology, pervasive deployment and steadily declining costs, none of these things have come to pass. If anything, economic activity has become even more concentrated. Collectively, a decade after the Great Recession, the nation’s non-metropolitan areas have yet to recover to the level of employment they experienced in 2008; meanwhile metro areas, especially large ones with vibrant urban cores, are flourishing.

The economic data put the lie to the claim that cities are obsolete.  One of our favorite charts from Oregon economist Josh Lehner points out that larger metropolitan areas have outstripped smaller ones and rural areas have continued to decline in this tech-based era.

But more striking is the growing premium that people pay to live in the center of cities. Economists call this the urban rent gradient: housing prices are highest in the center of regions, which are generally the most convenient and accessible to jobs, amenities and services. Over the past two decades, the urban rent gradient has steadily grown steeper: people now pay more to live in the center of cities than ever before.

Edlund, Machado, & Sviatschi (2015)

The importance of this trend was identified by University of Chicago economist Robert Lucas, writing in the late 1980s. His words are even truer today than they were then:

If we postulate only the usual list of economic forces, cities should fly apart. The theory of production contains nothing to hold a city together. A city is simply a collection of factors of production – capital, people and land – and land is always far cheaper outside cities than inside. Why don’t capital and people move outside, combining themselves with cheaper land and thereby increasing profits?  . . . What can people be paying Manhattan or downtown Chicago rents for, if not for being near other people?

The expanding power and falling price of computation and electronic communication has made these things not more relevant, but less relevant to location decisions. Because you can get essentially all these things anywhere, they make no difference to where you locate. What's ubiquitous is irrelevant to location decisions.

The growing ease and low cost of communication has, paradoxically, made everything else relatively more important in location decisions. What's scarce is time and the opportunities for face-to-face interaction.

Both in production and consumption, proximity is more highly valued now than ever. Economic activity is increasingly concentrating in a few large cities, because they are so adept at quickly creating new ideas by exploiting the relative ease of assembling highly productive teams of smart people. Cities too offer unparalleled sets of consumption choices close at hand. From street food, to live music, to art and events, being in a big city gives you more to choose from, more conveniently located and cheaper than you can get it anywhere else. Plus cities let you stumble on the fun: discovering things and experiences that you didn't even know existed.

The “death of distance” illusion is being repeated today with similar claims about the impending disruption from “the Internet of Things.” On a municipal scale this manifests with the cacophony of visions for tech-driven smart cities. Supposedly attaching sensors to everything from cars, to streetlights, to water meters is going to produce a quantum leap in city efficiency.

Far from provoking a shift to the suburbs and a decline of cities, the advent of improved computer and communication technologies has helped accelerate the revival of urban work and living. In recent comments to the Urban Land Institute's European meeting in the Netherlands, Harvard economist Ed Glaeser explains.

“So why didn’t computers kill cities?” Glaeser asks. It’s a fair question: remote working has never been more possible, and productivity has never been higher. So why are more people flocking to urban areas than ever? “Cities are about exchanging ideas,” he says. The proximity of people to other people sparks ideas in a way that is impossible in remote areas.

As Philip Longman wrote at Politico a few years back:

. . . in the centers of tech innovation, . . . the trend has been toward even greater geographic concentration, as Silicon Valley venture capital firms such as the storied Kleiner Perkins Caufield & Byers have set up offices in downtown San Francisco, closer to the action. Apparently, there is no app that will bridge the gap. To seal the deal, you must be in the room, literally, just like some tycoon from the age of the robber barons.

As technology becomes cheaper and more commonplace, it ceases to be the determining factor in shaping the location of economic activity. All the other attributes of place–especially human capital, social interaction and quality of life, the kinds of things that are hardest to mimic or replace with technology–become even more valuable. To be sure, the Internet of Things may disrupt some industries and promote some greater efficiency, but the arc of change is moving inexorably to the city.

The increasing centralization of urban economies: New York

Prime working age adults are increasingly clustering in the center of the nation’s largest metro area

City Observatory has long been following the movement of people and jobs back to cities.  Our inaugural study on the Young and Restless charted the growing propensity of well-educated young workers to live in close-in urban neighborhoods. Our follow up work on City Center Jobs plotted a remarkable resurgence in job growth in city centers compared to the urban periphery.

A new report from New York City's Department of City Planning sheds further light on how this trend is playing out in the nation's largest metropolitan area. It has a couple of very salient maps.  The first shows the population growth of the greater New York region since 2010.  As widely reported, the most urban boroughs–Brooklyn, the Bronx and Manhattan–have been powering regional population growth, with the five boroughs accounting for 60 percent of the region's growth.  As this dot density map shows, population growth (the blue dots) is heavily concentrated in the center of the region. Population growth in the suburbs is considerably sparser, and in some places (the orange dots) population is actually declining.

Population Change, 2010 to 2016 (Dot = 50; blue = gain, orange = loss). Source: NYC Planning

As the report concludes: "In recent years, population growth has shifted toward the region's center. Recent growth patterns suggest that more people moved to and stayed in our region, with a focus on the urban core and in areas well-connected to rail transit." Overall, suburbs have grown more slowly, and in some areas, suburban population has declined. As we've pointed out before, far from "hollowing out," the New York region is growing, especially in its center.

The working age population is even more centralized

While much of the media attention gets devoted to overall population totals, we like to zero in on the demographic composition of city population changes.  What’s been driving the surge in city population? The answer, according to the new NYC planning report, is working age adults. The urban/suburban divide is even more stark for prime age workers (again, blue dots represent an increase and orange dots a decrease). The pattern of change since 2000 shows that with the notable exception of Manhattan’s East side, nearly all of the labor force gains have been concentrated in Manhattan and the close-in portions of the Bronx, Brooklyn and New Jersey.  There’s a consistent pattern of labor force decline in the more suburban parts of the region (Nassau and Suffolk Counties on Long Island, and the New Jersey suburbs).

Prime Age (25-54) Labor Force Change, 2000-2016; Dot = 50; blue = gain, orange = loss. Source: NYC Planning

This growth was economically significant: the five boroughs of New York City expanded their prime-age labor force much more rapidly than the nation as a whole.  As the report notes, "The rate of NYC's prime-age labor force growth (gain of 17%) was nearly three times higher than the U.S. average (gain of 6%), which increased more slowly as the baby-boomer cohort grew older and was replaced by younger, smaller population cohorts."

What these data show in New York is the growing centralization of economic activity.  More people are moving to the center, and especially more working age adults. Several forces are powering these trends: the growth of the knowledge economy, which thrives on the density of smart people; the growing desire for urban living and amenities; and a generational shift in attitudes about the relative attractiveness of cities and suburbs. These forces are in important respects mutually reinforcing: firms are increasingly growing in the center of the region (in part to tap the labor force) and the labor force is growing more in the center (because of more concentrated job growth).

This shift to the center is closely connected to the region's most pressing challenges. With more jobs and workers in the center of the region, it's little surprise that there's been a sharp increase in ridership on city subways. Likewise, the growing demand for urban living, in the face of a slowly growing housing stock, is putting pressure on rents. In New York, as elsewhere, it's yet more evidence that we're experiencing a "shortage of cities."

 

The persistence of residential segregation

How slow growth and industrial decline perpetuate racial segregation

As regular readers of City Observatory know, we think that the continuing racial and economic segregation of the nation's metropolitan areas is at the root of many of the nation's most persistent problems. We got a fresh reading on the extent and persistence of racial and ethnic segregation in the nation's large metropolitan areas from a sharp new analysis of Census data prepared by Rentonomics' Chris Salviati.

If you want to see where your metropolitan area ranks in racial and ethnic segregation, and judge what progress it's made since 2000, you'll definitely want to have a look at this page.  In it, Chris has used data from the latest 2012-16 American Community Survey to compute dissimilarity indices for major racial/ethnic groups in the nation's metropolitan areas. The dissimilarity index measures how different or similar the residential settlement patterns are for pairs of racial/ethnic groups. The index runs from 0 to 100 and represents the percent of the population that would have to move to a different neighborhood (census tract) in order for the proportionate composition of each neighborhood to match the composition of the larger metropolitan area.  Higher numbers on the dissimilarity index correspond to greater levels of racial/ethnic segregation.
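For any pair of groups the index is simple to compute. Here's a minimal sketch of the standard two-group formula, with hypothetical tract counts (this is not Salviati's actual code):

```python
def dissimilarity_index(group_a, group_b):
    """Two-group index of dissimilarity across neighborhoods (tracts).
    Returns 0-100: the share of either group that would have to move
    for every tract to mirror the metro-wide composition."""
    total_a, total_b = sum(group_a), sum(group_b)
    return 50 * sum(abs(a / total_a - b / total_b)
                    for a, b in zip(group_a, group_b))

# Hypothetical four-tract metro: complete separation scores 100,
# perfectly proportional settlement scores 0.
dissimilarity_index([100, 100, 0, 0], [0, 0, 100, 100])  # 100.0
dissimilarity_index([50, 50, 50, 50], [20, 20, 20, 20])  # 0.0
```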

While it's useful and important to look at rankings, it's also important to consider trends, and the pattern of relationships over time among metropolitan areas.  There's a very insightful chart in the Rentonomics analysis which compares the growth rate of metropolitan area population with the level of segregation in the metropolitan area.  The growth rate is proxied by the percentage increase in metro area population since 1970.  What these data show is a strong correlation between recent population growth and lower rates of segregation: places that have grown more in recent decades are less segregated.

This finding dovetails neatly with several strands of academic research about African-American migration patterns.

A recent paper by Richard Sander of UCLA and Yana Kucheva of CUNY–”Black Pioneers, Intermetropolitan Movers, and Housing Desegregation“–makes a good point about migration. They observe that black inter-metropolitan migrants are much more likely to live in integrated neighborhoods, compared to blacks changing homes within metro areas. This makes sense, as they’re less likely to be bound by preconceptions or family ties. Metros with a higher fraction of black in-migrants should, all other things equal, be less segregated than metros with lots of native born blacks. In general, we would expect faster growing metros to have a lot more migration and be less segregated (i.e. Las Vegas) while slower growing metros (or ones with declining black populations) would tend to be more segregated.

The reasons for this may be complicated, but it’s likely that a new metro area is a kind of clean sheet, one that’s less influenced by family ties or preconceptions. A new book by sociologists Maria Krysan of the University of Illinois, Chicago and Kyle Crowder of the University of Washington concludes that perceptions of neighborhoods had a major impact on whether people even considered moving there. Newcomers to a metropolitan area are less likely to have been exposed to and internalized these preconceptions, and therefore may be more open to a wider array of neighborhood choices.  Conversely, in slow-growing metropolitan areas, a larger portion of residents may be those who are long term residents, and who are more influenced by historical patterns of what constitutes an appropriate neighborhood for them.

Finally, it's worth noting that this line of thought helps explain the persistent segregation problems of older industrial cities that the Brookings Institution's Alan Berube explored in a recent commentary. Residents of older industrial cities, he reports, are 30 percent more racially segregated than the national average. Workplaces are also segregated: workers of color in these older cities are more heavily concentrated in low-paying fields like sales and personal services than their counterparts elsewhere. Finally, the income gap between white and non-white households is much greater in these segregated, older industrial cities.

Taken together, these data provide a useful snapshot of the extent and variation of racial segregation across metropolitan areas, and a plausible story about why segregation is so persistent in the nation’s older, more slowly growing metropolitan areas. Despite the disruption caused by migration, rapid growth and economic change, growing cities seem to create a more fluid environment in which patterns of segregation are more quickly eroded.

State government as an anchor industry

Eds and Meds . . .  and Capitol Domes?

I recently participated in an expert panel reviewing Sacramento's economic development strategy. You can learn more about the city's "Project Prosper" here.  It is rightly focused on identifying what can be done to promote economic growth with inclusion.

Like most regions, Sacramento has a strong interest in developing a more diverse, knowledge-driven economy. Although it’s only about 100 miles from tech centers in San Francisco and San Jose, the region has a much smaller innovation and entrepreneurship sector. What shows up clearly in an economic base analysis–which measures industry specialization using location quotients–is that as the Golden State’s capital, Sacramento has a disproportionate number of state government employees. While it’s understandable that one might want to move beyond that established base, it’s good strategic advice to build on your assets. So how could Sacramento do more to turn state government into a key component of its economic strategy?
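A location quotient is simply the ratio of an industry's share of local employment to its share of national employment; values above 1 indicate local specialization. A minimal sketch, with illustrative (made-up) employment totals:

```python
def location_quotient(local_ind, local_total, natl_ind, natl_total):
    """Industry's share of local employment divided by its share of
    national employment; LQ > 1 signals local specialization."""
    return (local_ind / local_total) / (natl_ind / natl_total)

# Illustrative figures only: if state government were 13 percent of
# local jobs but one-thirtieth of national jobs, the LQ would be 3.9,
# meaning the industry is nearly four times as concentrated locally
# as it is nationally.
lq = location_quotient(130_000, 1_000_000, 5_000_000, 150_000_000)
```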

State Government as an anchor institution

One recurring theme in economic development is that cities ought to tap into so-called “anchor institutions” like universities and hospitals, to promote entrepreneurship and job development. An “Eds and Meds” strategy, can promote economic opportunity, particularly in adjacent neighborhoods, and can be a way of reaching underserved populations.

In that vein, maybe it's time for urban economic developers to add state capitals to the "Eds and Meds" list of anchor institutions.  Like hospitals and universities, state capitals are large, place-based and relatively stable employers with large procurement budgets.

On its face, making state government part of an economic strategy seems to contradict one of the fundamental tenets of economic development: that one ought to focus on what we call “traded sectors” of the economy:  businesses that compete in national or international markets, and who by doing so successfully generate new income that gets re-spent in the local economy.

A key insight behind this principle is that, in general, it's difficult for the local-serving sectors of the economy to grow any faster than local population and income.  The local-serving sectors–things like grocery stores, retail banks, dry-cleaners, plumbers, hairdressers, and the like–sell all or nearly all of their goods and services to the local population. If you want to grow your economy, you will usually get much more impact from boosting traded sector firms.

In general, state and local government gets classified as part of this local-serving sector of the economy.  Police, fire, schools, transit and other public services are generally driven, like retail and service businesses, by the size of the local population. For that reason, we generally don’t think about government being a traded sector industry.

But economic strategies should recognize an area's particular strengths, and also address inclusion as well as growth. In particular, if we're concerned about promoting inclusion by creating pathways for local residents to work in occupations that we can be confident will still be in the area two or three decades from now, there are good reasons to focus on state government.

State government as a target industry

If you think about the kinds of criteria that economic developers use to choose target industries, for capital cities, the state government sector seems like one you would choose. In general, economic strategies target industries that have lots of jobs, that have a strong reason to be in a particular region, that are growing, that pay well, and that bring new money into the community. State government jobs do all these things in the Sacramento economy.

First, state government is a large employer in Sacramento.  According to the latest American Community Survey tabulations, roughly 130,000 state employees live in the Sacramento metropolitan area.

Second, the city’s position as state capital constitutes a defensible competitive advantage. Sacramento doesn’t face direct competition from other California cities to be the capital.  

Third, state government employment in California is a growth industry.  As the California population and economy grow, so too does the overall size of state government.  While it's not recession-proof (state budget shortfalls hamstring hiring and state job growth), it has expanded even as other sectors of the economy, like manufacturing, have declined.  Here's Sacramento's state job growth record:

Regardless of its exact connection to the macro-economy, Sacramento has a strong interest in a healthy, well-managed state government. Working out a resolution to the state’s long term fiscal challenges–such as meeting future pension costs–will affect the local economy, just as much as restructuring in a more traditional basic sector industry.

Moreover, thanks to the demographics of state government employment, there are likely to be a fair number of job openings in Sacramento in the next few years. According to the American Community Survey, about 30 percent of state government employees in the region are 55 or older, meaning most will qualify for retirement within a decade.  This could produce more than 40,000 job openings in Sacramento, even if total state employment remains flat.

Fourth, state jobs pay well.  According to data from the Bureau of Labor Statistics, the average weekly wage of state government jobs in Sacramento was about 60 percent higher than the average weekly wage for all private sector jobs in the region.

Fifth, even though it doesn’t fit the usual profile of a traded sector industry, as far as the Sacramento metropolitan economy is concerned, state government plays a similar role–it imports money (chiefly tax revenue) from the rest of California, which is then spent in the form of salaries and purchases in metropolitan Sacramento.  

So, to summarize:  state government jobs in Sacramento fit all the criteria one would apply in choosing a targeted industrial sector:  it’s a large industry, it pays well, it’s growing, the city has a defensible competitive advantage, and it’s bringing new money into the regional economy.

Just as with other anchor institutions, one of the keys may be brokering a stronger relationship between the city government and the institution in question. As Richard Florida has pointed out, the success of an "Eds" strategy hinges on bridging the town/gown divide and transforming what can often be a contentious or purely transactional relationship into a long-term, mutually beneficial partnership.

Achieving inclusion

Like most cities, Sacramento has identified the challenge of promoting greater inclusion:  How do we help low income populations and communities of color achieve greater economic opportunity? One of the critical challenges in many traded sector industries is that, in order to compete in global markets, firms need highly skilled and educated workers. For example, innovative software and biotechnology firms thrive or fail based on their ability to hire workers with years of advanced education and experience in these fields, and they tend to recruit from national and global talent pools. That makes them poor candidates for hiring lots of entry level workers with modest educational credentials.

State government is more like Eds and Meds.  While some of the jobs require an advanced education and credentials (teachers, doctors), others offer pathways from entry level jobs to long term employment and, in some cases, more demanding careers (from nurse's assistant, to LPN, to RN).

As with Eds and Meds, the challenge may be to create entry points and pathways for local Sacramento residents to learn about and prepare for potential state government jobs. Do elementary and high schools have career awareness and job shadowing programs or partnerships that include state agencies? Do these agencies participate in the region's internship programs for high school and college students?

The essence of strategy is recognizing the unique characteristics of your particular situation and capitalizing on distinctive differences. For a state capital city like Sacramento, and particularly in a large and robust state like California, state government jobs are, and will continue to be an important cornerstone of the economy. Figuring out how to leverage this resource to the region’s advantage will be time and energy well spent.

Cities as selection environments

Being cheaper may not be an advantage at all in a dynamic, knowledge based economy

It's axiomatic in the world of local economic development that the sure-fire way to stimulate growth is to make it as cheap and easy as possible to do business in your community. Area Development, a trade journal for industrial recruiters, rates states every year.  They're clear about their criteria:

What’s it take to be recognized as a top state for doing business?  . . . The overall cost of doing business is, of course, a primary consideration, one that encompasses a wide range of components, from real estate costs to utility rates to labor expenses.

So it’s a seeming paradox that some of the most expensive, highly regulated places are routinely the most entrepreneurial and innovative. Cities like New York and San Francisco have some of the most expensive rents, and their workers are highly paid. And yet, year in and year out, they generate many of the most creative and successful businesses.

In part, we suspect it's because cities function as rigorous selection environments for businesses.  By selection environment, we mean that the characteristics of the city systematically favor some kinds of enterprises and disadvantage others. If you have a low margin, low growth business that's sensitive to land costs or worker wages, you'll likely find that it's cripplingly expensive to do business in a San Francisco or New York. The only businesses that can survive in such a location are the ones that are innovating quickly enough to be able to afford to be there. You need to be highly profitable, or on a plausible track to generate profits, in order to pay the bills. Businesses that can't meet those tests don't start there, are more likely to fail, or will move away. The result is that surviving businesses are likely strong competitors.

As Frank Sinatra told us in the famous refrain from New York, New York:

If I can make it there, I’ll make it anywhere

The tough competition for market share, recognition, capital and talent in these cities means that, disproportionately, the strongest business concepts and most capable management teams move forward. The press of competition also forces firms to move quickly, lest they be left behind.

The converse is also true: low cost locations may insulate businesses from the need to innovate.  If rents are cheap, taxes are low, and labor is docile and low paid, there may be little reason to undertake the risk and expense of new equipment investment, worker training, or research and development. Economists often speak of "the resource curse"–the idea that an abundance of some valuable natural resource, like gold or oil, skews a local economy's activity away from innovation and entrepreneurship.  In a sense, cheap housing and low business costs can be a kind of resource curse.

There's an additional factor as well:  cheap housing tends to attract and retain low skilled workers. If you live in a place with low housing costs, you may find it too expensive to move to a place like New York or San Francisco–unless you have the kind of skills that will get you a job that pays enough to afford high rents. So workers may self-select as well–and as a result, employers in low cost housing markets will have a lower skilled labor force.

Selective factor disadvantages: What doesn’t kill you, makes you stronger

Economists tend to focus on the story of comparative advantage: that economies tend to grow and flourish in those industries to which their natural and human resources are most conducive, relative to other locations. But in some cases, as business strategist Michael Porter has pointed out, comparative disadvantages can prompt innovation.

What is not so obvious, however, is that selective disadvantages in the more basic factors can prod a company to innovate and upgrade—a disadvantage in a static model of competition can become an advantage in a dynamic one. When there is an ample supply of cheap raw materials or abundant labor, companies can simply rest on these advantages and often deploy them inefficiently. But when companies face a selective disadvantage, like high land costs, labor shortages, or the lack of local raw materials, they must innovate and upgrade to compete.

With no coal and high-cost electricity, Italian steelmakers were at an absolute competitive disadvantage relative to British and German steelmakers. Their disadvantages forced them to innovate, and their highly efficient electric mini-mills made them flexible, low-cost producers.

The fact that some factor disadvantages can stimulate an adaptive response with an economic upside doesn’t mean that one should treat this observation as a universal rationalization for high business costs. The key word in Porter’s formulation is “selective.” That suggests that you need to look at business costs, and the business climate, in a more comprehensive and nuanced way than is presented in the usual index rankings compiled by Area Development and its ilk.

You can build a simple, static model of economic competition in which having the lowest cost always wins. But in a rapidly changing knowledge economy, the ability to continuously create new ideas, new products and new businesses is the key to success and to being "dynamically efficient," as Douglass North put it. Cities are the selection environment that gives rise to new businesses, and the cheapest location is unlikely to be the one that optimally selects robust competitors.

No exit from housing hell

Distrust, and empowering everyone equally to be a NIMBY, is a recipe for perpetual housing problems

The recent defeat of SB 827–California State Senator Scott Wiener's bill that would have legalized apartment construction in areas well served by transit–was the subject of a thoughtful post-mortem in the Los Angeles Times: "A major California housing bill failed after opposition from the low-income residents it aimed to help. Here's how it went wrong." Liam Dillon notes that while the bill had the strong support of YIMBY–yes in my back yard–housing advocates, it foundered because of the combined opposition of not only local governments and homeowners, but also the very people it was supposed to help:  low income renters.

Dillon points out the schism between the economic and political cases for the legislation. SB 827 may have been great economics, but it was poor politics. YIMBYs and a wide range of urban and housing scholars supported the SB 827 approach, arguing that more housing, especially in transit-served locations, would lower rents and reduce displacement.

“The reality is that the heart of displacement is a lack of housing, which pours lighter fluid on housing costs, puts huge pressure particularly on low-income tenants and pushes people out,” [Senator Scott Wiener] said. Research from the state’s nonpartisan Legislative Analyst’s Office and UC Berkeley has found that building any new housing, especially homes subsidized for low-income residents, prevents displacement at a regional level.

But low income renters–and, importantly, those who advocate on their behalf–weren’t buying it. Dillon says “there is a fundamental disconnect between the approach of the senator and his supporters on one side and influential anti-poverty organizations on the other.” Their fear was that new apartment construction would happen disproportionately or exclusively in lower income communities.  The Brookings Institution’s Jenny Schuetz boiled this down to a trenchant tweet:

Tricky politics. Past experience shows that wealthy white communities have been more successful blocking development in their neighborhoods, so not unreasonable that lower-income [people of color] worried they’ll bear the brunt. But building more housing is only long-term solution.

Never mind that this pretty much flies in the face of the logic of real estate development: given the choice to build apartments in a high income community or a low income community, developers will inevitably tend to gravitate toward the places where rents are higher so that they can earn a greater profit. The fact that high income communities have been so adept at zoning land for single family uses and so resistant to development proposals is the principal reason that demand has been diverted to lower income neighborhoods in the first place. A sweeping, statewide pre-emption of “local control” is the only thing that’s likely to open up the opportunity to develop in these higher income places.

Ultimately, this shows how deeply ingrained the notion of weaponizing development approvals is in the land use process. The argument seems to be that it is inequitable unless low income communities have the same power to exclude new development that wealthier communities routinely exercise. Low income housing advocates have withheld development permission and regulated density to extract concessions from developers in the form of community benefit agreements or the construction of, or financial contributions to, affordable housing. This exactly parallels the way in which higher income communities extract concessions in the form of land dedication, park construction, contributions for schools and local government, and other amenities.

As long as we view planning and development approvals as devices for extracting concessions from developers on a case-by-case basis, we’ll inevitably circle back to a low-build, NIMBY-dominated world.

This is pretty much the problem that has plagued New York's Mandatory Inclusionary Housing program. In theory, the city's program requires developers to dedicate a portion of units in new apartment buildings to affordable housing, which should ease the city's supply crunch and help reduce everyone's rent. But in practice, the individual neighborhoods in which the up-zoned apartment buildings would be constructed oppose the additional density.  While the city-wide policy easily gained a majority of the City Council, the individual up-zoning approvals that would activate the "mandatory" portions of the law have run into difficulties. In the first two projects forwarded under the law–in Manhattan and Queens–strong neighborhood opposition prompted the local city councilor to withdraw support for the needed zone change, effectively torpedoing the projects.

In many respects, this is a reprise of the drama that doomed Governor Jerry Brown's 2016 proposal to exempt affordable housing construction from the state's CEQA environmental impact review process. While that would have encouraged development, it also would have removed a valuable bargaining chip that local communities (and labor unions and environmental groups) used to extract concessions from developers. As long as development permission is organized around this highly transactional, brokered process, it's unlikely that any group is going to cede its points of leverage. We'll achieve equality by empowering all neighborhoods, rich and poor, to say "not in my back yard."

As we’ve pointed out before, there’s a particularly nasty version of the prisoner’s dilemma operating when it comes to liberalizing land use laws.  Individual communities and groups would be better off if everyone were open to allowing more housing everywhere. But they don’t trust that others won’t renege, and their community (or group) will be saddled with all the burden and impacts of additional density. As in the prisoner’s dilemma, everyone looks out for their own self-interest, which produces a result that is collectively worse for everyone. Like Sartre’s No Exit, it feels like the actors are caught in a hell of mutually conflicting objectives.
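The dilemma described above can be made concrete with a toy payoff matrix. This is only a sketch: the payoffs below are hypothetical numbers chosen to illustrate the prisoner’s dilemma structure, not anything measured.

```python
# A toy two-neighborhood housing game with hypothetical payoffs, illustrating
# the prisoner's dilemma structure described above. Each neighborhood chooses
# to "allow" or "block" new housing. Payoffs are (ours, theirs).

PAYOFFS = {
    ("allow", "allow"): (3, 3),   # more housing everywhere: best collective outcome
    ("allow", "block"): (0, 4),   # we absorb the density; they free-ride
    ("block", "allow"): (4, 0),   # we free-ride on their housing
    ("block", "block"): (1, 1),   # NIMBY equilibrium: scarce, expensive housing
}

def best_response(their_choice):
    """Return the choice that maximizes our payoff, given the other side's choice."""
    return max(["allow", "block"],
               key=lambda ours: PAYOFFS[(ours, their_choice)][0])

# Blocking is a dominant strategy: it is the best response to either choice...
assert best_response("allow") == "block"
assert best_response("block") == "block"

# ...yet the resulting (block, block) equilibrium leaves both sides worse off
# than mutual allowing.
assert PAYOFFS[("block", "block")][0] < PAYOFFS[("allow", "allow")][0]
```

The punch line: each neighborhood’s individually rational choice is to block, yet mutual blocking is worse for everyone than mutual allowing, which is exactly why a uniform rule applied to all communities can outperform case-by-case bargaining.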

If there’s going to be a way to break this logjam, it’s probably going to have to look a lot like Senate Bill 827, a relatively simple, clear and unavoidable state pre-emption that applies with equal force to all communities, rich and poor. The trick will be getting everyone to agree that this is in our common interest.

City as theme park

There’s no critique more cutting than saying that development is turning an urban neighborhood into a theme park.

 

The irony of course, is that cities like Dubrovnik and Venice represent a profoundly obsolete, pre-industrial technology.  They were built without machines, without computers, designed for walking and, at most, animal-powered travel.  They provide a degree of walkability, density and human-scale–and freedom from cars–that’s simply not available anywhere else.

 

In 1990, some colleagues at the University of Washington invited Patrizio Bianchi, a professor of economics at the University of Bologna, to present a lecture on Italy’s Marshallian industrial districts. After the lecture, the hosts took Bianchi on a walking tour of Seattle’s Pike Place Market, pointing to market stalls and fresh produce with obvious pride. Bianchi was nonplussed, much as if a group of Americans had been treated to a tour of a Walmart in Rome.

A critical look at suburban triumphalism

The “body count” view of suburban population misses the value people attach to cities

Lately, we’ve seen a barrage of comments suggesting that the era of the city is over, and that Americans, including young adults, are ready to decamp to the suburbs. We think this new wave of suburban triumphalism is missing some key points about the growing value people attach to urban living. Our takeaways:

  • Well-educated young adults are increasingly moving to cities and propelling city growth; their numbers are up 19 percent since 2012.
  • Counties are the wrong units for measuring whether city centers are growing or not. (We explore how misleading this is, using Chicago as an example).
  • In overall population growth, cities are performing better than they did in the previous decade, suburbs are performing worse than they did a decade ago.
  • We’re bumping up against the limits of how much cities can grow:  For a while they could grow by  reducing vacancy in existing housing; now they have to grow by building new units. That’s happening, but not fast enough.
  • People are paying a premium to live in cities; The value people attach to urban living (diverse, interesting, walkable neighborhoods) continues to rise relative to auto-dependent suburbs. In this case, price rather than quantity is a better measure of preference.

“The suburbs are back, baby” or so we’re told.

At least that’s the message we’re getting from many who look at population trends. The very careful Jed Kolko looks through the latest county population estimates and notes that in the aggregate the most suburban counties are again growing somewhat faster than the most urban counties.  Joel Kotkin, the patron saint of suburbanization, hails this as a decisive turning point:

The trend of people moving to metros with the densest urban cores—a mainstay of media coverage—is clearly over.

Writing at Bloomberg, Conor Sen chimes in:

But every year that passes, the more population patterns are starting to look like the old sprawling dynamic serving suburban and exurban demand.

CityLab tracks down Joel Garreau, who coined the term “Edge Cities” back in the 1980s, to reprise his views on suburban triumphalism.  He dismisses city growth as limited and exceptional, confined to just a few rich cities.

Garreau dismissed the idea of a return to central cities as much ado about a handful of coastal locales that just happen to be home to most of the people who write and read about these kinds of things. The only downtowns that are seeing significant growth, he said, are New York, Washington, Boston, Seattle, Portland, Oregon, and San Francisco. All of them are very wealthy, and very non-representative.

“Yes, in six cities it’s happening, where the children of the people who read The New York Times live,” he said. “But I’m a numbers guy, and you’ll find that the vast majority of Millennials don’t live in the old downtowns but in the suburbs, like sensible human beings. Because that’s where the jobs are, or where their parents’ basements are if they don’t have a job.”

A closer look at the numbers

Let’s start with the population estimates. Jed Kolko has done a neat job of parsing census data, dividing counties in the nation’s large metro areas into broad categories of city and suburb. He correctly notes that in the aggregate, counties in the “suburb” category have added people faster in the past year than counties in the “city” category.

But as we’ve pointed out before, counties are fundamentally the wrong unit for measuring urban growth. County boundaries vary tremendously from region to region, and seldom match up with dense city centers. (See the discussion of Chicago and Cook County, below, for a detailed explanation of how county data conceal rapid growth in the core by combining it with declining first-tier suburbs).

It’s also important to put these numbers in a broader historical context:  While city growth has slowed and suburbs have rebounded in the past couple of years, compared to the first decade of the 2000s, city growth has accelerated, while suburban growth has slowed. Growth in the lowest density suburban counties, which was running at more than 2 percent per year a decade ago (the gray line below), is still well below that level. Higher density suburbs are also below their growth rate of the previous decade (orange line). While growth in urban counties has slowed, it’s higher than it was in the prior decade (blue line). What we’ve really seen is convergence: low density counties wildly outpaced urban ones a decade ago; now all three categories are growing at much more similar rates. (And remember: this is the county level data that conceals what’s happening in the densest urban neighborhoods.)

Source: Jed Kolko

It’s also a mistake to look just at domestic migration data. For example, Kotkin and Cox repeatedly point out that big cities like New York and Los Angeles are experiencing high levels of domestic out-migration.  That’s actually always been true because these cities are immigrant gateways. Domestic migration only counts those who lived in the US in the past year, and so misses foreign immigrants. The combination of births, international immigration and still considerable gross flows of in-migrants have fueled growth in large cities.

Who is moving to cities?

The raw numbers tell part of the story, but what’s equally important is to look at who is moving to cities. We’ve been carefully following these data for more than a decade, and we’ve seen a consistent, and accelerating trend:  it’s well educated young people who are fueling city center growth. In our 2014 report, the Young and Restless, we showed that close-in urban neighborhoods–those places within three miles of the center of the central business district–have been adding college-educated young adults twice as fast as their surrounding metropolitan areas.

We updated this analysis earlier this year, looking at data on city population growth. The number of 25-34 year olds with four-year degrees living in large cities is growing almost twice as fast as that demographic group is growing outside those cities. In just the past five years there’s been a  19 percent increase in 25-34s with a college degree. The preference of young workers for urban living is drawing more firms to city centers, generating additional jobs for workers of all ages.

Some, like Conor Sen, try to claim that this back to the city movement will peter out as millennials age and have children. But this view conflates life cycle changes with generational shifts in preferences: older adults are always somewhat more suburban, but less so than in previous decades. And as Millennials age out of the 25-34 year old age group, they are being replaced by an equally numerous next generation that is similarly, if not more, urban-oriented.

Prices tell us people value urbanity and that we’ve got a shortage of cities

Focusing just on population changes implicitly makes a naive and questionable assumption about revealed preference: that everyone is getting to live exactly where they would prefer. But we have good reason to believe that lots more people would like to live in cities if we built enough housing for them. The most powerful evidence on this point is the high and rising price premium being paid for urban housing. A variety of studies have confirmed this fundamental shift. Here’s the data on relative home prices (in constant dollars) compiled by Columbia University economist Lena Edlund and her colleagues. Homes located in the urban center now command much higher prices than they did two or three decades ago.

As we’ve noted this relative price increase in the most urban neighborhoods compared to suburbs constitutes a kind of “Dow” of cities, and reflects the value Americans attach to urban living. Data compiled by Fitch from the Case-Shiller repeat sales home price index shows that over the past 15 years or so, prices for the densest urban neighborhoods have increased about 50 percent faster than for suburban housing.

The urban price premium is driven by a growing demand for cities, but more recently, it also reflects a constrained supply. For a while, cities could grow faster than suburbs by reducing vacancies. But vacancies have shrunk, and now city population growth is bumping up against the limits of the urban housing stock. As is most evident in places like San Francisco, demand for urban housing has increased much faster than supply, with the result that prices are high, and many people who would like to live there (or who did until recently) can’t afford the rent.

So, far from illustrating the demise of the demand for cities, the somewhat faster recent growth of suburbs is fueled in part by the fact that we haven’t addressed our shortage of cities.

Lessons from the Windy City

It’s illuminating to look to see how these trends play out in a particular city. In some respects, Chicago is the poster-child for city population decline. But if you look in detail, you see a strong urban center and decline in the suburbs. County-level data mask the growth in the downtown Loop and nearby neighborhoods. It’s actually the case that these parts of the city of Chicago are booming and gaining tons of smart young residents.

Keep in mind that Cook County, Illinois, encompasses the city of Chicago–and dozens of surrounding suburbs. Cook County is losing population in the aggregate, but chiefly in its aging, first-tier suburbs. But Cook County is huge, and only partly the center of the metro. Downtown Chicago is booming (with 47 high rises under construction–mostly for new housing). Our analysis earlier this year shows that between 2012 and 2016, the City of Chicago (not Cook County) added 42,000 25-34 year olds with a four-year degree, an increase of 17 percent.

Meanwhile: When you look at Chicago’s suburban counties, they’re underperforming their previous growth, and many are declining in population. Chicago Business tracked down local demographers who’ve carefully studied change within the metro area. What they find is a surging center and declining suburbs:

. . .  according to demographers at the Chicago Metropolitan Agency for Planning, Cook is just returning to the slow decline that pretty much halted during the subprime mortgage recession, in which migration nationally slowed to a crawl.

The real thing that’s changed is that outer counties such as Will and McHenry aren’t growing nearly as fast as they did in prior decades, combined with a sharp reduction in immigration to the Chicago area, according to Elizabeth Schuh, principal policy analyst at CMAP, which represents the seven Illinois counties in the region but not those in northwest Indiana or southeast Wisconsin. “Almost all of the counties except Kendall are losing,” says Schuh, who’s had a few days to study the data. (emphasis added)

According to a new study by Schuh and CMAP’s Aseal Tineh, the population drop since 2006 among native-born residents is exclusively concentrated among those who earn less than $75,000 a year and, in the case of the foreign-born, less than $25,000. But the region has gained more than 350,000 residents since 2006 who earn at least $75,000 year, she says. (Those figures are not adjusted for inflation.)

Slow employment growth in many sectors likely is the reason, she says. “Our job growth is just lower than (in) other regions,” for middle-skilled positions in fields such as manufacturing and administration that require some post-high school education but not a college degree. But among the college-educated, the region continues to grow.

When you look city-by-city at the data, it’s apparent that urban centers are extremely robust, attracting more talented young workers, and the firms who want to employ them. What cities everywhere are bumping up against, however, is the slowly growing supply of housing in great urban neighborhoods–a fundamental fact reflected in the growing city housing price premium.

Diverse, Mixed Income Neighborhoods Maps

This page contains maps showing the nation’s most racially and ethnically diverse neighborhoods, and those with the highest levels of income mixing, prepared for City Observatory’s Diverse, Inclusive Neighborhood report.  These web-based maps let you zoom in to a particular metropolitan area, observe racial/ethnic and income patterns, and inspect data for individual census tracts. Tracts shaded yellow have a racial and ethnic diversity index score in the top 20 percent of all urban census tracts nationally.  Tracts shaded blue have an income diversity index in the top 20 percent of all urban census tracts nationally.  Tracts shaded green are both racially and ethnically diverse and have a high level of income diversity; they score in the top 20 percent of all urban census tracts nationally on both indices.

America’s Most Diverse, Mixed Income Neighborhoods

Map Navigation Instructions

To search for a particular city, enter text in the search box.

To zoom to a location, click and drag to center the map and then click on “+”, to zoom to the desired scale.

To see data values for an individual neighborhood, click on that neighborhood, and then click on the name of the census tract in the pop-up box to see values for that tract.

Data are available for the 52 most populous metropolitan areas in the United States. Census tracts shown are those with a population density of at least one person per acre. Data are from the 2011-15 five-year American Community Survey.

 

Inclusionary Zoning’s Wile E. Coyote moment

You won’t know that your inclusionary zoning program is wrecking the housing market until it’s too late to fix.

How lags and game theory monkey wrench inclusionary zoning.

One of the toughest problems in economics and economic forecasting is dealing with markets and decisions that involve considerable lags. A lag is the length of time between when an economic actor recognizes a changed circumstance and when any action they take has an effect.  The Federal Reserve Board is always trying to anticipate future conditions in the economy (incipient inflation or a potential recession) because it knows any policy action it takes will require several months to influence the larger economy.

Lags are an especially important problem with investment decisions.  Investments, particularly those involving new construction, often involve significant lags. In the housing market, for example, it can take a couple of years between the time a developer identifies a potential market, and goes through all the steps of searching for and buying property, developing plans, submitting plans for government approval, lining up financing, hiring a construction firm, and actually building something.  A key cause of the run up in rental prices in many cities was this lag, or what we might technically call the “temporal mismatch” between demand (which changed quickly) and supply (which took several years to respond).

Lags can also play out in the economic data–sometimes the negative effects of a policy don’t show up right away–it can take months or years before the effects of some decisions are apparent–and by then, it can be too late to do anything to correct the underlying problem.  You can call this the Wile E. Coyote problem.

As you know from the Warner Brothers cartoon, the Coyote can go wildly charging off a cliff, and keep going forward for some time before he looks down, and then discovers there’s no longer any ground beneath him.

The open question right now is whether the City of Portland’s housing market is in its Wile E. Coyote phase. To hear the City of Portland housing bureau tell the story, everything is fine:  Housing permit activity is chugging along at relatively high levels.

But what this misses is that the pipeline to produce housing is a long one, and there’s abundant evidence that it’s drying up.  There was a surge of housing proposed to beat the city’s deadline for complying with new inclusionary housing rules (February 1, 2017); since then, new proposals have slowed sharply.

Portland’s Daily Journal of Commerce reports that a number of developers say that apartment projects in Portland no longer provide an adequate rate of return to attract financing:

“Inclusionary housing pushes the return on costs below a level that is really acceptable to capital markets,” said Brian Wilson, a partner with Mainland NW. “It is very difficult to underwrite them even in the best of circumstances.” Mainland has planned to develop several properties in the Cathedral Park neighborhood of North Portland, but the developer has been unable to raise adequate equity, Wilson said.

The housing industry group Up for Growth has tabulated figures on proposed new development and concludes that there’s been a significant drop-off in proposed construction. They also report a substantial uptick in proposed new apartment developments smaller than 20 units (to which the city’s inclusionary zoning requirements don’t apply). That signals developers are actively seeking to avoid the requirements, and (unfortunately for the city’s housing supply) may be under-building on some sites that could accommodate more housing.

Source: Up for Growth

For now, city officials appear to be in denial that anything’s amiss.  But there are some cracks in that optimistic facade.  As originally adopted, the city’s inclusionary zoning ordinance provided for a step-up in the inclusionary set-asides. In the first year of the program, developers were required to set aside 8 percent of units for households at 60 percent or less of Area Median Income (AMI), or 16 percent of units for households at 80 percent of AMI.  After the first year, those shares were scheduled to step up to 10 percent and 20 percent respectively.  But the city backed off implementing that requirement.  Why, if the inclusionary zoning program was working as planned, did the city drop those requirements?
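The set-aside arithmetic above is straightforward; here is a minimal sketch applied to a hypothetical 100-unit building. The assumption that any fractional unit rounds up to a whole unit is ours, not taken from the ordinance.

```python
# Sketch of the inclusionary set-aside arithmetic for a hypothetical
# 100-unit building. The round-up rule is an assumption, not a provision
# quoted from Portland's ordinance.

def required_units(total_units, pct):
    """Affordable units required at a set-aside of `pct` percent, rounding up."""
    return -(-total_units * pct // 100)  # integer ceiling division

total = 100

# Year-one rules: 8% of units at 60% of AMI, or 16% of units at 80% of AMI.
year_one = (required_units(total, 8), required_units(total, 16))

# The scheduled (and later shelved) step-up: 10% and 20% respectively.
step_up = (required_units(total, 10), required_units(total, 20))

print(year_one)  # (8, 16)
print(step_up)   # (10, 20)
```

Integer arithmetic is used deliberately here, since floating-point rates (0.08 × 100, for instance) can round up spuriously under a ceiling rule.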

It’s a sign that the city isn’t particularly comfortable with its claims of success.

More alarmingly, the situation the city now finds itself in is problematic for reasons suggested by game theory.

Game theory and inclusionary zoning

Game theory studies how people respond to incentives. One well-known game theoretic workhorse, for example, is the prisoner’s dilemma, which considers the incentives a criminal co-conspirator has to turn state’s evidence against an accomplice in return for a lighter sentence.

Game theory is important to policy, particularly when it comes to the timing of investment. The prospect of changes in the rules often shapes behavior.  Changes in the tax rates of capital gains, for example, can prompt investors to do things differently (leading to an uptick in sales before increases in tax rates, or a deferral in sales until after tax cuts take effect).

We can think of apartment developers and investors as playing in a housing game.  One part of the “rules” of that game is the city of Portland’s land use planning process, and the players in this game will keep a close eye on possible future rule changes in deciding whether, and when, to invest in Portland.

New land use planning requirements, like Portland’s inclusionary zoning rules–adopted in December 2016, but which took effect in February 2017–are subject to just this kind of game theoretic effect.  The city’s ordinance requires projects filed after February 1 to set aside a portion of newly constructed units for low and moderate income families. Projects that filed for land use approval before the law went into effect were exempt.

Unsurprisingly, there was a land rush of developers to file under the older, laxer rules.

From a policy standpoint, that was a huge plus, at least in the short run.  Despite concerns that the cost of complying with the IZ requirement would reduce apartment investment, any negative effect has, at least so far, been more than offset by the flood of new development approval applications.

So the first two years of inclusionary zoning in Portland have been a game-theory win-win for housing affordability. The threat of tougher future requirements prompted a whole lot of investment to happen much earlier than it otherwise would have, and new developments, added to those already under construction, have helped deliver a lot more new apartments in Portland. That came at a good time, and has clearly helped with affordability– vacancies have ticked upward, and rents have leveled off and even declined.  Yardi Matrix says that rent increases in Portland in the past 12 months were lower than in 19 of the 20 largest metro markets–a huge reversal from just four years ago, when Portland rents were increasing faster than almost any other market in the US.

As everyone acknowledges, though, that one-time surge in applications is just temporary.  As those projects get built, it’s increasingly apparent that very little new development is following in its wake. There’s credible evidence that development is slowing because the IZ requirements dramatically reduce the attractiveness of new apartment investment in Portland. And now is when the game theoretic aspects of the policy process come back to bite the city and threaten housing affordability.

So here’s the game theory problem:  If (as we believe) the IZ requirements are stifling development, what does the city do?  Broadly speaking, it has two choices:  It can “stay the course” and maintain the adopted IZ requirements, or it can lessen them.

If it pursues option one and signals it’s going to stay the course, it probably just prolongs the pain. If investors are convinced that Portland will stay the course no matter what, they’d probably adapt to the new regime–some would move forward with their plans, but many would likely leave the market. As several developers have noted, the market will rebound only when the shortage becomes so severe that rents rise enough to offset the costs of complying with the IZ ordinance–which will mean that the IZ policy will have failed in its primary objective of making housing more affordable.

Option 2 may ease the pain of the IZ requirements, but potentially comes at a price. If the city signals it’s considering relaxing or adjusting the requirements, that would likely have the  perverse effect of encouraging investors to wait and see, at least until new rules are adopted.  Why move ahead with a project under expensive or burdensome IZ regulations now, when, if you wait a little while, you might get a much more favorable deal?  A prolonged debate over whether to walk back its IZ requirements could further dry up the development pipeline.

This puts city officials in a difficult policy bind:  They can insist they’ll execute the law as written, even if it has adverse consequences (which at least provides some certainty), or they can entertain doubts and float the possibility of liberalizing or relaxing the requirements–which, unless they act immediately, invites investors to postpone any plans to work in Portland.

Option 2 effectively constitutes the city’s “Wile E. Coyote” moment:  when it looks down and sees no ground holding up its IZ policy, things will start falling, fast.

As a result, we’re quickly moving from the “can’t lose” to “can’t win” portion of the housing affordability policy game. It will be interesting to see how the city responds.

Portland doesn’t really want to make housing affordable

Actions speak louder than words; blocking new housing will drive up rents

Nominally, at least, the Portland City Council is all about housing affordability.  They’ve declared a housing emergency. In the last general election, city voters approved a $258 million bond issue to build more affordable housing. The Council has made permanent a city ordinance requiring landlords to reimburse tenants for moving expenses if they pursue a no-cause eviction, or if the tenant moves after a 10 percent rent increase.

But ultimately housing affordability in the Rose City, as everywhere, hinges on whether enough supply gets built to accommodate the growing demand for urban living. And the city’s zoning code and project approval requirements are where the proverbial rubber meets the road in terms of expanding housing supply. In a series of recent actions, the Portland City Council is effectively sabotaging the supply of new housing in a way that will ultimately worsen the city’s affordability problems.

DENIED: The hundreds of people who would have lived here will now be bidding up rents elsewhere in Portland. (Next Portland).

Last week, the council voted to deny a building permit for a proposed 17-story, 275-apartment tower in the city’s booming Pearl District. Neighbors opposed the tower because it blocks views (from their recently completed condominium towers) of one of the city’s iconic bridges. Because it abuts the city’s Willamette River greenway, the building is subject to the city’s design review process. And while commissioners said they weren’t caving in on heights or views, they claimed that the building was somehow at odds with the city’s greenway policies. Portland for Everyone’s Michael Anderson has an excellent in-depth review of the proceedings in a post at Medium: “Open Season for NIMBY Appeals:  Portland blocks 275 homes after Pearl District neighbors ask it to.” Briefly:

. . .  the council unanimously voted to give the anti-housing activists exactly what they had been asking for: no new homes on the site.

It’s ironic because one of the virtues of Oregon’s planning system is that, for the most part, new developments that are allowable under a land use plan must be approved using “clear and objective” approval standards.  The idea is that the city should be bound by what’s in its plan: so if an area is designated for apartments, the city is obligated to approve permits for an apartment in that area.

A loss of certainty kills off housing investment

Arbitrarily invoking a vague feeling of discomfort about whether a project is consistent with the greenway–and overturning a vote of the city’s design review commission in the process–sends a clear signal to developers that they can’t rely on what’s written in city plans and policies.  In this case, the developer may be forced to return to the drawing board, and submit an entirely new proposal–and again run the gantlet of public outcry, and again confront a largely subjective determination as to whether the development meets with Council approval. Other developers are likely to heed this lesson.

As we’ve written at City Observatory, the city’s recently adopted inclusionary zoning law adds yet another layer of uncertainty. The law imposes a wide range of conditions on new 20-unit and larger apartment buildings, generally requiring that 20 percent of units be affordable, and that affordability be guaranteed for 99 years, and that the apartments be comparable in size and finishes to market rate units. The added cost of subsidizing such units is supposed to be made up by some combination of tax abatements, height and density bonuses, and parking requirement waivers. But since virtually no one has yet made their way through this process, it’s difficult (if not impossible) for developers to accurately assess how much time and money compliance will require. Again, this increase in uncertainty has a decidedly chilling effect on prospective investment. (New apartment proposals have come to a near standstill since the inclusionary housing ordinance went into effect a year ago.)

There’s one more problem: no one sees the buildings that don’t get built as a result of these disincentives and uncertainties. In the case of the now-denied Fremont Place, we can publish an accurate rendering of what the building would have looked like, had it been approved, and tell you that 275 households will now be competing for other housing in Portland. But going forward, many potential housing projects won’t even advance to the stage of having drawings, or marching through the approval process, because of uncertainty. Like Conan Doyle’s hound that didn’t bark, we won’t see the housing that doesn’t get built. But we will likely feel it, as the still growing demand for urban living presses up against a finite urban housing stock.

 

Housing reparations for Northeast Portland

Attention freeway builders! Want to make up for dividing the community and destroying neighborhoods? How about replacing the homes you demolished?

One of the carefully crafted talking points in the sales pitch for the $450 million proposed Rose Quarter I-5 freeway widening project in Northeast Portland is the idea that it is somehow going to repair the damage to a community split asunder by a combination of road building and urban renewal in the 1960s. The Oregon Department of Transportation (ODOT) has created the illusion that the slightly widened freeway overpasses it is building will be aesthetic “covers” that will somehow knit the neighborhood–historic center of the region’s African-American community–back together.

Over 300 homes were demolished along Minnesota Ave. (City of Portland Archives)

And political leaders have jumped on the bandwagon to make this point.

Portland Mayor Ted Wheeler went so far as claiming that the project,

“. . . restores the very neighborhood that was the most impacted by the development of I-5 and that’s the historic African American Albina community.”

And also following up in an interview:

One of the parts of this that nobody talks about, that frankly is the most interesting to me, is capping I-5 and reconnecting the street grid for the historic Albina community. And that of course is mostly a bicycle and pedestrian play. So I think this is being mischaracterized somewhat when people say, ‘Oh this is just a freeway expansion and it’s never going to meet its goals of congestion reduction.’ This is far from just focusing on just congestion reduction, this is an opportunity to restore one of our most historic — and not coincidentally — African American neighborhoods in this community.

City Commissioner Dan Saltzman added:

These new, seismically upgraded bridges will provide better street connections, improved pedestrian and bicycle facilities, a new pedestrian and bicycle-only bridge as well as lids (or covers) of the freeway that can provide much needed community space.

ODOT is selling the project as a way of fixing the damage the freeway did to the area.

Public Policy and Community Affairs Manager Shelli Romero talked up the agency’s “environmental and public process” which will include:

“. . . a robust understanding, research and engagement strategy of the historically wronged African-American community and other communities of color. We understand the historic inequity concerns and will engage all communities in this project,” Romero promised.

Noble sentiments to be sure. But in our view this project does nothing to right the wrongs of freeway construction. Despite the high-minded rhetoric, widening the freeway repeats the same errors made a half-century ago and makes the neighborhood’s livability worse. If these leaders are serious about redressing the historical wrongs done here, they could do much better.

The freeway and the damage done

But let’s step back for a minute and look at what the construction of the Rose Quarter Freeway did to the North and Northeast Portland neighborhoods it slashed through in the 1960s. Originally, I-5 was called the Minnesota Freeway, not because it led to that state, but because it followed the route of Minnesota Avenue. A few vestiges of that street remain, but mostly, I-5 runs in a trench that was excavated right down the middle of the former Minnesota Avenue (in some places also wiping out parts of the adjacent Missouri Avenue).  The freeway runs for 3 miles from Broadway to just north of Lombard Street.  In that stretch of road, the city also ended up dead-ending some 25 East-West cross streets. The southern portion of the route passed through the historically African-American neighborhoods of Albina.

The Minnesota Freeway cut a trench through N. Portland (City of Portland Archives)

 

When it built the freeway, the city condemned or purchased hundreds of homes in the neighborhood. We haven’t been able to locate any official Oregon State Highway Department records, but contemporary press accounts say at least 327 homes were demolished for the freeway.

Portland Oregonian, May 8, 1959

The highway department paid as little as $50 for homes, and because it judged that there were a sufficient number of vacant units in the region, it didn’t build any replacement housing. (Decades later, the highway builders did construct concrete sound walls to buffer the adjacent neighborhoods from the freeway noise.) The City of Portland tried to find money to help offset the displacement, but was barred from using urban renewal funds for the project. There’s no evidence the Oregon State Highway Department replaced even one of the more than 300 homes it demolished.

It’s now mostly lost to memory, but real people were displaced by freeway construction. We can get a glimpse of their presence and identity by looking at the City Directory for N. Minnesota Avenue for 1958, which lists the names and house numbers of all the families living along the street. George Palo, Victor Burns, Andre Guyot, Eldon Methum, John Pesola, Bernard Kolander, Arvid Renko, Lydia Kyrkus, Perry Anderson, Harold Roberts, Wesley Koven, John Mattila, Bernard Henry, Percy Stevens, Ida Davis, Reverend Monroe Cheek, and hundreds of others were living on Minnesota Avenue.

In 1958, hundreds of families lived on N. Minnesota Ave.

 

Just four years later all these people were gone.  The 1962 City Directory contains a blank spot where these houses and their hundreds of Portland residents had lived. Not a single building or resident remained on the blocks between N. Cook Street and N. Going Street.

By 1962, there were no houses left on N. Minnesota Ave. between Cook and Going Streets

 

With hundreds fewer homes and residents, there were fewer customers for local businesses, fewer students for local schools, and less property tax revenue to pay for public services. The quality and health of this neighborhood was sacrificed to speed through-traffic and serve growing numbers of suburban car commuters. The freeway had other effects on the neighborhood. It bisected the attendance area for the Ockley Green Elementary School, meaning many students could no longer easily walk to school. Several local neighborhood streets were transformed into busy, high speed off ramps. In the planning process, local officials raised these concerns with the state highway department, and were offered assurances that “every effort” would be made to solve these problems. The words spoken by highway department officials then sound almost identical to the assurances offered by ODOT spokesmen in response to concerns raised today about the Rose Quarter project. University of Oregon historian Henry Fackler describes the 1961 meeting convened by the city to address the effects of street closures:

At the meeting’s conclusion, state engineer Edwards assured those in attendance that “every attempt will be made to solve these problems.” The freeway opened to traffic in December 1963.  No changes were made to the route.

As part of today’s Rose Quarter freeway widening process, ODOT is holding meetings similar to the ones it held 55 years ago. When the Cascadia Times recently tried to follow up on concerns about the construction and air quality impacts of the proposed freeway widening project on local schools, it was given a similarly vague reassurance:

ODOT declined to make Johnson or Braibish available for comment. But spokesperson Don Hamilton pointed out that any ODOT construction is several years out, and that planning is in a very preliminary stage. He noted that PPS [Portland Public Schools] is moving on a different time frame than ODOT, but that when all is said and done, “We’ll work with them to make sure their needs are met …”

A tale of two agencies

Public leaders have acknowledged that freeway building and urban renewal devastated neighborhoods in Portland (and in cities around the US). It’s one thing to acknowledge a past transgression, but the sincerity of that admission is measurable by whether there’s any actual willingness to repair the damage done.

In recent years, the Portland Development Commission (now Prosper Portland) has publicly acknowledged the role its urban renewal programs played in undermining North and Northeast Portland neighborhoods.  It has dedicated a substantial portion of its tax increment financing moneys in the area to building new housing. The city even has a program to identify households displaced from the neighborhood by urban renewal, and give them preferential access to newly constructed subsidized housing.

In contrast, the Oregon Department of Transportation is proposing to double down on the scar it carved through the neighborhood. Its proposal widens the freeway. Despite talking points to the contrary, three key design features of the project show that the agency has the same old indifference to its impacts on the neighborhood. First, the widening project will run even closer to Harriet Tubman Middle School, so close, in fact, that construction may undermine part of the building’s foundations, and worsen air quality. The Portland school district is contemplating a million dollar proposal for a wall and vegetation to shield the school from existing freeway emissions. Second, as we’ve discussed at City Observatory, the project will demolish the nearby Flint Avenue Bridge, a key low-speed, bike-friendly neighborhood street that crosses the freeway. Third, the project re-arranges the local streets connecting to the freeway into a miniature “diverging diamond” interchange, designed to speed car traffic, but creating a hostile and dangerous situation for pedestrians. Far from righting historical wrongs, ODOT is embarked on an expensive plan to repeat them, and inflict further damage on this neighborhood.

What ODOT should do: Build housing

If it really wants to make amends for the extensive damage freeway building did to North and Northeast Portland, and fulfill Mayor Wheeler’s pledge of “restoring” the neighborhood, a good place to start would be by replacing the housing demolished to build the Minnesota Freeway in the 1960s.  The average price of single family homes adjacent to the freeway (which is no doubt negatively affected by noise and air pollution) is about $424,000.  If ODOT were to build 330 or so homes to replace those lost in the 60s, the total cost would be approximately $140 million.
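The $140 million figure is simple arithmetic; here’s a quick back-of-the-envelope sketch using the home count and average price cited above:

```python
# Back-of-the-envelope replacement cost for the demolished housing,
# using the figures cited in the text: ~330 homes at an average
# nearby single-family price of $424,000.
homes_demolished = 330
avg_home_price = 424_000  # dollars

replacement_cost = homes_demolished * avg_home_price
print(f"Estimated replacement cost: ${replacement_cost:,}")
# The product is $139.9 million, which rounds to roughly $140 million.
```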

It’s worth keeping in mind that for the past 50 years, Portland has been negatively affected by the loss of that housing. Not only were its original residents displaced, but the loss of that housing meant fewer options for people who wanted to live in that neighborhood, fewer students at local schools, fewer customers for local businesses, and less property tax revenue for the city and schools.  More housing in this neighborhood would come at a propitious moment: helping alleviate a housing shortage, and providing more opportunities to live in one of the region’s most walkable, bike-friendly locations.

This freeway slashed through Portland neighborhoods, destroyed housing, and displaced families. Widening that freeway–at the cost of half a billion dollars–does nothing to right that wrong. It actually repeats the same mistake. If it’s serious about fixing the damage it’s done, it could do something very different and meaningful:  build homes.


This post has been revised to correctly identify Missouri Avenue as the other street affected by freeway construction, and to correct formatting errors in the originally published version.

Barack Obama on Gentrification

. . . we want more economic activity in this community, because that’s what creates opportunity and with more economic opportunity it does mean that there’s going to be more demand for all kinds of amenities in the community. So you can’t have one without the other. You can’t say we want more jobs, more businesses and more opportunity for our kids but otherwise we want everything to stay exactly the same. It just doesn’t work that way. But what we do is make sure that we’re working with organizations and institutions in the community to preserve affordable housing, to make sure that it is residents that are benefiting.

On February 27, former President Barack Obama appeared at a community forum in Chicago to answer public questions about the proposed Presidential Center to be built in Jackson Park.  One part of the conversation dealt with gentrification.  You can watch the President’s answer here.  We’ve also transcribed his remarks for your reference.

Questioner: President Obama, I have a question how is the presidential center working to revitalize the South side without pushing out existing residents like myself.

Barack Obama:

Well, you know this is really an important issue. Some people have asked, by the way, why did we locate on a park? Part of the reason is, as I described earlier, when you look at the most vibrant parks in the world, whether it’s Central Park in New York, or Grant Park downtown or Lincoln Park, or Luxembourg Park in Paris, what characterizes great parks is activity and life and movement and people being around and people being outside and stuff going on.

Which isn’t to say you don’t want quiet spaces, contemplative spaces.

It’s a lived-in place. It’s not behind a glass case to look at. It’s something to be in—that’s the point.

But one of the things that we were also committed to is making sure that we weren’t displacing residents in the construction of the actual facility and we will not be.

Now, the issue that then gets raised is “Okay, that’s true but isn’t it true that once this gets built and all these visitors are coming and everybody sees how pretty Jackson Park is and how nice the lakefront is, won’t more people want to live down here.” And there is constantly a balance we’ve got to strike between making sure that existing residents are benefiting from increased economic development, benefitting from increases in home values and benefiting from more businesses being active and all that revenue, because that creates more wealth and more jobs and so forth,

We have to balance that with the fact that we want more economic activity in this community, because that’s what creates opportunity and with more economic opportunity it does mean that there’s going to be more demand for all kinds of amenities in the community. So you can’t have one without the other. You can’t say we want more jobs, more businesses and more opportunity for our kids but otherwise we want everything to stay exactly the same. It just doesn’t work that way. But what we do is make sure that we’re working with organizations and institutions in the community to preserve affordable housing, to make sure that it is residents that are benefiting. Those are the kinds of plans, activities, foresight that we have to have in order to get that perfect balance: revitalizing and renewing the community but also making sure that people who are already living there are benefiting from it.

I know that I heard a couple of people concerned about like, maybe rents might go up. Well, here’s the thing. If you go into some neighborhoods in Chicago where there are no jobs, no businesses and nothing’s going on, in some cases, the rents are pretty cheap, but our kids are also getting shot on that block. So what I want to do is make sure that people have jobs, kids have opportunity, the schools have a better tax base and if the rent goes up a little bit, people can pay it because they’ve got more money. And if they’re seniors, if they’re on fixed incomes, if they’re disabled, then we’ve got to make sure that there’s a process in place to encourage and plan for affordable housing units being constructed there.

But here’s the one thing I will say: I think a lot of times people get nervous about gentrification, and understandably so. But what I will also say is this: I first came to Chicago in 1985, and was on the South Side for – just doing the math real quick – twenty-some years, before moving to Washington because of the presidency. It is not my experience during that time that the big problem on the South Side has been too much development, too much economic activity, too many people being displaced because all these folks from Lincoln Park pouring into the South Side. That’s not what’s happened. I mean, it’s happened in some places along, you know, near like West Loop area; most of that has happened right around the city. There is so much room. Think about all the abandoned buildings and the vacant lots that are around here. We’ve got such a long way to go in terms of economic development before you’re even going to start seeing the prospect of significant gentrification. Malia’s kids might have to worry about that. Right now, what we’ve got to worry about is you have broken curbs, and trash and boarded up buildings, and that’s really what we need to work on.

Cloaking a weak argument in big—but phony—numbers

Journalists: Stop repeating phony congestion cost estimates. They’re just weak arguments disguised with big numbers.

This month The Economist has an excellent special report exploring the prospects for autonomous vehicles. They seem to be coming faster than many people anticipated, and they pose some big challenges and opportunities for cities. This otherwise very useful contribution to the conversation is marred, unfortunately, by The Economist also posting as fact the congestion cost estimates produced by traffic monitoring firm Inrix.

As regular readers of City Observatory know, we’ve pointed out serious problems with the Inrix congestion cost estimates.

It’s painful to watch an otherwise intelligent journal like The Economist uncritically reproduce the demonstrably fictitious congestion cost estimates. Carrying on in the tradition of the Texas Transportation Institute, Inrix now annually produces some VERY SCARY NUMBERS about how much congestion supposedly costs travelers in cities around the world.

Congestion cost estimates are the horror fiction sub-genre of what we’ve called “Hagiometry.” Hagiography is flattery in prose form; hagiometry is flattery with numbers; and congestion cost estimates are designed solely to use big numbers to scare people into believing that a problem is somehow worse than it really is.

The Inrix figures are an argument—and a remarkably flimsy one at that—masquerading as economic statistics.  The implicit argument is that there is some state of the world in which people could travel just as fast at peak hours as they do when there’s no traffic on the roads. The cost estimates are constructed by adding up how many more minutes it takes to travel when there’s traffic compared to when there are no traffic delays, and then multiplying that by some value of time. The math may be right, but the assumption embedded in that argument—that there’s some way to build enough lanes to accommodate all that traffic—is just wrong.
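To see how thin the method is, here is a minimal sketch of the delay-times-value-of-time arithmetic described above. Every input is a hypothetical illustration, not Inrix’s actual data; the point is the structure of the argument, not the number it produces:

```python
# Sketch of a congestion-cost calculation of the kind described above:
# (peak travel time - free-flow travel time) x trips x value of time.
# All inputs are made up for illustration.

free_flow_minutes = 20         # trip time with no other cars on the road
peak_minutes = 32              # the same trip at rush hour
daily_peak_trips = 100_000     # hypothetical commuters making the trip each weekday
value_of_time_per_hour = 18.0  # assumed dollar value of an hour of travel time
work_days_per_year = 250

delay_hours_per_trip = (peak_minutes - free_flow_minutes) / 60
annual_cost = (delay_hours_per_trip * daily_peak_trips
               * value_of_time_per_hour * work_days_per_year)
print(f"'Cost' of congestion: ${annual_cost:,.0f} per year")
# A big, scary number -- but it implicitly assumes everyone could travel
# at free-flow speed at rush hour, which no feasible road system delivers.
```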

In the words of Raising Arizona’s Nathan Arizona:  Yeah, if a frog had wings, it wouldn’t bump its ass a-hoppin’.

A clever quantitative biologist using the Inrix methodology could easily compute the number of excess derriere contusions the world’s frogs suffer every day because of their unfortunate decision to choose jumping rather than flying as a means of travel.  And that would provide exactly as much insight into transportation policy as does the Inrix report.

There’s no way to build enough roads to eliminate congestion, at any price

The essence of the Inrix calculation is this: if there were no other cars on the road when you wanted to drive to work each morning, and also when you drove home each evening, here’s how much time you’d save, and here’s what that time would be worth to you. But would it be fiscally or even physically possible to build enough roadway space to give everyone the same level of service at 5pm every day as you get when you’re driving at, say, 2am?  Of course not.

One could just as easily add up and value the total amount of time people in the world spend traveling between any two sets of points, and count that as the cost of not having “Star Trek” style matter-transporters. But, you would argue, that can’t realistically be regarded as a “cost” because such transporters don’t exist. And that’s precisely the point: there’s no way to build a road system that allows peak hour travelers to travel at the same speeds they would enjoy on empty roads.

And even if you did build a stupendous amount of additional roadway capacity (and could repeal the fundamental law of road congestion, in which the increment of additional capacity generates more peak hour travel thus replicating congestion), there’s absolutely no doubt that the cost of constructing that roadway would greatly outstrip the supposed “benefits” of eliminating congestion. We don’t have to speculate about this: when road users are asked to pay even a fraction of the cost of building additional road capacity, in the form of tolls, they readily indicate, by voting with their feet (or wheels), that they attach very little value to travel time savings. Notice, for example, the case of the I-65 Ohio river crossing in Louisville. The state highway department doubled the size of the bridge from 6-lanes to 12-lanes, and then started charging a toll. Almost immediately traffic on the bridges fell from 120,000 vehicles daily, to about 70,000.

Make it stop!

Here’s the point:  Anyone who claims that congestion has a cost–meaning a real, net social cost–has to propose some transportation system that would (a) eliminate all of the delays that they’ve counted, and (b) do so for less than the supposed value of the time lost to congestion. If they can’t do that, they haven’t shown that congestion has any real costs at all.

Credulous reporters need to start looking past the dizzying array of numbers and ask some hard questions about the assumptions behind them. Repeating fictitious claims about the supposed cost of congestion isn’t helping their readers understand, much less solve, the world’s urban transportation problems.

Junk food America elected its president

The states with the worst diets voted disproportionately for Donald Trump

A powerful new study uses big data to shine a light on our eating habits. Using data from grocery store scanner records, Hunt Allcott, Rebecca Diamond, and Jean-Pierre Dube (researchers from New York University, Stanford and the University of Chicago, respectively) developed a detailed profile of what consumers buy in different parts of the country. Linking this information with the nutritional labeling of products, they were able to characterize the variations in the fat, calories and protein of the various products we buy.

The purpose of their work was to examine the food desert hypothesis–to see whether proximity to full-line grocery stores shapes purchasing habits and actual diets. (It doesn’t; personal characteristics, like income and education, are much more important in determining diet than distance to healthy food outlets; opening more full-line grocery stores has almost no impact on shopping and diets in so-called food desert neighborhoods.)

Their data allow us to paint a very detailed picture of the geography of nutrition in the US.  Here’s their county level map, with the healthiest eaters shown in the light colors and the least healthy in the darker colors. It shows that the most healthy eaters are concentrated on the East and West Coasts, and that those in the interior states, and especially in the south, eat the least healthy diets. Just as with many things, it looks like our nation is highly polarized.

It struck us that we’ve seen this pattern before: in county-level election maps. Here are the results of the 2016 presidential election, coded with the familiar red/blue republican/democrat shading. Darker colors signify relatively larger margins for the dominant party.  As you can see, there’s a strong similarity between this map and the one on nutrition.

We don’t want to place too much reliance on visual evidence, so we’ve done a quick bit of statistical analysis. We don’t have access to the county level data, but Allcott, Diamond and Dube did publish a color-coded map showing state-level nutritional scores.  We took the values indicated on this map and compared them with Donald Trump’s electoral margin in each of the 48 contiguous states (positive values indicate a victory for Trump). The following scatter chart shows Trump’s victory margin (on the horizontal axis) and each state’s relative nutritional score (on the vertical axis).  Higher scores on the nutritional axis correspond to a healthier diet.

Overall, there’s a pretty strong correlation between the two data series. Statistically, the coefficient of determination (r-squared) between the two data series is .51, suggesting that nutrition alone explains about half the variation in state voter margin. The healthier a state’s eating habits (at least as evidenced by the shopping data) the more likely they were to vote for Clinton. Conversely, the least healthy states (we’re looking at you Alabama, Oklahoma and Louisiana) had the largest margin of victory for Trump.  There’s a clear pattern here:  16 of the 18 healthiest states voted for Clinton; 24 of the 25 least healthy states voted for Trump.
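For readers curious about the mechanics, the coefficient of determination is just the square of the Pearson correlation between the two series. A minimal sketch, using made-up stand-in numbers rather than the actual state-level data:

```python
# How an r-squared like the one reported above is computed: square the
# Pearson correlation between the two series. The data below are
# hypothetical stand-ins, not the actual nutrition scores and vote margins.
from statistics import mean

trump_margin = [-22, -15, -8, -2, 3, 10, 18, 28]  # hypothetical vote margins
nutrition = [12, 10, 6, 4, 1, -3, -6, -9]         # hypothetical diet scores

def r_squared(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    r = cov / (var_x * var_y) ** 0.5  # Pearson correlation coefficient
    return r * r

print(round(r_squared(trump_margin, nutrition), 2))
```

With the actual state-level series, this calculation yields the r-squared of .51 reported above; the toy data here are constructed only to show the computation.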

The usual caveat, that correlation doesn’t mean causation, definitely applies here. You might be tempted to assume that voters in states with less healthy diets feel a special affinity for a candidate with a well-documented weakness for junk food. But it’s just as likely, as in the case of Allcott, Diamond and Dube’s study, that there are other factors–notably education and income–that influence both one’s dietary choices and one’s presidential preference.  That said, there’s a certain symmetry between recent reports about the President’s eating habits and the dietary patterns of his electoral support.

Story from The Daily Kos

 

It’s food for thought.

 

Road pricing for all vehicles, not just ride-hailed ones

The problem isn’t the ride-hailed vehicles, it’s the under-priced street

It really looks like we’re on the cusp of a major change in transportation finance. Cities around the country are actively studying real time road pricing. And nowhere is the conversation more advanced than in New York, where Governor Cuomo has endorsed the FixNYC concept of charging cars entering Manhattan South of 59th Street.

Recent reports from the City’s former transportation director, Bruce Schaller have shown that the growth of traffic–recently accelerated by the expansion of ride-hailing services like Lyft and Uber–is slowly strangling city traffic. Street speeds in Manhattan have fallen by a full mile per hour, which is bad enough, but what’s worse is that slower speeds have reduced the productivity of the city’s buses, driving more passengers to choose the already over-crowded subway. A system of road pricing would make a world of difference, reducing traffic in Manhattan, speeding buses, and easing the strain on the subways.

A key feature of the Move NYC plan is that it would subject all private vehicles (taxis, ride-hailed vehicles, commercial vehicles and privately owned cars) to the congestion charge.

In a recent commentary at Wired, the financial journalist Felix Salmon suggests that we might get away with a plan that tries to pin the road pricing bill just on ride-hailed vehicles. A key part of his argument is that somehow the proposed congestion charge wouldn’t affect the number of ride-hailed vehicles in Manhattan, because ride-hailing supply is inelastic (drivers are going to drive anyhow). Salmon argues “While charging them to drive into a crowded zone can certainly raise tax revenues, it’s not going to reduce congestion, because drivers-for-hire are almost entirely price-inelastic. They’re effectively forced to pay whatever the fee is.”

This is a rare instance in which we’ll disagree with Salmon:  That argument strikes us as both wrong and factually incorrect.  The factually incorrect part is that drivers will somehow have to pay the fee.  They don’t:  tolls and charges are passed directly on to customers. For example, Uber’s policy is:

Additional charges may apply to your trip, including tolls, surcharges, or other fees. These charges are automatically added to your trip fare.

The wrong part is the argument about inelasticity of supply (and for that matter demand). If in fact Uber drivers did have to pay the fee, they’d find driving in Manhattan less profitable, and would either serve other boroughs, or stop driving for ride-hailing services. (Or if the fee were flat, look to maximize the number of within-Manhattan trips among which to amortize a fixed fee.)  But, inasmuch as the fee is likely to be passed on to passengers, it raises the cost of Uber to them (just like surge pricing) and prompts some users, at the margin, to not use Uber (so that demand is price elastic). The takeaway here is that congestion pricing will tend to reduce the use of ride-hailed vehicles.

Salmon’s suggestion provoked an immediate and spirited reply from Charles Komanoff, one of the authors of the FixNYC plan, who argued that an Uber tax in place of a cordon toll would sacrifice a majority of the traffic relieving benefits of road pricing.

Without a cordon toll to reduce 15-20 percent of car trips into the Manhattan CBD, we can kiss most of the time savings goodbye, not to mention the political support of players in the for-hire vehicle industry who will be left holding the bag.

His estimates suggest that eliminating the cordon pricing portion of the plan would reduce its traffic easing effects by 60 percent (and consequently reduce revenues, which would be used to subsidize transit, by a like amount). Another key problem: as the Uber tax drove ride-hailed vehicles out of lower Manhattan, lessened congestion would attract other private vehicles, offsetting the congestion benefit.

Salmon responded to Komanoff’s concerns in a blog post called “Taxing Uber is Easy,” arguing that taxing ride-hailing is more politically palatable than charging all vehicles. “I think my idea is something which is eminently politically possible, in contrast to congestion pricing, which has been implemented exactly nowhere in the USA.”  In his view, we can start by taxing Uber, and then someday later expand the tax into a full-fledged congestion charge.

There are at least two key problems with this argument:  First, if Komanoff is correct, the Uber-only tax won’t do much to reduce congestion, and will be used as an argument by pricing opponents that the system doesn’t work. Second, it feeds a narrative that congestion is somehow the fault solely of the ride-hailing companies, which though politically convenient, isn’t accurate.  Moreover, it’s likely that FixNYC is one of those rare chances to make a big change in transportation pricing–so getting it wrong now could saddle us with a second best system for a long time.

First generation road pricing systems that rely exclusively on cordon charges (as in London, Milan and Stockholm) create perverse incentives for ride-hailing vehicles to stay in the cordoned area once they’ve paid the toll, which as Felix points out, can actually make congestion worse. But Komanoff’s preferred FixNYC proposal would include a combination of a cordon charge plus a $3-$5 charge on ride-hail trips South of 59th street, and so has at least some elements Salmon should support.

It strikes us that there’s little reason to single out ride-hailed vehicles for congestion pricing. Each incremental vehicle (taxi, Uber/Lyft, delivery truck or private car) makes essentially the same contribution to congestion as it travels city streets at the rush hour. Confronting all of these vehicles with a charge that reflects the costs they’re imposing on the road system and other travelers is the optimal way to make the system run efficiently and sort out higher value, more productive uses from lower value ones.

Chances to fundamentally rethink the way we pay for road systems come along about once every century or so.  In the horse and buggy era, we didn’t finance roads with a hay tax. The gasoline tax was invented in the nineteen-teens, but clearly its days are numbered.  As we’ve argued, even a “dumb” vehicle miles tax is a half-step in the direction of a road pricing system that embraces readily available technology and achieves maximum results.


This post has been revised to correct a broken hyperlink.

Gentrification & integration in DC

Gentrification is producing more diverse schools and growing enrollment

In Washington DC, gentrification is producing higher levels of integration and increasing the total number of kids–black and white–attending schools in changing neighborhoods. DC’s gentrifying neighborhoods have more white residents, more total residents, and more kids attending local schools. These facts discredit the folk wisdom that neighborhood change is an irreversible, zero sum game that inevitably replaces one kind of segregation (low income people of color) with another (rich white enclaves). It suggests that how we manage change–and particularly how we organize urban public education–can have a huge impact on creating neighborhoods that are both more diverse and inclusive.

The dominant narrative of gentrification is displacement:  As neighborhoods change, long-time residents are forced out by newcomers.  The caricature goes something like this:  Young, well-educated white people are moving in to lower income urban neighborhoods, populated primarily by people of color. As these wealthier, whiter, younger singles move in, families of color are displaced, and long time cultural institutions that served the previous population suffer as well.

If there’s a place where this narrative ought to be playing out, it seems like it ought to be Washington, DC. The District has experienced a fairly major demographic shift in the past three decades. Whereas as recently as 2000 the city was 61 percent African-American, today it is much more diverse, and no longer majority African-American.  In addition, the growth of the white population in the District has been concentrated in a number of close-in neighborhoods.

Students at lunch, Bruce-Monroe Elementary School, Washington (USDA)

A recent study from the University of California, Los Angeles, looks at enrollment trends in schools in Washington DC in the neighborhoods most affected by gentrification, and discovers a surprising result: these schools have not only become more integrated, but the number of students enrolled in local K-12 schools, including African-American students, has actually increased substantially.

Gentrification is actually boosting diversity in DC’s public schools, a new study suggests. While many white parents are still sending their kids away to schools outside of their neighborhood or enrolling them in private institutions, the study shows an increasing number are choosing public schools over charters. However, care must be taken to ensure that traditionally disadvantaged students benefit from the increased diversity.

The full paper, White Growth, Persistent Segregation: Could Gentrification Become Integration?, by Kfir Mordechay and Jennifer Ayscue (December 2017), is available online.  But let’s take a look at a couple of its key findings.

Declining Segregation in Schools

The study’s headline finding is that school segregation is decreasing in Washington, albeit slowly. The District of Columbia’s schools have long been profoundly segregated; more than 90 percent of students were black in 1990. Then, roughly 3 in 5 of the district’s schools were classified as “hyper-segregated,” with more than 99 percent non-white students.

The study focuses on 11 census tracts in the center of the district that have experienced the greatest gentrification in the past two decades, and tracks enrollment changes at all of the schools within a mile of those census tracts. White enrollment in schools in this area increased from 1 percent of all students to 8 percent from 2000 to 2014.  Although the increase was significant, it’s still the case that many white parents don’t seem to be enrolling their children in their local DC schools; the DC schools in these areas are 8 percent white, but the school age population in these census tracts is 17 percent white, implying that many white parents are sending their children to private schools.


Source: Mordechay & Ayscue

Growing urban enrollments

The buried lede in the story is the increase in school enrollments in most gentrifying neighborhoods of Washington. As these neighborhoods have rebounded, school enrollment has increased sharply. In these neighborhoods, between 2007 and 2014, the number of students enrolled in public schools–including both traditional public schools (TPS) and publicly supported charters–increased from about 15,600 to more than 24,000, an increase of more than 50 percent.
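As a quick arithmetic check, the growth implied by the study’s enrollment counts can be computed directly (the `pct_change` helper below is just illustrative, not from the study):

```python
def pct_change(old: float, new: float) -> float:
    """Percentage change from an old value to a new one."""
    return (new - old) / old * 100

# Enrollment counts reported in the study:
# about 15,600 students in 2007, more than 24,000 in 2014.
growth = pct_change(15_600, 24_000)
print(f"{growth:.1f}%")  # → 53.8%
```

So the stated figures imply enrollment growth of somewhat more than half over just seven years.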

The study doesn’t explore the reasons behind the increase in school enrollments. Some of it is likely due to the increased population in these neighborhoods, associated with the new development of recent years. In addition, it’s apparent that the number of schools in this geographic area has increased, with 20 additional charter schools and 7 additional traditional public schools in 2014, compared to 2007.

The district clearly has a long way to go in achieving integrated schools. But these data show that an increase in white students in these neighborhoods hasn’t been accompanied by a decline in the number of black students. Contrary to the usual view that gentrification inevitably leads to fewer kids, and fewer kids of color, this shows that neighborhood change is not a zero-sum game in which new residents invariably displace previous residents one-for-one.

Part of the story has to be an expansion of urban education options. Over the past decade, Washington’s public schools have been implementing a program of universal pre-kindergarten education for children living in the District. The availability of publicly provided pre-K for all three- and four-year-olds is a major inducement for families to remain in the District, and can serve as a “gateway drug” for enrollment in the public school system for higher income families who might otherwise choose private schools or move to the suburbs. Once enrolled in a public pre-K, a parent may find it more convenient (and less uncertain) to have their children continue into public grade schools. In addition, parents with children in pre-K may network with other parents in their neighborhoods and become support groups for one another and advocates for quality in their local schools.

The myth of the child-less city

For years, city skeptics like Joel Kotkin have been decrying the family-unfriendliness of cities, pointing to declines in the number of children and falling enrollments in many urban school districts.

. . . we have embarked on an experiment to rid our cities of children. . . .The much-ballyhooed and self-celebrating “creative class”—a demographic group that includes not only single professionals but also well-heeled childless couples, empty nesters, and college students—occupies much of the urban space once filled by families. Increasingly, our great American cities, from New York and Chicago to Los Angeles and Seattle, are evolving into playgrounds for the rich, traps for the poor, and way stations for the ambitious young en route eventually to less congested places. The middle-class family has been pushed to the margins, breaking dramatically with urban history.

The experience of DC in the past decade suggests that Kotkin is wrong. The urban revival isn’t a zero-sum game: cities can attract new residents, especially young adults, retain them as they age and have children, and at the same time increase both the diversity and total enrollment of their schools. And much of white flight–what Robert Reich has called the secession of the successful–has been caused by (and has in turn amplified) the weakness of urban schools. Strengthening city schools can be a key means of promoting greater demographic diversity, and of hanging on to families of means who might otherwise take their energy, attention and tax payments to some suburban area.

One of the big unanswered questions about urban revival is whether the young adults who have increasingly concentrated in urban centers in the past couple of decades will continue to live there once they have children. We know that as their children get older, families are less likely to live in central cities. The quality of local public education no doubt plays a critical role in determining the location choices of many families. So, too, does the quantity of education; in Washington’s changing neighborhoods there have been more schools and more school choices, as well as Pre-K.  Initiatives that improve education, and particularly interventions like universal early childhood education, that engage all families in the local schools at a time when the educational stakes may seem lower, could be the key to more diverse, family friendly cities.


What drives ride-hailing: Parking, Drinking, Flying, Peaking, Pricing

Ride-hailing is growing: We distill a new report into 5 key factors that explain its growth

A good reporter is always supposed to ask five questions: “who, what, when, where and why?” A new report on ride-hailing provides a range of keen insights about the demand for these services, and has important implications for predicting the future of autonomous vehicles.

There’s a new report from the Shared Use Mobility Center, published by the Transportation Research Board, with a ponderous title: “Broadening Understanding of the Interplay Between Public Transit, Shared Mobility, and Personal Automobiles.” The report is over a hundred pages long, and filled with interesting and often arcane detail about who’s using services like Lyft and Uber, where and when they use them, and how this is affecting demand for street space and transit. If you’re a real transportation geek, you’ll want to read the whole thing. But if you aren’t (or in the meantime, until you’ve had a chance to read it), we can boil the whole thing down to five words: drinking, parking, flying, peaking and pricing.

The heart of the work is an analysis of trip-making data supplied by ride hailing companies, similar data collected in San Francisco, and surveys of more than 10,000 customers of ride-hailing services in five cities: Chicago, Los Angeles, Nashville, Seattle and Washington, DC. The study also explored the question of whether use of ride-hailing services had any impact on transit ridership in these cities.

The slightly longer narrative take on the report is this: Ride-hail demand is driven by drinking (people who want to avoid driving their own vehicle on Friday and Saturday nights) and by the cost of parking (ride-hailing is a more economical alternative to bringing your own car to places where parking is expensive). People traveling by air seem to value their time more highly, and may have no access to a personal private vehicle at one end of their flight. Ride-hailing is highly peaked in the times (especially Friday and Saturday nights) and places (downtowns and airports) where there is demand. Finally, ride-hail trips tend to be short (just 2-4 miles), because customers pay for each additional mile and minute.

The report also tells us a lot about where these services aren’t valued. There’s little demand for ride-hailing when parking is cheap or free, when drivers expect to be sober, for longer trips, and for trips to places that don’t have a high density of people and destinations (or an airport).

As a result of these factors (especially pricing, parking and flying), most ride hail trips tend to take place in a small subset of a region’s neighborhoods (especially in and around downtown, where there’s a density of customers and destinations, and where parking tends to be pricey). Take Chicago for example:


There’s a strong economic implication here: The income and value of time of travelers, and the availability and price of alternatives, figure prominently in when, where and whether people use ride-hailing services. The risks and penalties associated with drunk driving (legal, financial and moral) likely lead many people to choose ride-hailing when they consume alcohol; in addition, unlike journey-to-work trips, social trips may involve groups rather than individuals.

People use ride-hailing services “on an occasional basis” and for trips where speed and reliability are important. Few people use it for commuting on a regular basis.

What this suggests is that the markets for ride-hailed vehicles today–and for their successors, fleets of autonomous vehicles tomorrow–are likely to be shaped by many of these same factors. Demand will be highest where people, trip origins and destinations are densest, where parking is expensive, and among populations that have a high value of travel time savings (and the ability to pay). Ride-hailing and autonomous vehicles are going to affect some people, some trips, and some places much more than others.

The content of this report is an excellent addition to our knowledge about ride-hailing and its place in the urban transportation system. It’s a shame that much of the most salient content and its implications are buried in the report. (Just a thought: starting the title of your report with the words “broadening understanding of the interplay” is not an attention-grabber). More tangibly, while the report’s data implicate parking costs as a key factor in ride-hailing demand, that observation is never made explicit, and the word “parking” appears on just 6 of the report’s more than 100 pages. We’ve tried to punch up the findings a bit so that the meat of this report gets the attention we think it deserves.

Feigon, S. and C. Murphy. 2018. Broadening Understanding of the Interplay Between Public Transit, Shared Mobility, and Personal Automobiles. Pre-publication draft of TCRP Research Report 195. Transportation Research Board, Washington, D.C.


The emperor’s new infrastructure plan

Politics and the President’s wheeler-dealer background suggest the infrastructure plan is a mirage

If there’s been one shred of hope for bi-partisan progress in this politically polarized time, it’s been the idea that the “populist” Trump Administration and congressional democrats and republicans might somehow see eye-to-eye on the subject of good, old-fashioned public works pork, what now goes by the term infrastructure. Trump endorsed big plans to beef up infrastructure spending in his campaign, and most democrats and not a few republicans would relish dispensing billions in federal funds for pet projects in their states or districts.

Given Donald Trump’s history–from steaks, to private colleges, to casinos–is it any surprise that his infrastructure “plan” is primarily a vehicle for personal self-promotion financed with “OPM” (other people’s money) that, if it goes ahead at all, will be a financial and economic failure?

It’s actually so unremarkable, that we really shouldn’t waste too much time thinking about it.  But just to clarify that we’re not being flippant, have a look at what the experts are saying:

The normally staid Brookings Institution published Adie Tomer’s take: “too much cynicism, too little leadership.” In his view, the plan effectively asks states and cities to come up with the money, while the administration takes credit for the spending.

At CityLab, Laura Bliss is plaintively asking people to stop calling it a $1.5 trillion infrastructure plan (given that $1.3 trillion has to come from somewhere other than the federal government).

And as Henry Grabar reports at Slate, one estimate is that there are $281 billion in cuts to existing infrastructure programs (including a $170 million reduction in revenue for the Highway Trust Fund), so that the Trump budget actually constitutes a net reduction in federal support for infrastructure investment.

The American Prospect thinks it’s either fictional, a scam, or both: “The plan’s not about building more. It’s about privatizing what’s already there.”

Transportation for America points out that, despite all the rhetoric about deteriorating roadways, the plan doesn’t prioritize repairs, and many of its financial mechanisms (requiring private participation) essentially skew spending toward big capital projects (many of which would have been built anyway), to the neglect of maintenance.

Politico sees a highly politicized distribution of the spoils, with big reductions to programs that benefit blue states and provisions that sweeten the pot for red states.

Matt Yglesias argues persuasively that there’s little reason to believe there’s any real political support for an infrastructure bill. Not only is there no functioning Trump Administration when it comes to advancing major policy initiatives (the President effectively cedes the details to congressional leaders), there are also pretty strong fiscal and political reasons why infrastructure is going nowhere. Let’s check back in a few months to see whether his political forecast proves correct:

Now Trump has a thing that he can say is his plan, congressional conservatives can propose paying for it with safety net cuts that Democrats won’t agree to, and Republicans can try to pass the whole thing off as an example of gridlock or obstruction rather than reflecting the fact that conservatives don’t favor spending more money on federal infrastructure.

No one with a passing familiarity with Trump’s business record should be surprised that the administration’s infrastructure plan, such as it is, is a combination of hot air, self-promotion, and other people’s money.

Trump’s name is taken off the Trump Plaza Casino in Atlantic City, New Jersey (2014).

Opaque, debt-fueled, and Trump-branded, with lots of up-front bluster but lackluster real-world returns, as stockholders in Trump Entertainment and Resorts can attest. Here’s how Fortune described its track record in March, 2016:

From mid-1995 to early 2009, Trump served as chairman of Trump Hotels and Casino Resorts (renamed Trump Entertainment Resorts in 2004), and held the CEO title for five years (mid-2000 to mid-2005). During Trump’s 13 years as chairman, the casino empire lost a total of $1.1 billion, twice declared bankruptcy, and wrote down or restructured $1.8 billion in debt. Over the same period, the company paid Trump—essentially Trump paying himself—roughly $82 million by Fortune’s estimates, collected from a dizzying variety of sources spelled out in the company’s proxy filings, as varied as payments for use of Trump’s private plane to fees paid directly to Trump for access to his name and marketing expertise.

From his casinos, to his hotels, to his steaks, to his for-profit college, Trump ventures have always been about building the Trump brand and enriching Donald Trump, and have been fiascos for either investors or customers or both.  There’s no reason to think Trumpfrastructure will be any different.

The limits of localism

Overselling localism is becoming an excuse to shed and shred federal responsibility

Our friend, and director of the Brookings Institution’s Metropolitan Policy Program, Amy Liu, weighs in with a timely commentary on the limits of localism. As regular readers of City Observatory will know, we’ve been concerned that the soaring rhetoric of those enamored of local solutions distracts attention from the way in which critical elements of the federal system are being dismantled. Chief among the proponents of localism have been Bruce Katz and Jeremy Nowak, authors of the book “The New Localism.”

Though she doesn’t mention Katz and Nowak by name (there is a link to an article they’ve written), Amy Liu lends her voice to those concerned about the limits of localism in her CityLab article “The limits of city power in the age of Trump.”  Her argument closely parallels the one we’ve made here:

. . . city boosterism can also go too far: Urging city leaders to go it alone celebrates a deep dysfunction in federalism—and it normalizes a self-destructive shift in politics and governance.

For instance, the Trump administration is using the narrative of increased local capacity to justify draconian cuts to federal support for cities, from transit programs and community development financing to the entire Economic Development Administration.

We would hasten to add that likely cuts to social service programs, ranging from Food Stamps, to Medicare and Medicaid, and housing assistance will all amplify the problems faced by cities. In effect, the abdication of national responsibility for these core functions transforms these issues into “local” problems, ones that are disproportionately visited on some cities, and which in many cases are insoluble at the local level.

For reference, we’ve reproduced our original critique of The New Localism here.  We also invite our readers to read Bruce Katz and Jeremy Nowak’s response, which was published at CityLab.

We also want to take this opportunity to share with Bruce and the urbanist community our condolences for Jeremy’s passing. Jeremy was an intellectual force and an inspiration. He will be missed.

Why we’re skeptical of localism

As our name City Observatory suggests, we’re keen on cities. We believe they’re the right frame for tackling many of our most important problems, from concentrated poverty to housing affordability, from economic opportunity to more sustainable living. But enamored as we are of cities, we harbor no illusions that cities, by themselves, can solve our most pressing problems.

In the past year, a growing chorus of voices, disillusioned by growing polarization, has called for cities to be our saviors.

Take Richard Florida’s description of the latest of these, “The New Localism.”

“In a time of national dysfunction and, frankly, gloom, our best hope for our society lies in our cities and metropolitan areas. That’s the message of the newly released book The New Localism, by Bruce Katz, the noted urbanist at the Brookings Institution, and Jeremy Nowak . . .”

In one of the oddest of imaginable odd couples, Florida himself co-authored a Daily Beast op-ed with perennial sparring partner Joel Kotkin, that called for us “To reunite America, liberate cities to govern themselves.”  The article effectively concedes that the divisions between the blue places and the red ones are irreconcilable, and each ought to be free to go its own way.

With control of the national government in the hands of a president of debatable competence and a Republican party seemingly intent on dismantling the federal government, it’s not surprising that many people are looking for reassuring alternatives. Hoping cities can save the day is in many respects an effort to make a virtue of necessity. Katz and Nowak relate a litany of instances of mayors and other civic leaders working above or across partisan and sectoral divides to tackle important problems their cities face.  Their approaches are refreshingly innovative, direct and often productive.

The productivity of these cities is equally attributable to the pragmatism of their leaders and to the solidly blue political composition of their polities. Large US cities are overwhelmingly blue. And to the extent you find Republicans in cities, they tend to be of the most moderate kind. If you’re a Democrat, you find yourself wishing that all Republicans were like Mick Cornett or Michael Bloomberg.

There’s an understandable impulse in the face of growing national divisions and what for many was the shocking and unpleasant outcome of the 2016 national elections to retreat to a comforting cocoon of the like-minded. Blue cities will do all the things that a solidly Republican national government won’t do: respect LGBTQ rights, provide sanctuary for immigrants, denounce climate change, and tax themselves to pay for needed investments and public services. But withdrawing to the safety of agreeable blue localities cedes the important national battle at just the time when it needs to be contested.

It is well and good to celebrate the successes that mayors and local leaders are having. But transforming these heartening but small successes into a sweeping call for a new localism is misplaced when the fundamental functions of the national government are being steadily undermined. None of this works in a world in which the federal government is not simply rending holes in the safety net but knocking down its foundations.

We should remember that the federal government took on the roles it did almost entirely as a last resort. As Churchill reputedly remarked, “America can always be counted on to do the right thing, but only after it has exhausted all the other alternatives.”  While the rest of the world’s nation states adopted the trappings of modern social democracies, the U.S. was late to implement things like unemployment insurance, social security and universal health care. The New Deal, the Great Society and Obamacare were only enacted after various local and state programs to address these problems were simply overwhelmed.

What cities do badly or can’t do at all

Cities are not merely ill-equipped to tackle our major challenges on their own. Localism has an undeniable history of making many problems worse. Take two big issues of our time:  climate change and surging inequality. Mayors and cities can strike a pose and demonstrate effective tactics, but they lack the policy throw-weight to solve these problems.

Bravo for mayoral pledges to adhere to the Paris accords, but there’s precious little substance and insufficient scale behind them. New York Mayor Bill de Blasio can sue the oil companies but is an ardent opponent of congestion pricing, a tangible, effective market-oriented step that would reduce the number one source of greenhouse gases. It’s almost impossible to imagine that we’ll take effective action to address climate change unless it’s done at a national level in cooperation with the rest of the world. Without a federally imposed carbon tax or cap and trade, localized efforts are likely simply to relocate the dirtiest pollution to the most permissive states.

Similarly, inequality—which has been dramatically worsened by changes to the federal tax code—dwarfs anything cities can do. Cities are constitutionally incapable of redistributing income because the wealthy have the option of exit (which they have regularly exercised). Witness the exodus to suburban enclaves, a trend Robert Reich has termed the secession of the successful. Similarly, states and cities have been largely powerless to take on large corporations. Not only has globalization moved a considerable part of corporate earnings beyond the reach of state and local tax collectors (note Apple’s relocation of its profits to Ireland thanks to U.S. tax laws), but look at the way states and cities are falling over one another to offer tax holidays and subsidies to Amazon for its proposed HQ2.

It’s also worth noting that a key aspect of localism that has been effectively exempt from federal control—local control of zoning and land use—has worsened the economic segregation of our nation’s metropolitan areas.  In sprawling metros, separate suburban cities have used the power of land use regulation to exclude apartments, directly contributing to the problem of concentrated poverty that intensifies and perpetuates the worst aspects of income inequality. Cities have been implicated in the nation’s housing affordability and segregation problems, but that’s hardly mentioned in Katz & Nowak. The word “segregation” appears only once in the book (page 40). The word “zoning” occurs on 8 pages. Housing affordability is mentioned just once (page 28).

The root of the problem here is too much localism. The most localized governments have the strongest incentives to exclude: neighborhood groups within cities lobby against density. Suburbs within metropolitan areas do the same. Only larger units of government have the incentives and ability to challenge this kind of parochialism. Notably, two initiatives of the Obama administration–HUD’s affirmatively furthering fair housing rule and the Council of Economic Advisers’ critique of local zoning–represented important national steps pushing local governments to confront this issue. Both are going nowhere under the current administration.

Assume we have a can-opener: Cities can’t do this without a strong federal government.

The danger here is that these calls to renewed localism effectively aid and abet the ongoing efforts to systematically dismantle federal programs.  The clarion call to act locally diverts our political attention from the national stage and perhaps, unwittingly, becomes an excuse to stand by and watch these foundational programs be destroyed. Katz and Nowak briefly address this question of the federal role early in their book:

“. . . the devolution of power and problem solving to local levels is not an argument against the importance of federal and state governments.  . .  The federal government must do things that only it can do, including safeguarding national security, providing a stronger social safety net than it presently does, providing guarantees of constitutional protections and civil rights, making smart national infrastructure investments, protecting natural resources, protecting the integrity of markets and funding scientific research, innovation, and postsecondary education to keep the nation competitive.” (Katz & Nowak, page 10).

Oh, is that all? This caveat swallows the book’s premise. Localism will work brilliantly–provided we have an extraordinarily competent, generous, fair and functional federal government.

In effect, it’s a reprise of the classic economics joke about the physicist, chemist and economist, trapped on a desert island with cases of canned food but no way to open them. The chemist proposes evaporating seawater into a potent brine and letting the salt solution rust the cans open. Too slow, says the physicist, who works out the exact angle from which to drop the cans onto sharp coral and cause them to split open. The economist waves them both away and says, “I have a simpler, much more elegant solution.  Let me explain:  First, assume we have a can-opener . . ”

A competent, generous, fair and functional federal government is the can-opener.

One more point should be made: Many of the innovative city strategies celebrated in this book are directly dependent on the ability of mayors and city institutions to tap into federal largesse. Take Pittsburgh, heralded as an exemplar of local innovation. Katz and Nowak acknowledge that Carnegie Mellon and the University of Pittsburgh receive more than $1 billion in federal research funding annually (page 75). Cities looking to exploit an “eds and meds” strategy can’t do it without huge federal support in the form of research grants, student aid, Medicare, Medicaid and the Affordable Care Act. A federal government that defunds these programs—as seems likely because of the new tax reform law—will make it all but impossible for cities to innovate.

Laboratories, not factories

Katz and Nowak marshal an impressive list of inspiring local innovations from cities, such as Indianapolis, Chattanooga, Oklahoma City and St. Louis. Mayors and civic leaders in these places are generally pragmatic and entrepreneurial and are developing solutions that cut across partisan and ideological lines. Cities are, as the saying goes, the laboratories of democracy. But for the most part, they are the small-scale, bench-test laboratories for incubating ideas and showing that they can work at a municipal scale. Implementing these ideas at a national scale is essential to their success.

The key lesson of policy experimentation is that while ideas can be tested and refined at the state or local level, they ultimately need to be national in scope. States experimented with minimum wage laws, unemployment insurance, and old age pensions, but none of these began to address our problems until they were extended nationwide in the New Deal.

For a long time, we could take the federal government more or less for granted.  There was no hope that it would ride to the rescue, but at least it would keep doing what it had always done: cashing social security checks, bankrolling medical care for the poor and aged, enforcing a minimum of civil rights everywhere, engaging seriously with the rest of the world on global issues. Now, every one of those fundamental roles is very much in jeopardy.  If the poor lose health care, are turned out of subsidized housing, see their education prospects dim, that will directly add to the costs burdening states and cities. The pressure to fill in for a diminished federal presence will greatly handicap local innovation.

Like Localism? Time to fight for an effective national government

If you care about cities and believe local initiative can lead to solutions, you need to be marching on Washington and fighting for a federal government that does its job well.  The hollowing out of the federal government now underway is the clearest threat to creative, effective localism. Ultimately, the magic of our federal system is that both national and local government have important and complementary roles to play. It’s not either/or. It is both/and. Innovative cities require a supportive federal government.

Rather than turning their backs on the federal government and national debates, cities and civic leaders ought to be pooling their energy and efforts to kindle a new dialog about how we appropriately divide responsibilities between national and local governments. We must insist that the national government do its job well and that it provide the room and in some cases some of the resources to help cities tackle problems at a more local level. We need a 21st century federalism that envisions strong and mutually supporting actions at both the national and local levels, not a retreat to homogenous but balkanized localities.

 

 

Qualms about the new localism: Cities need the national government to do its job well

We like cities, but localism can only flourish with a competent, generous, fair federal government 

As our name City Observatory suggests, we’re keen on cities. We believe they’re the right frame for tackling many of our most important problems, from concentrated poverty to housing affordability, from economic opportunity to more sustainable living. But enamored as we are of cities, we harbor no illusions that cities, by themselves, can solve our most pressing problems.

The localism chimera

In the past year, a growing chorus of voices, disillusioned by growing polarization, has called for cities to be our saviors.

Take Richard Florida’s description of the latest of these calls, “The New Localism”:

“In a time of national dysfunction and, frankly, gloom, our best hope for our society lies in our cities and metropolitan areas. That’s the message of the newly released book The New Localism, by Bruce Katz, the noted urbanist at the Brookings Institution, and Jeremy Nowak . . .”

In one of the oddest of imaginable odd couples, Florida himself co-authored a Daily Beast op-ed with perennial sparring partner Joel Kotkin that called for us “To reunite America, liberate cities to govern themselves.”  The article effectively concedes that the divisions between the blue places and the red ones are irreconcilable, and each ought to be free to go its own way.

With control of the national government in the hands of a president of debatable competence and a Republican party seemingly intent on dismantling the federal government, it’s not surprising that many people are looking for reassuring alternatives. Hoping cities can save the day is in many respects an effort to make a virtue of necessity. Katz and Nowak relate a litany of instances of mayors and other civic leaders working above or across partisan and sectoral divides to tackle important problems their cities face.  Their approaches are refreshingly innovative, direct and often productive.

The productivity of these cities is equally attributable to the pragmatism of their leaders and the solidly blue political compositions of their polities. Large US cities are overwhelmingly blue. And to the extent you find Republicans in cities, they tend to be the most moderate kind. If you’re a Democrat, you find yourself wishing that all Republicans were like Mick Cornett or Michael Bloomberg.

There’s an understandable impulse in the face of growing national divisions and what for many was the shocking and unpleasant outcome of the 2016 national elections to retreat to a comforting cocoon of the like-minded. Blue cities will do all the things that a solidly Republican national government won’t do: respect LGBTQ rights, provide sanctuary for immigrants, denounce climate change, and tax themselves to pay for needed investments and public services. But withdrawing to the safety of agreeable blue localities cedes the important national battle at just the time when it needs to be contested.

It is well and good to celebrate the successes that mayors and local leaders are having. But transforming these heartening but small successes into a sweeping call for a new localism is misplaced when the fundamental functions of the national government are being steadily undermined. None of this works in a world in which the federal government is not simply rending holes in the safety net but knocking down its foundations.

We should remember that the federal government took on the roles it did almost entirely as a last resort. As Churchill reputedly remarked, “America can always be counted on to do the right thing, but only after it has exhausted all the other alternatives.”  While the rest of the world’s nation states adopted the trappings of modern social democracies, the U.S. was late to implement things like unemployment insurance, social security and universal health care. The New Deal, the Great Society and Obamacare were only enacted after various local and state programs to address these problems were simply overwhelmed.

What cities do badly or can’t do at all

Cities are not merely ill-equipped to tackle our major challenges on their own. Localism has an undeniable history of making many problems worse. Take two big issues of our time:  climate change and surging inequality. Mayors and cities can strike a pose and demonstrate effective tactics, but they lack the policy throw-weight to solve these problems.

Bravo for mayoral pledges to adhere to the Paris accords, but there’s precious little substance or scale behind them. New York Mayor Bill de Blasio can sue the oil companies but is an ardent opponent of congestion pricing, a tangible, effective market-oriented step that would reduce the number one source of greenhouse gases. It’s almost impossible to imagine that we’ll take effective action to address climate change unless it’s done at a national level in cooperation with the rest of the world. Without a federally imposed carbon tax or cap-and-trade system, localized efforts are likely simply to relocate the dirtiest pollution to the most permissive states.

Similarly, inequality—which has been dramatically worsened by changes to the federal tax code—dwarfs anything cities can do. Cities are constitutionally incapable of redistributing income because the wealthy have the option of exit (which they have regularly exercised). Witness the exodus to suburban enclaves, a trend Robert Reich has termed the secession of the successful. Similarly, states and cities have been largely powerless to take on large corporations. Not only has globalization moved a considerable part of corporate earnings beyond the reach of state and local tax collectors (note Apple’s relocation of its profits to Ireland thanks to U.S. tax laws), but look at the way states and cities are falling over one another to offer tax holidays and subsidies to Amazon for its proposed HQ2.

It’s also worth noting that a key aspect of localism that has been effectively exempt from federal control—local control of zoning and land use—has worsened the economic segregation of our nation’s metropolitan areas.  In sprawling metros, separate suburban cities have used the power of land use regulation to exclude apartments, directly contributing to the problem of concentrated poverty that intensifies and perpetuates the worst aspects of income inequality. Cities have been implicated in the nation’s housing affordability and segregation problems, but that’s hardly mentioned in Katz & Nowak. The word “segregation” appears only once in the book (page 40). The word “zoning” occurs on 8 pages. Housing affordability is mentioned just once (page 28).

The root of the problem here is too much localism. The most localized governments have the strongest incentives to exclude: neighborhood groups within cities lobby against density, and suburbs within metropolitan areas do the same. Only larger units of government have the incentives and ability to challenge this kind of parochialism. Notably, two initiatives of the Obama administration–HUD’s affirmatively furthering fair housing rule and the Council of Economic Advisers’ critique of local zoning–represented important national steps pushing local governments to confront this issue. Both are going nowhere under the current administration.

Assume we have a can-opener: Cities can’t do this without a strong federal government.

The danger here is that these calls to renewed localism effectively aid and abet the ongoing efforts to systematically dismantle federal programs.  The clarion call to act locally diverts our political attention from the national stage and perhaps, unwittingly, becomes an excuse to stand by and watch these foundational programs be destroyed. Katz and Nowak briefly address this question of the federal role early in their book:

“. . . the devolution of power and problem solving to local levels is not an argument against the importance of federal and state governments.  . .  The federal government must do things that only it can do, including safeguarding national security, providing a stronger social safety net than it presently does, providing guarantees of constitutional protections and civil rights, making smart national infrastructure investments, protecting natural resources, protecting the integrity of markets and funding scientific research, innovation, and postsecondary education to keep the nation competitive.” (Katz & Nowak, page 10).

Oh, is that all? This caveat swallows the book’s premise. Localism will work brilliantly–provided we have an extraordinarily competent, generous, fair and functional federal government.

In effect, it’s a reprise of the classic economics joke about the physicist, chemist and economist, trapped on a desert island with cases of canned food but no way to open them. The chemist proposes evaporating seawater into a potent brine and letting the salt solution rust the cans open. Too slow, says the physicist, who works out the exact angle from which to drop the cans onto sharp coral and cause them to split open. The economist waves them both away and says, “I have a simpler, much more elegant solution.  Let me explain:  First, assume we have a can-opener . . ”

A competent, generous, fair and functional federal government is the can-opener.

One more point should be made: Many of the innovative city strategies celebrated in this book are directly dependent on the ability of mayors and city institutions to tap into federal largesse. Take Pittsburgh, heralded as an exemplar of local innovation. Katz and Nowak acknowledge that Carnegie Mellon and the University of Pittsburgh receive more than $1 billion in federal research funding annually (page 75). Cities looking to exploit an “eds and meds” strategy can’t do it without huge federal support in the form of research grants, student aid, Medicare, Medicaid and the Affordable Care Act. A federal government that defunds these programs—as seems likely because of the new tax reform law—will make it all but impossible for cities to innovate.

Laboratories, not factories

Katz and Nowak marshal an impressive list of inspiring local innovations from cities, such as Indianapolis, Chattanooga, Oklahoma City and St. Louis. Mayors and civic leaders in these places are generally pragmatic and entrepreneurial and are developing solutions that cut across partisan and ideological lines. Cities are, as the saying goes, the laboratories of democracy. But for the most part, they are the small-scale, bench-test laboratories for incubating ideas and showing that they can work at a municipal scale. Implementing these ideas at a national scale is essential to their success.

The key lesson of policy experimentation is that while ideas can be tested and refined at the state or local level, they ultimately need to be national in scope. States experimented with minimum wage laws, unemployment insurance, and old age pensions, but none of these began to address our problems at scale until they were extended nationwide in the New Deal.

For a long time, we could take the federal government more or less for granted.  There was no hope that it would ride to the rescue, but at least it would keep doing what it had always done: cashing social security checks, bankrolling medical care for the poor and aged, enforcing a minimum of civil rights everywhere, engaging seriously with the rest of the world on global issues. Now, every one of those fundamental roles is very much in jeopardy.  If the poor lose health care, are turned out of subsidized housing, see their education prospects dim, that will directly add to the costs burdening states and cities. The pressure to fill in for a diminished federal presence will greatly handicap local innovation.

Like Localism? Time to fight for an effective national government

If you care about cities and believe local initiative can lead to solutions, you need to be marching on Washington and fighting for a federal government that does its job well.  The hollowing out of the federal government now underway is the clearest threat to creative, effective localism. Ultimately, the magic of our federal system is that both national and local government have important and complementary roles to play. It’s not either/or. It is both/and. Innovative cities require a supportive federal government.

Rather than turning their backs on the federal government and national debates, cities and civic leaders ought to be pooling their energy and efforts to kindle a new dialog about how we appropriately divide responsibilities between national and local governments. We must insist that the national government do its job well and that it provide the room and in some cases some of the resources to help cities tackle problems at a more local level. We need a 21st century federalism that envisions strong and mutually supporting actions at both the national and local levels, not a retreat to homogenous but balkanized localities.

 

 

Challenging the Cappuccino City: Part 2: The limits of ethnography

City Observatory has long challenged the popular narrative about the nature and effects of gentrification. This is the second installment of a three-part commentary by our friend and colleague Alex Baca. You can read parts one and three as well. Alex has worked in journalism, bike advocacy, architecture, construction, and transportation in D.C., San Francisco, and Cleveland. She’s written about all of the above for Washington City Paper, CityLab, Slate, The American Conservative, Cleveland Magazine, Strong Towns, and Greater Greater Washington.

This week, City Observatory is addressing, in a series of posts, how Derek Hyra’s Race, Class, and Politics in the Cappuccino City doesn’t stick its landing. This second installment critiques Hyra’s ethnographic process and his references to other scholars who have addressed through their work the effects of upscaling neighborhoods on longtime residents. Part one appears here.

Ethnography In the Gentrification Canon

Shaw is a relatively dense, transit-rich neighborhood within walking distance of D.C.’s downtown, and it has been upscaling for over two decades. Hyra began his work there in 2010, though he did not release his book until last year. Of his intent with Cappuccino City, he writes:

“This book sets out to answer four questions. First, what broader political and economic dynamics relate to the transformation of the dark ghetto into the gilded ghetto? Second, what attracts some White residents to historic yet low-income urban African American neighborhoods? Third, what happens when people who have been segregated for so long come together in a diverse neighborhood? Lastly, how are low-income people benefiting when more affluent people move near them?”

Cappuccino City is not unlike The New Urban Renewal, Hyra’s first book. In each, he uses an ethnographic approach to frame the exploitation of black culture, in notable black neighborhoods, for the purposes of creating a marketable identity that appeals to newcomers; heavily references Saskia Sassen’s work to illustrate the impact of global forces on individual neighborhoods; and extensively documents community-meeting minutiae to illustrate the push and pull of neighborhood factions. In each, his hall pass into key non-white spaces is a black friend that brings him, most often, onto a neighborhood basketball court—or, as Hyra calls these acquaintances, “Docs,” in reference to William F. Whyte’s Street Corner Society (15).

Flickr: Ted Eytan, Demolition of Scripture Cathedral

And in each, Hyra writes about gentrification without actually writing about the structural reasons why housing in neighborhoods like Shaw has become so expensive. Peel back Cappuccino City’s conspicuous-consumption arguments—that affluent white people are so attracted to black culture that they’ll move for even a contrived facsimile of it, and will do so in great enough volumes to shift neighborhood demographics—and there’s no discussion of supply and demand, zoning, geography, or transportation. Further, despite Hyra’s talk of global influences, his Shaw functions in a vacuum: He does not address how whiter, more affluent neighborhoods in D.C. and the region have hoarded their wealth, and perpetuated gentrification in other neighborhoods, by attempting to block nearly any new development. Falls Church, Hyra’s place of residence, is an apt example of a place with policies that reinforce regional unaffordability over time: There, multi-family housing over three stories requires a special-use permit, which excludes people who can’t or don’t want to live in single-family homes and restricts supply in a relatively transit-rich municipality. Falls Church is more expensive than it could be because of policies like this; as a result of multiple Falls Churches choosing to be more expensive than they could be through their zoning codes, the D.C. region is constricted and pricier, too.

Hyra’s second claim, after creating a new academic framework for gentrification, is that he is posing questions that have been heretofore unexamined: “While the gentrification literature importantly examines whether residential displacement occurs alongside redevelopment, this book redirects the focus to whether low-income people who are able to stay benefit in meaningful ways.” However, he cites a number of authors whose work is concerned with the effects of upscaling neighborhoods on residents who have lived in those neighborhoods for some time. His notes show a great debt to Japonica K. Brown-Saracino, Brett Williams, and Gabriella Modan, who in A Neighborhood That Never Changes, Upscaling Downtown, and Turf Wars execute exactly what Hyra calls a new paradigm. Each are explicitly clear that what their subjects tell them is site-specific, often inconclusive, and not to be taken as a broad referendum on gentrification.

Hyra makes no such distinction. He orients his conclusions about Shaw as far-reaching, writing in his first chapter, “This pattern of central city redevelopment, driven largely by a White influx, and increasing minority and poverty presence in the inner suburbs is not unique to D.C. The cappuccino lens provides an urban account that not only helps to understand Washington, D.C., and its Shaw/U Street neighborhood but highlights community processes and outcomes likely occurring in other advanced service-sector cities, such as New York City, Atlanta, New Orleans, and Houston” (20). Similarly, The New Urban Renewal is billed on Hyra’s website as “offer[ing] an unparalleled analysis of the nation’s most difficult and complex issues.”

In service of this extrapolation, Cappuccino City blares statements such as, “Some newcomers to redeveloping ghettos who might be inspired by and appreciate elements of Black culture do not truly engage in the ghettos’ complexity. The younger newcomers, the tourists in place, seem more concerned with consuming ghetto-inspired culture than connecting and identifying with those struggling with the ills of racism and structural inequality” (101). The commodification of black culture—the undercurrent of nearly all of what we consume—is absolutely relevant to an academic consideration of Shaw. “Living the wire” and “black branding” clearly convey the frustrating-at-best, harmful-at-worst appropriation that’s been noticeable in D.C. and elsewhere for years.

But Hyra hews to these theses even though his white-newcomer subjects tell him directly that they’ve chosen to live in Shaw because of its proximity to where they need to go: “Paul, a recent arrival, explains, ‘A large part of the reason I moved to Shaw and pay D.C.’s higher taxes was because of the ability to bike or walk to work’” (129). That people will move as close to the things that matter to them as they can reasonably afford, that those with relative financial or social capital will have an easier time of this, and that those with relative financial or social capital are often white is not as interesting as “living the wire.” It is, however, the more likely culprit of neighborhood change.

Flickr: Ted Eytan, Duke Ellington Mural

Anyone on a local listserv, NextDoor, or Facebook group knows how easy it is to find someone willing to bemoan changes in their neighborhood. Likewise, it’s just as easy to find a character whose opinions about existing residents are irritating at best, and bigoted at worst. But bombastic stories, like the lurid retelling of a “hood party,” are neither randomly selected nor representative. Cappuccino City doesn’t consider a control group, selection bias, or comparative analysis. Hyra does not examine other neighborhoods within D.C. or outside of it, much less ask long-term residents in neighborhoods other than Shaw how they view change. If he had, he might have found that in some places, there are few “oldtimers” left behind to interrogate: Very poor neighborhoods that don’t rebound, or “gentrify,” are much more common than gentrifying neighborhoods, and essentially hemorrhage residents. As City Observatory’s Lost In Place report found in 2014:

“While media attention often focuses on those few places that are witnessing a transformation, there are two more potent and less mentioned storylines. The first is the persistence of chronic poverty. Three-quarters of 1970 high-poverty urban neighborhoods in the U.S. are still poor today. The second is the spread of concentrated poverty: Three times as many urban neighborhoods have poverty rates exceeding 30 percent as was true in 1970 and the number of poor people living in these neighborhoods has doubled.

The result of these trends is that the poor in the nation’s metropolitan areas are increasingly segregated into neighborhoods of concentrated poverty. In 1970, 28 percent of the urban poor lived in a neighborhood with a poverty rate of 30 percent or more; by 2010, 39 percent of the urban poor lived in such high-poverty neighborhoods.”

Cappuccino City’s exclusion of such findings could be unremarkable. The D.C. region is one of a handful in America facing an across-the-board housing crunch, so the idea that Hyra should conduct research in a neighborhood that “hasn’t gentrified” so as to compare to Shaw might be laughable to some. But despite spending a great deal of his introduction arguing that D.C. is exceptional, Hyra ultimately claims that his theories, terminologies, and frameworks have far-reaching application. In addition to this misguided claim, Hyra retreads well-laid lines of academic thought, with faulty steps, rather than providing new insights: His fetishization of “living the wire” and “black branding” ignores what his subjects tell him to present a marketable narrative about gentrification.

Challenging the Cappuccino City: Part 3: Cultural Displacement

City Observatory has long challenged the popular narrative about the nature and effects of gentrification. Today, we are pleased to offer the final installment of a three-part commentary by our friend and colleague Alex Baca. (You can read part 1 and part 2 as well). Alex has worked in journalism, bike advocacy, architecture, construction, and transportation in D.C., San Francisco, and Cleveland. She’s written about all of the above for Washington City Paper, CityLab, Slate, The American Conservative, Cleveland Magazine, Strong Towns, and Greater Greater Washington.

This week, City Observatory is addressing, in a series of posts, how Derek Hyra’s Race, Class, and Politics in the Cappuccino City doesn’t stick its landing. This third installment explains that though Hyra’s theories, “black branding” and “living the wire,” are not inaccurate descriptors for what is happening in Shaw, Hyra’s work is not likely to dismantle the structures he purports to critique.  Parts one and two appear here.

Cultural Displacement In the Cappuccino City

Hyra began interviewing Shaw residents, attending ANC 2C and 2F meetings, and integrating himself with Organizing Neighborhood Equity D.C.—whose staffer, in this agreeable recap, does not disclose that Hyra was closely affiliated with her employer—eight years ago. At that time, I was assistant editor of Washington City Paper, another oft-cited source in Cappuccino City. Hyra references dog parks and bike lanes as signifiers of new, whiter residents in Shaw; we covered the ins and outs of these D.C.-specific memes so extensively that I wrote a cover story about the symbolism of bike-facility implementation, in 2011.

There is an enormous responsibility that comes with having an amplified voice on local issues, and that pressure has only been sharpened over the past half-decade by a constrained housing supply in cities where economic mobility is greatest. Everyone is scrambling for justification as to why housing is so expensive, because nearly all Americans are rent-burdened, regardless of where they live.

While D.C.’s economy, thanks to the entrenched presence of the federal government, had not tanked as significantly as the rest of the country’s, the city was most visibly bounding upward by the aughts. Around 2010, dog parks, bike lanes, and snowball fights were convenient shorthands for change in general—and no one likes change. I quickly learned that my peers (white; college-educated; rarely native to D.C. proper; and in possession of a certain cultural capital, if not financially stable personally) preferred to blame, say, restaurants serving truffled mac-and-cheese for undermining their neighborhood’s “authenticity.” This, of course, is a more accessible and more fun conversation than ones about D.C.’s housing production trust fund, municipal bonding capacity, or, trickier still, one’s own role in perpetuating a process they might regard as unilaterally culturally destructive.

To that end, Hyra is not wrong in concluding that amenities drive gentrification. As Jackelyn Hwang and Jeffrey Lin conclude in this working paper, people do move to be close to things they enjoy. But Hyra’s hyper-focus on luxury signifiers misses the basic things that people in general, regardless of income, want to be proximate to, like where they work, where their family and friends are, or religious and cultural institutions in which they are invested. Shaw offers proximity to these things to many people, and so it is a popular place to live. That it is perceived to be safer, cleaner, and more marketable than in previous decades stems from this popularity; that it is expensive speaks to the fact that its supply—of legally affordable housing, of missing-middle housing, and of luxury housing—is not meeting demand.

Flickr: Bossi, V Street.

Further, what a particular person values enough to pay a certain price to be close to will have infinite variations. A condo-owning white resident of Shaw may like that the neighborhood is just a few Metro stops from downtown D.C., may enjoy consuming pricey dinners, and may feel self-satisfied in their social-justice priors while touring by bike the murals that grace the buildings identified on the Heritage Trail. This, I gather, is Hyra’s prototypical “tourist in place.” But this isn’t the only persona contained within Shaw, or within any neighborhood. “Amenities” are not solely luxury; Giant, public libraries, corner stores, and bus stops are amenities, just as Whole Foods, WeWorks, third-wave coffee shops, and bikeshare stations are amenities. It is perhaps comforting to assign Whole Foods as “for” newcomers, and Giant “for” oldtimers. But the incalculable array of personal affinities that are inherent to each of us means that those assignations are often reductive. From there, the next logical step is to believe that taste is pathological, which completely misses how and why things like safe streets, which should be fundamental to all places, have become harbingers of the gentrification bogeyman.

Shaw is expensive to the point that newcomers and oldtimers alike may not be able to afford to stay in the neighborhood. And its current retail and resident mix is likely discomfiting to those who preferred the Shaw they thought they knew to the one they think they don’t. But the rent is too damn high for far-reaching, multitudinous, and sometimes counterintuitive reasons that are much greater than preferences alone. Depressed wages, supply and demand, exclusionary zoning, ill-fitting regulations, inadequate public transit networks, and the stubborn spatial mismatch between where jobs are and where people live are inextricably linked to why Shaw is what it is today. Given that our neighborhoods are ongoing referendums on the complexities of urban policy, history, and regional governance, Hyra’s satisfaction in taxonomizing $14 cocktails is frustrating.

There is an intersection of these points. It echoes the cultural-studies discourse of appropriation of the subaltern—what non-academics can instantly identify as the co-opting and subsequent commodification of everything from music, to food, to transportation (see: tech titans attempting to reinvent the bus). Writing in The New York Times Magazine, Willy Staley discusses how gentrification has become less about housing, and more about the bougie-fication of things associated with the poor, like “raw water,” tiny homes, and kale:

“Unlike housing, poverty is a potentially endless resource: Jeff Bezos could Hoover up all the wealth that exists in the world, then do nothing but drink rainwater collected from the roof of his ‘70 Vanagon, and it wouldn’t stop the other seven billion of us from being poor. What this metaphorical gentrification points to instead is dishonesty, carelessness and cluelessness on the part of the privileged when they clomp into unfamiliar territory. When they actually profit from their ‘discovery’ and repackaging of other people’s lifestyles, it’s a dispiriting re-enactment of long-running inequalities. But what seems most galling isn’t that they’re taking dollars off the table. It’s that they’re annoying.”

The professor could argue that Cappuccino City is a monument to exactly that. He could also argue that parsing cultural and physical displacement, and whether they are actually triggered by gentrification, is not the point of his work. To be sure, no text is a panacea. But it is irresponsible to imply, as Hyra does in his conclusion, that Shaw-like housing crises, increasing segregation, and social discomfort are solvable primarily by more integrated third spaces and the continued preservation of—to say nothing of the addition of—affordable housing. That might work if the goal is to simply convey a veneer of social mixing, so that all existing oldtimers and newcomers feel better about the authenticity of their neighborhood. But authentic-feeling neighborhoods mean very little on a practical level if people, regardless of when they arrived, cannot afford to live in them.

Hyra, I think, would agree. After all, his book about the negative effects of “neighborhoods, but fancy,” was the buzziest in a year of buzzy books on the topic. While I find that his text treats existing scholarship glibly, Hyra is clearly reacting to something powerful: the aggressive pace of change that leads people to feel displaced in their own neighborhoods, if they can afford, and choose, to stay. That he centers his book around that, and elevates the voices of those who are bearing the negative externalities of Shaw’s growth, should not be dismissed. As Staley puts it, “The poor are still gentrification’s victims, but in this new meaning, the harm is not rent increases and displacement—it’s something psychic, a theft of pride.”

And yet, this is why the case of Cappuccino City is so unfortunate. “Living the wire” and “black branding” are tantalizing terms. They have been treated by a number of outlets as legitimate discursive frameworks. The book itself is a quick and easy entry point into urban studies, and deals with the real, true upscaling of neighborhoods that many residents of formerly distressed cities are experiencing. But Hyra’s analysis of Shaw’s particulars—his “cappuccino lens”—rests on flawed premises of originality, and does not provide meaningful policy blueprints. Rather, it reinforces the popular, yet surface-level, notion that newcomers’ tastes and preferences are the primary drivers of unaffordability and displacement, and treats neighborhoods as closed loops rather than components of regional ecosystems.

Flickr: Ted Eytan: Frazier Funeral Home (slated for condo development).

Some planning theorists have been dismantling this unproductive paradigm for years. Their work has gone unacknowledged. We are desperately in need of a prominent narrative that blows it up for good. We cannot craft good policy without first establishing an ideological framework that appropriately considers what is going on in America’s Shaws, so that we can future-proof neighborhoods, cities, and regions for as many scenarios as possible. That necessitates admitting that both increased housing supply and strategies to mitigate physical and cultural displacement have a role in contemporary urban policy. Hyra had that opportunity. He squandered it to double-down on an out-of-date discourse.

If the “cappuccino lens” is anything at all, it’s a definitive claim that the consumptive preferences of new residents are what drive neighborhood change. This is primarily a disservice to Hyra’s subjects. But it’s also a mindset that leads people in expensive, gentrifying, and distressed neighborhoods to—understandably—protest new housing. At its worst, it’s an axis that pays lip service to cultural appropriation while lazily lumping together social discomfort and physical displacement. It’s the idea that bike lanes, or dog parks, or new restaurants cause rents to rise in isolation. It’s a view that conveniently dismisses America’s legacy of constitutionally implemented, segregationist housing policies, and is one that’s unwilling to imagine what truly equitable, regionwide investment might look like. It’s one that is too cowardly to take to task the massive levels of exclusion perpetuated by relatively wealthy neighborhoods and suburbs, which have, in turn, resulted in heretofore unimagined pressures on walkable, inner-city neighborhoods.

In castigating newcomers, the “cappuccino lens” sets an impossible bar for authenticity and belonging while tokenizing long-term residents. It mocks in its ignorance the ways that federal actions introduced segregation where there was none, as well as the country’s legacy of fair housing and integration efforts. It wrings its hands, but is not likely to testify in support of housing and transit development that so often meets death by a thousand public-comment cuts. It elevates the threat of displacement without examining it.

The “cappuccino lens” is a way of viewing neighborhood change that allows us as individuals to avoid interrogating—and thus, changing—the structures and systems from which we’ve benefited. It’s an explanation that always points the finger at someone newer, someone fancier, someone richer, someone with even more precious taste.

These are commonly held beliefs that have guided local-level politics for decades, ones that have directly contributed to the pain and loss of community that Hyra extracts through his interviews. These are the entrenched views against which we must organize to demand better, more fair, and more just investment in basic goods, services, and human needs at local, state, regional, and federal levels. It’s time this disastrous ideology had a proper name. Fortunately, the frothy and whitewashed “cappuccino lens” fits the aesthetic splendidly.

Editor’s note: This post has been revised to add references to the earlier commentaries in this series.

Sprawl, stagnation, and NIMBYism: Animated maps of metro change

A picture of metropolitan growth: Sprawl then, stagnation now.

We’re in awe of Issi Romem’s prodigious data skills.  Romem is the economist and big data guru at BuildZoom, the web-based marketplace for construction professionals.  His latest report is a multi-decade look at the growth (and non-growth) of housing at the neighborhood level in the nation’s largest metro areas. What Romem has done is gather data from 1940 through 2016 on the number of housing units built in each census tract in large metro areas. Then, for each of a series of multi-decade periods (1940 to 1960, 1960 to 1980, 1980 to 2000 and 2000 to 2016), he’s classified those census tracts depending on the number and kind of housing units built in that tract. His coding system shows which neighborhoods built primarily single-family homes, which built smaller multi-family buildings, and which built large (50-unit plus) multi-family buildings. In addition, he also flags census tracts where little or no new housing was built. (It’s this latter category that’s a real eye-opener, as we’ll see shortly.)

The new report is called “America’s New Metropolitan Landscape: Pockets Of Dense Construction In A Dormant Suburban Interior.”  That’s an accurate title, but we think a shorter, catchier alternative might be “Sprawl, Stagnation and NIMBYism.”  Let’s explore some of the highlights of this report.

Instinctively, we know that the footprint of metro America has expanded greatly in the automobile era. This report paints a vivid picture of that process. The secret sauce in this data exercise is animation:  Romem has crafted maps that show how, decade after decade, development proceeds in each of the nation’s largest metro areas.  Here, for example, is the Los Angeles map:

 

Source: Issi Romem, BuildZoom.com

There are similar maps for each of the nation’s largest metro areas on the BuildZoom website, and helpfully, you can also download metro summary data. A note on the color coding: red and orange areas are census tracts where most new housing was in the form of multi-family units; light blue marks tracts where single-family housing predominates; and dark blue marks tracts that had no significant housing construction during the time period in question, using a definition of a net increase of less than one-tenth of a housing unit per acre per decade.  The maps show similar patterns across metros.  During the two earlier periods (1940 to 1960, and 1960 to 1980), metro areas expand, hugely you might say.  The expansion continues after 1980 in most metros, though at a generally slower pace.  Since 2000, outward expansion has become more muted.  The really big change over this nearly eight-decade period is the increase, in nearly all metros, in the area that has seen little or no new housing construction. This widespread stagnation of residential development is really the signal finding of this report.
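Romem’s “no significant construction” cutoff is simple enough to express as a classification rule. Here’s a minimal sketch in Python; the function name, tract sizes, and unit counts are ours, chosen for illustration:

```python
# Sketch of Romem's dormancy test: a tract counts as having no
# significant construction in a period if its net housing-unit gain
# works out to less than 0.1 units per acre per decade.

def is_dormant(net_new_units: float, acres: float, years: float) -> bool:
    """True if the tract added fewer than 0.1 units/acre/decade."""
    decades = years / 10.0
    units_per_acre_per_decade = net_new_units / acres / decades
    return units_per_acre_per_decade < 0.1

# A hypothetical 400-acre tract adding 30 units over 2000-2016 (16 years):
print(is_dormant(30, 400, 16))   # 30/400/1.6 ≈ 0.047 → True (dormant)
# The same tract adding 300 units clears the threshold:
print(is_dormant(300, 400, 16))  # ≈ 0.47 → False
```

The threshold is deliberately low: even modest infill keeps a tract out of the dark-blue category, which is what makes the extent of dark blue on the maps striking.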

Two suspects: NIMBYism and economic stagnation

There are two different forces at work behind the dark blue areas where little or no housing is being built. In some cases, it’s a lack of supply: there’s little vacant land in a neighborhood, and zoning restrictions make it costly or impossible to build new, denser housing. In tracts where most of the land is designated for single family homes, once a home has been built, it’s not possible to replace it with apartments, and there may be little reason to tear down one single-family home and replace it with another.  In large parts of Los Angeles, NIMBY land use restrictions explain the dark blue. Between 1960 and 1990, according to Greg Morrow of UCLA, Los Angeles downzoned most of its residential areas, lowering its population capacity by 60 percent, from 10 million to about 4 million.

The other competing explanation is a lack of demand. Some struggling industrial cities (Cleveland, Detroit) also have large swaths of dark blue neighborhoods, but not because of NIMBY restrictions. Instead, the problem in these places is declining demand for housing. Little or no new housing is being built because so few people live there (and want to live there) and prices are low.

For both of these reasons (a limited supply in growing metro areas; limited demand in shrinking ones), a bigger share of US metro areas consists of these dark blue neighborhoods where little development is happening. Over time, more and more of US metropolitan areas consist of neighborhoods (census tracts) where little if any new housing is being built.  In our Los Angeles example, the area with little or no new housing construction increased from less than one percent between 1940 and 1960, to 5.8 percent between 1960 and 1980, to 24.1 percent from 1980 to 2000, and fully 52.3 percent in the most recent period, through 2016.

Which cities have the most land area with little or no new housing?

Let’s focus in on the big increase in “no new housing” neighborhoods in the past 16 years. There’s considerable variation across metropolitan areas in the amount of land that’s experienced little or no residential development since 2000.  The chart below shows Romem’s estimates of the percentage of the land area in developed census tracts in each of the nation’s largest metropolitan areas.  (In contrast to the data we usually present at City Observatory, Romem’s work uses “consolidated statistical areas” to define most large metropolitan areas.  These areas are slightly larger than the metropolitan areas we routinely use; they generally include adjacent metropolitan and micropolitan areas just beyond metro area boundaries).

The areas with the highest levels of land with little new housing construction include Hartford, Buffalo and Boston. In these cities, more than three-fourths of all developed neighborhoods have seen little or no new housing construction since 2000. Unsurprisingly, fast-growing and sprawling sunbelt metros, like Las Vegas and Austin, have the smallest amount of land area that has not seen new residential development in the past couple of decades.  In these places, few neighborhoods are untouched by development; in Austin, for example, fewer than a fifth of all neighborhoods are classified as having had little or no new housing built.

One of the biggest factors driving housing growth at the metropolitan level is overall population growth.  Fast growing metros have a higher demand for housing, and are both adding more housing at the fringe, and seeing more infill housing in close-in neighborhoods. We’ve plotted each metro area’s 2000 to 2016 population growth against the percentage of land area that experienced little or no housing construction. The vertical axis of this chart shows the percentage of each metro area’s land that had little or no housing development and the horizontal axis shows the percentage growth in population.  Each dot corresponds to a metropolitan area (mouse-over dots to see metro area names and data values). As expected, there’s a strong negative correlation: places with less growth had larger shares of the metro area unaffected by development; fast growing areas had a proportionately smaller area untouched by residential construction.

The regression line in this chart shows the typical relationship between metropolitan growth and areas with limited or no new housing construction. You can think of communities that are above the regression line as being communities where even less area was touched by development, and communities below the line as ones where a larger portion of the area was affected by development, in both cases, controlling for the level of growth in the metropolitan area. In a sense, the deviation from this line reflects in part the stringency of local land use controls in the metropolitan area.  A likely reason that some areas are above the line is that local zoning or NIMBYISM more greatly limits the extent of new residential development. Conversely, areas that are below the line seem to have land use regimes that allow more widespread development than is typical.  These data show that the Portland area, famous for its urban growth boundary, has a much higher fraction of neighborhoods affected by residential development than is typical for a metro growing as rapidly as it is. Portland grew by 21 percent between 2000 and 2016, but only 31 percent of its area was classified as having little or no new housing development, which is about 10 percentage points less than one would expect given the typical relationship between growth and area affected by housing. Washington, DC, which grew at about the same rate as Portland, had more than half of its land area unaffected by development.
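The above-the-line/below-the-line reading can be reproduced with an ordinary least squares fit and its residuals. In this sketch, the Portland and Washington figures come from the text above; the other metros’ numbers are invented to illustrate the method:

```python
# Fit stagnant-land share against population growth, then flag metros
# above or below the regression line via their residuals.

def ols(xs, ys):
    """Ordinary least squares fit: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

metros = {                   # metro: (% pop growth 2000-16, % land stagnant)
    "Portland":   (21, 31),  # from the text
    "Washington": (21, 52),  # from the text ("more than half")
    "Hartford":   ( 3, 78),  # hypothetical
    "Las Vegas":  (45, 12),  # hypothetical
    "Austin":     (55, 18),  # hypothetical
}
xs = [g for g, _ in metros.values()]
ys = [s for _, s in metros.values()]
a, b = ols(xs, ys)

for name, (g, s) in metros.items():
    resid = s - (a + b * g)  # +: more stagnant land than growth predicts
    side = "above" if resid > 0 else "below"
    print(f"{name:10s} residual {resid:+6.1f} ({side} the line)")
```

With these inputs, Portland lands below the line (less stagnant land than its growth rate predicts) and Washington above it, matching the interpretation in the text.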

A very useful resource

BuildZoom’s mapping of long term development trends in US metro areas is a huge resource for planners, and those looking to get a handle on sprawl and housing development. You should definitely take a look at your metro area to see how it’s changed, and compare its development patterns with those of other metro areas. The data analysis here provides a rich picture of the extent and pace of urban expansion, especially during the 1960s and 1970s, and, in sharp contrast, the relative paucity of change in the housing stock in most neighborhoods since then. As we think about how America might change in the decades ahead, this is indispensable framing for understanding the path that’s brought us to where we are today.

 

2017 Year-in-review: More driving, more dying

We’re driving more, and more of us are dying on the roads.

Four days before Christmas, on a Wednesday morning just after dawn, Elizabeth Meyers was crossing Sandy Boulevard in Portland, near 78th Avenue, just about a block from her neighborhood library. She was struck and killed, becoming Portland’s 50th traffic fatality of 2017.

Vision Zero, a bold road-safety campaign with its origins in Scandinavia, has been sweeping through the US over the past decade, prompting all kinds of tough-talking, goal-setting traffic safety campaigns. And admirably, Vision Zero is designed to be a results-oriented, no-nonsense, and data-driven effort. Fair enough.

But judged by the grisly traffic statistics of 2017, we’re failing. Almost everywhere you look, traffic injuries and crashes are increasing. The final national numbers aren’t in, but the trend is clearly toward higher road deaths. To focus on Portland for a moment, where Elizabeth Meyers was killed, the 50 traffic deaths recorded in 2017 were the highest number in two decades.  After years of declines, traffic deaths in Portland have spiked in the past three years:

After averaging 31 traffic deaths per year between 2005 and 2014, traffic deaths have jumped 60% over the past three years.

There’s a lot of finger-pointing about distracted driving (and red herrings, like distracted pedestrians), but there’s a simpler explanation for what’s at work here.  Americans are driving more, and as a result, more people are dying on the roads. As the Victoria Transport Policy Institute’s Todd Litman noted, international comparisons make it clear that miles driven are a significant and independent risk factor that’s much higher in the US than in other developed countries. As Litman puts it:

 . . . don’t blame high traffic death rates on inadequate traffic safety efforts, blame them on higher per capita vehicle travel, and therefore automobile-dependent transportation planning and sprawl-inducing development policies; those are the true culprits.

The effects are big enough to show up in mortality statistics: American children are twice as likely to die in automobile crashes as are children in other advanced countries, which is a major contributor to the higher child mortality rate in the US.

After more than a decade of moderation in driving (motivated largely by high gas prices), driving in the US started increasing again when oil prices collapsed in 2014.  Data from the US Department of Transportation trace a clear uptick in driving in the past three years.

The result, inevitably, has been increased carnage on the highways.

There’s some good news out of the Oregon Legislature in the past year. The legislature gave the city permission to set lower speed limits on city streets, and the city has just put forward a new speed limit of 20 miles per hour that will apply to many of the city’s residential neighborhoods.

As important as this move is–excessive speed is a key contributor to fatalities–it does nothing to address the conditions that led to the death of Elizabeth Meyers. Sandy Boulevard is a multi-lane arterial street, the kind that the region’s safety analysis has determined to be the deadliest part of the roadway system. The city has been working on pedestrian improvements, and efforts to reduce speeding and red-light running. But in the area just east of where Meyers died, a section of roadway controlled by the Oregon Department of Transportation, the state agency rejected city efforts to lower posted speeds:

In response to a community request to reduce the posted 35 MPH speed on the east end of NE Sandy Blvd, traffic speed counts were taken east of 85th Avenue in early 2014 as part of the High Crash Corridor evaluation. 85th percentile speeds were 40.3 MPH. The Oregon Department of Transportation (ODOT) reviews and makes decisions on posted speed reduction requests. ODOT will not consider speed reductions that are 10 MPH or more below the 85th percentile speed. Therefore, ODOT would not approve a speed reduction on outer NE Sandy Blvd near 85th Ave.

(City of Portland, Bureau of Transportation, NE SANDY BOULEVARD HIGH CRASH CORRIDOR SAFETY PLAN, 2014, page 5.)
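ODOT’s decision rule, as described in the plan excerpt above, is mechanical enough to express in a few lines. This sketch uses a simple nearest-rank percentile; the proposed limits are illustrative, and the function names are ours:

```python
# ODOT's rule as described: a posted-speed reduction won't be
# considered if it sits 10 MPH or more below the 85th percentile speed.

def percentile_85(speeds):
    """85th percentile speed via the nearest-rank method."""
    ordered = sorted(speeds)
    k = max(0, int(round(0.85 * len(ordered))) - 1)
    return ordered[k]

def reduction_allowed(proposed_mph, p85_mph):
    """False if the proposed limit is 10+ MPH below the 85th percentile."""
    return (p85_mph - proposed_mph) < 10

# With the measured 85th percentile of 40.3 MPH on outer NE Sandy:
print(reduction_allowed(30, 40.3))  # 30 MPH is 10.3 below → False
print(reduction_allowed(35, 40.3))  # 35 MPH is 5.3 below → True
```

Note the circularity the rule builds in: the faster drivers actually go, the higher the 85th percentile, and the harder it becomes to post a lower limit.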

The grisly trend indicated by the traffic death data of the past three years tells us that as hard as we’re trying to achieve Vision Zero, we’re not trying hard enough. The biggest risk factor is just the sheer amount of driving we do, and with the boost to driving in recent years from lower fuel prices, it was predictable that deaths would increase. If we’re serious about Vision Zero, we ought to be doing more to design places where people can easily live while driving less, and where people can walk without regularly confronting speeding automobiles. We clearly have a lot of work to do.

A modest proposal: Extend the Americans with Disabilities Act to highways

Let’s require that highways really be accessible to those who can’t drive:  State highway departments should provide bus service on state roads for the disabled

The Americans with Disabilities Act was landmark legislation to make sure that the disabled were not denied equal access to the public realm. The ability to travel freely is an important part of accessibility, and the ADA has made considerable progress in making public buildings, restrooms, city sidewalks and transit vehicles easier for the disabled to use. Highway departments have generally been laggards.

For example, fewer than 3 percent of the sidewalk ramps along Oregon highways comply with the requirements of the Americans with Disabilities Act (ADA), according to a recent report. The lack of ramps is a major mobility issue; state highways are often the main streets of smaller towns, and these key bits of infrastructure are critical to giving everyone equal access to their communities without being halted by barriers. After threatened legal action, the Oregon Department of Transportation has agreed to start spending several tens of millions of dollars each year to build ramps.  With luck, the department hopes to have the backlog addressed by 2032.

Here’s an example (from California) of the kind of ADA compliance we’re talking about.

This is what ADA compliance looks like now:  A ramp with a little yellow raised-bump warning pad  abutting a crosswalk for crossing a two-lane freeway on-ramp.  Notice that the disabled can’t actually use the freeway.

There’s a ramp that connects the sidewalk on one side of the freeway on-ramp to a similar ramp on the other side of the freeway on-ramp.  Notice that this doesn’t actually make the freeway “accessible” to a disabled user–it simply makes it slightly less likely that they will be killed or injured trying to cross the freeway on-ramp. Such ramps don’t make it possible for the disabled to “use” the freeway so much as make it slightly less hazardous to cross its approaches.  Even with the ramps in place, it’s still the case that it’s either physically impossible or simply illegal for a disabled person to use the freeway itself for its intended purpose (traveling from one point to another) unless they have a car and someone who can drive it for them.

Freeways aren’t accessible to the disabled. Why doesn’t the ADA address this?

So while the settlement making these crossings somewhat less pedestrian-hostile is definitely a positive step, we think it might be a good idea to go further. All the ADA requires now is that those who can’t drive a car have a safe way to cross a road without one. But it’s still the case that if you don’t own or can’t drive a car, there’s no way for you to use a state highway, unaided, for its intended purpose–getting from one part of the state to another. The highway is only usable by those with the physical wherewithal (and financial resources) to operate a vehicle (usually a car, but in many places, bikes as well).

In contrast, look at how the Americans with Disabilities Act applies to transit providers. There’s a wide disparity between how ADA requirements affect transit agencies as compared with highway departments. Not only do transit agencies bear the cost of retrofitting transit vehicles with wheelchair ramps and kneeling capability, but most also operate complementary on-demand or dial-a-ride services for those with disabilities. As long as you want to travel in a transit agency’s service area, they’re generally required by the ADA to provide such services.

Transit agencies make their systems accessible to the disabled; highway departments don’t. A lift equipped bus.

These paratransit services are expensive. Portland’s transit operator, Tri-Met, spends about $35 million annually for its LIFT and cab services, providing slightly more than one million rides at a cost of more than $30 per passenger, almost 10 times the cost per passenger of its fixed route transit services (Tri-Met Proposed Budget, FY2016, Exhibits 5 and 6). Fares cover about 4 percent of total LIFT costs, meaning that more than 95 percent of the costs of this service are subsidized from other sources.
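A quick back-of-the-envelope check on those LIFT figures (the 1.1 million ride count below is an assumption consistent with “slightly more than one million”):

```python
# Arithmetic behind the Tri-Met LIFT figures quoted above:
# $35M annual cost, ~1.1M rides, fares covering about 4% of costs.

annual_cost = 35_000_000   # dollars per year
annual_rides = 1_100_000   # assumed; "slightly more than one million"
farebox_share = 0.04       # fares cover about 4 percent of costs

cost_per_ride = annual_cost / annual_rides
subsidy_share = 1 - farebox_share

print(f"cost per ride: ${cost_per_ride:.2f}")  # ≈ $31.82
print(f"subsidized:    {subsidy_share:.0%}")   # 96%, i.e. "more than 95 percent"
```

Both figures line up with the text: a bit over $30 a ride, with well over 95 percent of the cost subsidized from non-fare sources.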

There’s a profound double-standard in the ADA when it comes to transportation. Bus companies have to provide separate vehicles for those who can’t use buses or trains, but highway departments have no similar obligation to provide vehicles to those who can’t drive.

The point of these expenditures is to assure that the destinations served by the transit system are equally accessible to all citizens regardless of their disability status. The same standard, of course, doesn’t apply to highways. If you can’t operate a vehicle due to a disability, the destinations served by the highway system are inaccessible to you, but the state Department of Transportation provides no auxiliary service to those who can’t operate vehicles.

A modest proposal:  have the highway department provide vehicles for those with disabilities

The Americans with Disabilities Act, passed in 1990, was a major expansion of rights for the nation’s differently abled citizens. It generally provides that in employment, and in the public realm, governments, employers and landlords have to provide reasonable accommodation for those who are disabled.

The ADA has led to a wealth of changes in the way buildings, streets, and bathrooms are designed. Cities are rebuilding sidewalk intersections with wheelchair ramps. Parking lots regularly set aside the most convenient parking spaces for use by the disabled. Bathrooms have wider stalls and lower sinks to accommodate those with limited mobility.

But highways, particularly our interstate freeway system, are legally inaccessible to those with disabilities.  In most states, it’s simply against the law to walk (or use a mobility aid) to travel on an interstate freeway.  Millions of Americans, due to disability (and also due to age), are legally or practically barred from making independent and equal use of the taxpayer-supported highway system.

According to the US Department of Transportation more than 25 million Americans have travel-limiting disabilities.  Of those with such disabilities, fully 40 percent (or roughly 10 million) are unable to drive.

What we’d propose is extending the ADA to require that highway departments offer passenger vehicle service (like buses or ride-hailed cars) to provide access to state highways for everyone. If you have a state highway that connects Portland with Salem, and it is really going to be “accessible” to the disabled, then the state ought to provide transit or some form of dial-a-ride service for those who are so disabled that they can’t operate their own vehicle. On lightly traveled routes, the highway department might subsidize vans; on larger roadways it could subsidize scheduled bus service.

That’s not entirely unimaginable. Already, the Oregon Department of Transportation runs bus service from the Portland area to Multnomah Falls to reduce traffic congestion to this tourist attraction. The agency could just as easily contract for bus service along interstate routes and major state highways.

The highway department runs buses for tourists, why not for those with disabilities?

If highway departments were required to act with anything approaching the same degree of responsibility to disabled citizens who are unable to use the freeway system, they would provide a wide-ranging system of scheduled buses and on-demand vehicle services. It’s worth asking how that might be paid for. Again, applying an analogy to the transit system, highway departments could charge users of buses or dial-a-ride alternatives exactly what they charge freeway users:  the combined per-gallon gas tax paid by a typical car works out to about 2-3 cents per mile of vehicle travel, so it would be entirely fair to have the disabled pay an equivalent amount for their access to the system as well.

If we’re serious about giving the disabled equal access to the public realm, they ought to be able to travel on the expensive roadway system we’ve built that is so vital to connecting to almost every aspect of modern life.  Ours is really a modest proposal that the agencies operating the nation’s highways provide the same accommodation to the disabled that the nation’s transit operators have provided for decades.

 

Cities continue to attract smart young adults

The young and restless are continuing to move to the nation’s large cities

One trend that highlights the growing demand for city living is the increasing tendency of well-educated young adults to live in the close-in urban neighborhoods of the nation’s largest metropolitan areas. At City Observatory, we’ve been tracking this data closely for more than a decade. The latest estimates from the 2016 American Community Survey provide more evidence of this trend.  Here are some key findings:

  • The number of 25-34 year olds with four-year degrees living in large cities is growing almost five times faster than the overall rate of population growth in these cities: a 19 percent increase in 25-34s with a college degree compared to a 4 percent increase in overall population in these cities.
  • The number of well-educated young adults living in the nation’s largest cities increased 19 percent between 2012 and 2016, about 50 percent faster than the increase outside these large cities.
  • Well-educated young adults were already highly concentrated in large cities, and are more concentrated today; in 2012, a 25 to 34 year old with a four year degree was about 68 percent more likely to live in a large city than the typical American; by 2016, they were 73 percent more likely to live in a large city.
  • The number of well-educated young adults increased in 51 of the 53 principal cities in the nation’s largest metropolitan areas. (Only Rochester, New York and Tucson recorded declines).
  • Even troubled rust belt cities with declining populations recorded increases in 25 to 34 year olds with a four year degree over the past four years. Detroit added about 6,700 well-educated young adults in four years; Buffalo, Cleveland and Hartford all recorded significant gains.
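The “68 percent more likely” comparison above is a location quotient: the group’s big-city share divided by the overall population’s big-city share. A sketch with hypothetical round numbers:

```python
# Location quotient behind the "X percent more likely to live in a
# large city" comparison. All counts here are hypothetical.

def pct_more_likely(group_in_cities, group_total,
                    pop_in_cities, pop_total):
    """How much likelier the group is to live in a large city than
    the typical American, expressed as a percentage."""
    lq = (group_in_cities / group_total) / (pop_in_cities / pop_total)
    return (lq - 1) * 100

# Hypothetical: 4.2M of 10M degreed 25-34 year olds in big cities,
# versus 80M of 320M Americans overall:
print(f"{pct_more_likely(4.2e6, 10e6, 80e6, 320e6):.0f}% more likely")  # 68%
```

A rising value between two years, as in the 68-to-73 percent shift reported above, means the group is becoming more concentrated in large cities relative to everyone else.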

Today, we’re focusing on the change in population in cities, as defined by their municipal boundaries. City limits are far from the best units for analyzing population trends, and especially for making inter-metropolitan comparisons. Cities are defined quite differently in different states; some cities represent only a fraction of the urban core (Miami, Atlanta, Hartford), while others with generous annexation laws spill over into areas that chiefly are low-density suburban-style development (San Antonio, Jacksonville, Phoenix). As we’ve noted in the past, some cities are simultaneously experiencing population growth in denser more urban neighborhoods while recording population loss in more peripheral neighborhoods. So in many respects, these city-wide estimates can understate the movement of young adults to the urban core–a trend that’s better captured by looking at census tract level data (which is not yet available for 2016).

It’s important to keep in mind that cities are recording this growth in young adults in spite of serious headwinds. Cities, as we’ve frequently noted, haven’t made it easy to build additional housing, especially in the dense urban neighborhoods that are in highest demand. And young, well-educated workers are moving to cities in spite of rising rents. If housing supply were more elastic in cities, and rents were more affordable, it’s likely that even more young adults would live in cities.

We’ve heard the naysayers on the urban revival, with claims that we’ve somehow hit peak millennial, and observations that suburban growth is again (slightly) outpacing population growth in cities. But our analysis suggests that far from being disenchanted with cities, young adults, especially those with a higher education, are increasingly drawn to urban living. Our ability to accommodate the demands they are making on the scarce and slowly growing supply of great urban spaces and nearby housing is the real challenge we need to focus upon.

Data on the young and restless in large cities

We define the young and restless as 25 to 34 year olds with at least a four-year college degree. For each of the nation’s 53 largest metropolitan areas, we’ve tabulated the number of 25 to 34 year olds with a college degree living in the most populous (first-named) city in each metropolitan area. These data are compiled for the population within city limits, as reported in the American Community Survey. Data are subject to sampling error. In addition, individual city data may be affected by annexations or other boundary changes over time.  We present data from 2012 and 2016.

Diverging diamond blues

A key design element of the supposedly pedestrian-friendly Rose Quarter freeway cover is a pedestrian-hostile diverging diamond interchange

One of the main selling points of the plan to spend nearly half a billion dollars widening the Interstate freeway near downtown Portland is the claim that the improvements will somehow make this area safer for pedestrians. Much of the attention has been devoted to a plan to partially cover the freeway, and state and city officials say the covers would provide more space for bike lanes and wider sidewalks.  While that sounds promising, in practice the covers are really just a slightly wider set of highway overpasses chiefly dedicated to moving cars, and with badly fragmented and unusable public spaces. The covers/widened overpasses are part of a re-design of the on-ramps and approaches to Interstate 5, including a re-working of the local street system.

The centerpiece of that redesign is a miniature version of a diverging diamond interchange.  Currently, two one-way couplets, one going East-West (N.E. Broadway and N.E. Weidler) and the other going North-South (Vancouver Avenue and Williams Avenue), intersect atop the freeway, and also lead directly to freeway on-ramps and off-ramps. This arrangement produces a high number of left turn movements at this pair of intersecting streets.  In an effort to reduce turning movements and speed the flow of auto traffic, the engineers are proposing a diverging diamond arrangement, which converts a block-long portion of N. Williams Avenue into a two-way street, with traffic running on the left-hand side of the road (i.e. the opposite of the usual American pattern). This reverse-flow two-way street is the central feature of the larger of the two “covers” over the freeway, further emphasizing that these are hardly recreational green spaces, but rather an auto-dominated zone.

Here’s an overview of the project.

Proposed Rose Quarter Street Plan (City of Portland)

 

And here’s a detailed view of the diverging diamond feature.  The two reverse direction lanes are separated by an island which contains a two-way bike and pedestrian path (shown in green).  The purpose of the diverging diamond is to straighten the approach routes to the freeway’s on-ramps and speed automobile traffic.  Notice that the radii of curvature of the corners from eastbound NE Weidler onto the northbound couplet and from westbound NE Broadway onto the southbound couplet have been increased to allow higher-speed turns than on normal city blocks.

Detail of proposed Rose Quarter street plan (City of Portland)

This arrangement is hostile to pedestrian and bike movements for a number of reasons.  For east-west traffic, the number of crossings is increased:  pedestrians have to cross from one side of the couplet, first to the center island, and then across the other side of the couplet.  For both of these crossings, traffic is moving in the opposite direction of every other two-way street in the city.  In other diverging diamond installations, engineers have put down pavement markings to warn pedestrians that traffic is coming from an unexpected direction. (These and several following illustrations are drawn from a North Carolina State University study of diverging diamond interchanges published by the Transportation Research Board.)

Bastian Schroeder, NC State University, Observations of Pedestrian Behavior and Facilities at Diverging Diamond Interchanges

Pedestrians and cyclists going north-south on Williams will be channeled onto a narrow island between two opposing lanes of traffic, with both lanes serving as acceleration lanes for vehicles entering the freeway northbound and southbound.  The southbound portion of the couplet consists entirely of vehicles getting on the freeway. The higher speeds at turns and on these “on-ramp” streets are particularly dangerous to pedestrians. That’s an identified problem with the diverging diamond approach.

Bastian Schroeder, NC State University, Observations of Pedestrian Behavior and Facilities at Diverging Diamond Interchanges

As they’re getting ready to accelerate to freeway speeds, drivers may not be looking for pedestrians. Schroeder’s study of diverging diamonds reports that vehicles accelerating to freeway speeds are unlikely to yield:

Our colleague Chuck Marohn at Strong Towns took a close look at arguments that the diverging diamond creates a pedestrian friendly setting. In his view, that’s a claim that would only fool a highway engineer. He’s got a video walk-through of a diverging diamond in Missouri that shows how hostile these intersections are to foot-traffic. His conclusion:  the diverging diamond is an “apostasy when it comes to pedestrians and pedestrian traffic.”

Despite claims that the Rose Quarter freeway widening project is designed to improve pedestrian access and knit together this neighborhood’s fragmented street grid, pretty much the opposite is happening here.  Introducing a diverging diamond into the landscape is plainly designed to move more cars faster. It creates a more difficult and disorienting crossing for pedestrians and hems in the area’s principal North-South bike route between two reverse-direction roadways that are essentially freeway on-ramps. Its wider turns and straighter freeway approaches encourage cars to go faster, and make drivers less likely to yield to pedestrians. If Portland is really interested in making this area more hospitable to pedestrians, this almost certainly isn’t the way to do it.

Reference:

Bastian Schroeder, Ph.D., P.E. Director of Highway Systems, NC State University, Institute for Transportation Research and Education, Observations of Pedestrian Behavior and Facilities at Diverging Diamond Interchanges. (2015)

How the g-word poisons public discourse on making cities better

We’re pleased to publish this guest post from Akron’s Jason Segedy.  It originally appeared on his blog Notes from the Underground. Drawing on his practical experience in a rust-belt city, he offers a compelling new insight on the casual way that “gentrification” is invoked in serious discussions about the future of our cities.

By Jason Segedy

Gentrification (noun) – the process by which people of (often modest) means who were once castigated for abandoning the city are now castigated for returning to the city

Gentrification. It is a word that we hear with increasing frequency in contemporary discussions about American cities. But what does that word really mean? And, even more importantly, what does it mean in the context of the region that I live in and love – the Rust Belt?

Does gentrification mean the displacement of the poor – pushed aside to make way for the affluent? Or does it mean reinvestment in economically distressed neighborhoods that haven’t seen any significant investment in decades?

It is important to be clear about the meaning of this increasingly ambiguous term, because what needs to happen in the vast majority of urban neighborhoods in the legacy cities of the Rust Belt is far less ambiguous.

Despite over 50 years of well-intended social programs, concentrated generational poverty, entrenched socioeconomic segregation, and the resulting lack of social and economic opportunity for urban residents still remain the biggest challenges for the older industrial cities of this region.

As Joe Cortright says, in his brilliant piece, Cursing the Candle, “Detroit’s problem is not inequality, it’s poverty…The city has a relatively high degree of equality at a very low level of income.”

And, as the Brookings Institution’s Alan Berube says, “It’s hard to imagine that the city will do better over time without more high-income individuals.”

High poverty rates in cities like Akron, Buffalo, Cleveland, and Detroit, are partially due to regional economic conditions and structural economic challenges related to deindustrialization.

But, overwhelmingly, concentrated poverty in these cities is due to private disinvestment in the urban core, made manifest by upper and middle-class flight to the suburbs, socioeconomic and racial segregation, and the loss of neighborhood retail and basic services. Today, the geographic disparities in household income between the central city and the surrounding suburbs remain profound.

In Akron, Buffalo, Cleveland, and Detroit, respectively, 24%, 31%, 35%, and 36% of the population lives in poverty, as compared to 14%, 14%, 15%, and 15% in these cities’ respective metropolitan areas. Keep in mind that these metropolitan area figures include the core city – meaning that poverty rates in the remainder of the metro area are even lower.
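To see why the suburban rates must be even lower than the metro-wide figures, here's a minimal back-of-the-envelope sketch. The core-city population share below is a hypothetical number chosen purely for illustration; the poverty rates are the approximate ones cited above.

```python
# The metro-wide poverty rate is a population-weighted average of the
# core city and its suburbs. Given the city and metro rates, we can
# back out the implied suburban rate for an assumed city share.

city_share = 0.30      # HYPOTHETICAL: share of metro population in the core city
city_poverty = 0.35    # core-city poverty rate (e.g., roughly Cleveland's)
metro_poverty = 0.15   # metro-wide poverty rate

# metro = share * city + (1 - share) * suburb  =>  solve for suburb
suburb_poverty = (metro_poverty - city_share * city_poverty) / (1 - city_share)

print(f"Implied suburban poverty rate: {suburb_poverty:.1%}")  # about 6.4%
```

Under these illustrative numbers, the suburban rate works out to roughly 6 percent, less than a fifth of the core city's rate and well below the metro-wide average.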

Gentrification is a hot topic of conversation in coastal cities with high living costs, like New York, Washington, and San Francisco, which are also home to influential journalists.

Writing about gentrification is becoming a cottage industry for many pundits and urban policy wonks. Many of the pieces that have been penned on the topic are important, thought-provoking, and well-reasoned.

But as more and more people in the Rust Belt read these accounts, and take them out of their geographic context, alarm over gentrification (particularly on the left) is steadily growing in metropolitan areas and housing markets where it should be the least of our urban policy concerns.

In the eastern Great Lakes region, with its low cost of living, depressed housing markets, and surfeit of vacant and abandoned properties, most of the changes that are being held out as disturbing examples of gentrification, and that are provoking hand-wringing in places like Buffalo, Cleveland, and Detroit, simply amount to the return of the middle class (with a sprinkling of the truly affluent) to several small pockets of the city.

The degree to which these fledgling positive examples of private reinvestment in long-neglected neighborhoods have truly taken root and have begun to influence regional housing markets is still uncertain. As for documented cases of low-income residents being uprooted and displaced by spiraling housing costs – these have proven even more elusive.

While it can be unclear whether the return of middle class and affluent residents to a neighborhood will really do anything to improve economic conditions for the poor, it is an ironclad certainty that a continued lack of socioeconomic diversity, and its concomitant concentrated poverty, will improve nothing and help no one in these cities – the poor most of all.

For 50 years now, people, jobs, and economic opportunities have steadily left our cities for the suburbs. The status-quo in our region is, indisputably, one of widespread, entrenched urban poverty, geographically separated from (predominately suburban) economic opportunity.

Yet, even the earliest signs of neighborhood revitalization, and nascent attempts at building new housing and opening small businesses in these cities are frequently opposed by people who are convinced that they are acting in the name of social justice.

Sincere as these anti-gentrification sentiments might be, I believe that they are harmful, and, if allowed to derail incipient efforts to reinvest in urban neighborhoods, simply serve to ensure that the existing dynamic of socioeconomic segregation will remain unchanged.

In many cases, the very people who claim to be fighting the current unjust system are inadvertently perpetuating it. Gentrification alarmists have yet to come to grips with the fact that their position usually serves to reinforce the existing, highly inequitable, situation.

Many critics of Rust Belt gentrification are holding cities to an unreasonable standard, and placing them in an impossible situation.

If much of the city remains poor and run-down, this is proof that the city does not care, and is not trying hard enough.

If, on the other hand, parts of the city begin to attract new residents and investment, this is proof that the city does not care, and is not trying hard enough.

Heads I win. Tails you lose.

Sometimes, it seems that the only thing that people dislike more than the status-quo, is doing anything substantive to change it.

In Akron, 81% of the people who work in the city, and earn over $40,000 per year (hardly a king’s ransom), live outside of the city. It is unclear how Akronites living in poverty will be better off if these people remain in the suburbs.

Let’s get concrete. If you are a well-educated, middle, or upper income person (and if you’re reading this, you probably are), and you live in an economically diverse urban neighborhood, is your presence a bad thing for your community?

Should you move, instead, to a suburban community that is likely to be highly-segregated and economically homogeneous?

If you are an entrepreneur starting-up in the urban core, should you decide to open your business somewhere else? And how, precisely, will doing that help the community that you are leaving behind?

When middle class people return to urban neighborhoods, they bring some disposable income, which helps create markets for retail and small businesses that, in turn, provide basic services and job opportunities for the urban poor.

This means that urban residents who are struggling to get by may no longer need to over-extend themselves to purchase a car, or endure long and inconvenient bus rides to access entry-level jobs and basic services in far-flung suburbs, but instead may be able to save time and money by walking to businesses in their own neighborhoods.

With the return of middle and upper income residents, business districts and housing markets, long dormant, may begin to approach at least minimum levels of functionality and attractiveness to prospective entrepreneurs, investors, and residents.

For existing urban homeowners, the gradual rise in property values, in areas with extremely depressed and artificially low home prices, often means the difference between a house ultimately being rehabilitated, or it beginning a tortuous cycle of neglect and decline, culminating in demolition.

This is especially important in the legacy cities of the eastern Great Lakes, where low property values and a glut of vacant and abandoned properties, rather than financially crippling housing costs, are the largest real estate challenge. And, unlike superstar cities on the coasts, cities in this region still have large percentages of households that are comprised of working-class homeowners living in single-family homes.

Take it from someone like me, who lives in a city with 96,000 housing units, where only 16 single-family homes were built last year, while nearly 500 were torn down, and where the median value of an owner-occupied house is $78,000.

To be sure, the return of new housing, small businesses, and more affluent residents is not a panacea, and there may be legitimate concerns, at some point, about how people moving back to the city might result in rising rents and higher property taxes for existing residents.

But in the end, I have yet to see a proven model for improving economic conditions in an urban neighborhood that is predicated on ensuring that concentrated poverty remains. Maintaining the status-quo in urban neighborhoods, in the name of opposing gentrification, will do nothing to help the poorest and most vulnerable residents.

Cities typically begin to rebound with small successes in individual neighborhoods, attracting new housing and jobs, and eventually building upward and outward from there – setting the stage for further incremental investment by the private sector.

If we urbanists truly believe that socioeconomically and ethnically diverse neighborhoods are as important as is often claimed, we cannot panic every time a new house is built, a new person moves in, or a new business opens. These are overwhelmingly good things for neighborhoods and cities that have seen precious little investment for decades.

Should we remain vigilant, and work together, in a cross-sector manner, to help ensure that the rising tide is actually lifting all the boats?

Absolutely.

Should we double-down on the status-quo in our region – one of entrenched poverty and racial segregation, because we are afraid of what any type of socioeconomic change could mean for a neighborhood?

Absolutely not.

Squelching private investment in the urban core is the wrong solution to the wrong problem. It will only serve to ensure that lower income, middle income, and upper income people continue to live apart in separate and unequal enclaves, and it will make social and economic conditions in our urban neighborhoods worse, rather than better.

If we are really serious about breaking down barriers in our neighborhoods, and celebrating socioeconomic diversity, then we have to come to grips with what that means and what that looks like.

Yes, it is complicated, and messy, but it is simply not good enough anymore to say that the status-quo is unacceptable.

We need more than words. We need to act. We need to fight the correct enemy. We need to do more than curse the darkness. We need to light a candle.

We don’t need more top-down economic silver bullets. We need collaborative, incremental change – person-by-person, neighborhood-by-neighborhood, informed by humility, prudence, sensitivity, wisdom, and love for our neighbors.

Working together, we can become a much better-connected, more cohesive, coherent, and equitable place. The only people who can stop us from becoming that place are we ourselves.

It’s not enough anymore to be against something. It’s time to be for something.

Jason Segedy is the Director of Planning and Urban Development for the City of Akron, Ohio. Segedy has worked in the urban planning field for the past 22 years, and is an avid writer on urban planning and development issues, blogging at Notes from the Underground. A lifelong resident of Akron’s west side, Jason is committed to the city, its people, and its neighborhoods. His passion is creating great places and spaces where Akronites can live, work, and play. 

Is inequality over?

After a long, slow recovery, wages are finally rising for the lowest-paid workers, but we’re nowhere close to rectifying our inequality problem; in fact, it’s going to get worse.

The very smart Jed Kolko, who now writes for labor market website Indeed, offers some keen insights from the latest Bureau of Labor Statistics data. According to Kolko, “The labor market made impressive gains this past year. October 2017 was the 85th consecutive month of job growth. So far in 2017, monthly job growth has averaged 169,000—down modestly from previous years, but more than we’d expect after so many years of recovery and expansion.” The highlight of Kolko’s latest report is the impressive wage gains in the last four quarters for the lowest income workers. Whether measured by educational attainment or wage level, those at the bottom recorded higher wage increases than those at the top.

In a bout of premature triumphalism, Steve LeVine at Axios is ready to call the whole inequality issue over and done with.

Income inequality — the stubborn curse of the current era and, many think, a key factor in the global uprising against establishment powers — appears to be on a solid decline in the U.S., according to a new report.

While the recent gain in wages for the lowest paid workers is definitely good news, it is hardly the reversal of America’s yawning income gap.  Far from it.  One swallow does not make a spring, and there’s plenty of evidence that the inequality problem is not going away, and is likely to get worse.  There are four reasons why:

First, the wage gains for low income workers so far are small

It would take a decade or more of such performances to offset the widening in wage inequality that’s happened in just the last decade or so. The tale of the tape comes from the Economic Policy Institute, which uses Census data to track cumulative changes in real hourly wages since 2000 by income group.  Those in the 90th percentile have seen a nearly 16 percent increase in real wages, while those in the bottom 10 percent have seen just a 2 percent increase. In the past year, real wages for the least skilled (per Kolko’s analysis of BLS data) have outpaced those of higher skilled workers by about 1 percentage point.  So we’d have to see these gains, year-in and year-out, between now and about 2030, just to get back to the level of wage inequality we had in 2000.
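The back-of-the-envelope arithmetic behind that 2030 estimate can be sketched in a few lines. The figures are the approximate ones cited above (EPI cumulative growth, Kolko's recent catch-up rate); treat them as illustrative, not exact.

```python
# How long would recent low-wage catch-up need to persist to undo the
# post-2000 widening in wage inequality?

top_growth = 16.0     # cumulative real wage growth since 2000, 90th percentile (%)
bottom_growth = 2.0   # cumulative real wage growth since 2000, 10th percentile (%)
gap = top_growth - bottom_growth          # 14 percentage points of widening

annual_catch_up = 1.0  # recent pace: low-wage growth outpacing high-wage (pts/yr)

years_to_close = gap / annual_catch_up
print(f"Years of catch-up needed: {years_to_close:.0f}")  # 14 -> roughly 2017 + 14
```

Fourteen years of sustained catch-up at the recent pace lands in the early 2030s, which is the basis for the claim that even these welcome gains would only restore the inequality level of 2000.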

EPI‘s headline finding has it exactly right: “More broadly shared wage growth from 2015 to 2016 does little to reverse decades of rising inequality.”

Second: Wages aren’t all income

Inequality is actually a much bigger issue than the variation in wages between high skill and low skill workers.  Wages are just one component of income.  Other forms of income (dividends, interest, rent, business profits) are all much more unequally distributed than wages, and it’s been these latter forms of income, per Thomas Piketty and others, that have fueled much of the growth in inequality. Aggregated across all forms of income, the gains over the last third of a century have been heavily skewed to those at the top, as we reported at City Observatory a few months back.

Third, The Fed is likely to break up the game

In a macroeconomic sense, the gains that the lowest income workers are seeing are the long-awaited result of more than eight years of economic growth. The recovery from the Great Recession (which actually began exactly 10 years ago, in December 2007, according to the NBER), has been slow. Higher skilled and higher income workers have seen the best employment and wage gains throughout the recovery. It’s only as the national unemployment rate has finally fallen below five percent that lower income workers are starting to see similar results. But it’s very much an open question whether a skittish Federal Reserve Board–the one that sees incipient inflation around every corner–is willing to continue to allow the economy to grow.  It’s already started raising interest rates, and promises to raise them further in the coming year. That will likely slow the economy, and with it, curb the gains that the least well-off workers have enjoyed in the past couple of years.

Fourth, The new tax bill is about to make inequality far worse.

You can’t talk about inequality without talking about the tax system. One major source of the growth in inequality has been the differential taxation of high income households and non-wage income. Since 1980, US effective tax rates have been slashed for the highest income households and for the kinds of income and wealth they hold, including property, stocks, bonds and estates. (That’s why the difference between the red “pre-tax” line and the blue “post-tax” line in the chart above is so small.)  And if the tax plan currently before Congress moves forward, that’s going to get dramatically worse.  The trillion dollar plus tax cut goes overwhelmingly to those at the top of the income spectrum. Vox has an analysis showing the average tax cut by income group in 2025:

Moreover, the tax cuts will undoubtedly trigger a big increase in the deficit, which will likely be used as an excuse for shredding the social safety net, with cuts to Medicare, Medicaid, Social Security, and other programs that ameliorate the effects of income and wealth inequality.

In sum, it’s a very good thing that the labor market has finally recovered to the point where the lowest income workers are enjoying a small increase in their wages. But that’s hardly a justification for calling America’s decades-in-the-making income inequality chasm closed. It’s still dauntingly large and promises to get even worse.

 

 

The great freeway cover-up

Concrete covers are just a thinly-veiled gimmick for selling wider freeways

As you’ve read at City Observatory, and elsewhere (CityLab, Portland Mercury, Willamette Week), Portland is in the midst of a great freeway war. The Oregon Department of Transportation is proposing to widen a mile-long stretch of Interstate 5 opposite downtown Portland from 4 lanes to 6, at a cost currently estimated at just under half a billion dollars. There’s notable opposition to this idea, which flies in the face of the city’s stated aims of reducing greenhouse gas emissions and promoting Vision Zero.

One of the chief selling points of the project is the claim that it will “cover” the Interstate 5 freeway. Calling it a cover conjures up visions of a roadway completely obscured from public view and topped by a bucolic public space. What that immediately calls to mind, especially for those in the Pacific Northwest, is Seattle’s “Freeway Park,” constructed over Interstate 5 in the city’s downtown. It provides public space in the form of nearly five acres of tree-shaded lawns, plazas, and stairways.

Seattle’s Freeway Park (Dazzling Places.com).

But what’s actually being proposed for Interstate 5 is less a thick, verdant, freeway-obscuring park and more a skimpy concrete g-string. When you look closely at the project’s own illustrations, it’s apparent that the covers are actually just slightly oversized overpasses, with nearly all of their surface area devoted to roadway.

Each of the covers is bisected by one or more of the four principal arterials that cross over the Interstate 5 freeway in the project area.  One cover carries N. Vancouver Avenue over the freeway; the other carries N. Williams, NE Broadway and NE Weidler.  Each of these streets is a heavily traveled automobile thoroughfare in its own right, so these covers are mostly devoted to carrying cars, not providing public space.

The clearest way to appreciate the absurdity of describing these covers as public space is to map them and subtract out the portion of the covers that is devoted to roadway. Portland’s Jim Howell, a long-time transit advocate, has done just that.  What you really have is seven irregular trapezoids, hemmed in on nearly every side by roads or the freeway itself.  Here’s Howell’s diagram:

These are the pieces of the “Lids” available for development (Jim Howell)

 

For reference, Portland’s celebrated small city blocks are about 200 feet on a side, so none of the areas offered up here is much more than about a quarter of a block of useful space. These remnants are hardly the site of any viable public space, and certainly not places where anyone, surrounded by car traffic, is likely to linger. In fact, they greatly resemble existing orphan land near the current I-5 freeway interchange, owned by the Oregon Department of Transportation, and not improved in any way, or even maintained:

Weed-choked, litter-strewn triangles: a model for freeway “covers.” ODOT property at I-5, NE Broadway & Williams (Jim Howell photo)

These tiny fragments don’t work as a public space, and can’t ever be made into one, because that was never the intention: the “covers” exist only to provide a deceptive talking point to help sell a freeway widening project. Plus, enveloped in freeway noise and pollution, and surrounded by fast-moving, car-dominated arterials and freeway ramps, this will be a supremely hostile environment for pedestrians and cyclists. Anyone familiar with the Oregon Department of Transportation knows what a low priority it attaches to pedestrian improvements on the urban streets it controls. A few blocks east of this project, on Martin Luther King Boulevard, ODOT refused to add pedestrian improvements to a traffic island being landscaped to include a memorial to the civil rights leader.

These covers can’t support the Albina Vision

Project advocates have also seemed to conflate the freeway widening project with a speculative redevelopment scheme called the Albina Vision, which so far consists almost entirely of the following rendering. It shows the entire area between the freeway and the Willamette River, including the parking lots surrounding the region’s two principal arenas–the Moda Center and the under-used Memorial Coliseum–being redeveloped into high rise apartments and offices, along with vast new public spaces. (Where the money would come from to pay for such a project hasn’t been identified.)

The project’s rendering has neatly made both the Interstate 5 freeway and its extensive on- and off-ramps disappear under a welter of new high rises, a detail very much at odds with the project proposed by the Oregon Department of Transportation.

The Long History of Using Misleading Images to Sell Urban Highways

It’s tempting to imagine that a “cover” could magically erase the scar created by running a multi-lane freeway through an urban neighborhood. Using this kind of illusion, creatively misrepresenting the visual impact of new construction, has a long history in the world of selling highways. Robert Moses famously skewed the illustrations of his proposed Brooklyn Battery Bridge (which would have obliterated much of lower Manhattan and Battery Park); we turn the microphone over to Moses’ biographer Robert Caro, from The Power Broker:

Moses’ announcement had been accompanied by an “artist’s rendering” of the bridge that created the impression that the mammoth span would have about as much impact on the lower Manhattan Landscape as an extra lamppost. This impression had been created by “rendering” the bridge from directly overhead—way overhead—as it might be seen by a high flying and myopic pigeon. From this bird’s eye view, the bridge and its approaches, their height minimized and only their flat roadways really visible, blended inconspicuously into the landscape. But in asking for Board of Estimate approval, Moses had to submit to the board the actual plans for the bridge. . . .

The proposed bridge anchorage in Battery Park, barely visible on Moses’ rendering, would be a solid mass of stone and concrete equal in size to a ten-story office building. The approach ramp linking the bridge to the West Side Highway, a ramp depicted on the rendering as a narrow path through Battery Park, would actually be a road wider than Fifth Avenue, a road supported on immense concrete piers, and it would cross the entire park—the entire lower tip of Manhattan Island—and curve around the west side of the island almost to Rector Street at heights ranging up to a hundred feet in the air. Not only would anchorage and piers obliterate a considerable portion of Battery Park, they—and the approach road—would block off much of the light not only from what was left of the park but also from the lower floors of every large office building they passed; because the approach ramp was really an elevated highway that would dominate the entire tip of Manhattan, it would depress real estate values throughout the entire area.

If Portland wants to have more green space, less impact from the Interstate 5 freeway, and something resembling the higher level of density depicted in the Albina Vision, it shouldn’t squander half a billion dollars widening the freeway and creating badly fragmented, noisy and unusable trapezoids of concrete on a pair of oversized overpasses. Instead, it ought to ask how half a billion dollars could be invested to make this neighborhood, the city and the region a better place in which to live.

 

The death of Flint Street

A proposed freeway widening project will tear out one of Portland’s most used bike routes

At City Observatory, we’re putting a proposed local Portland-area freeway widening project under a microscope, in part because we think it reveals some deep-seated biases in the way transportation planning takes place, not just in Portland, but in many cities. Today we turn our attention to plans to tear out a key local street which serves as a major bikeway in North Portland as part of the Interstate 5 Rose Quarter freeway widening project.

A quick refresher:  I-5 is the main North-South route through Portland, and the Oregon Department of Transportation is proposing spending at least $450 million to widen the roadway from 4 lanes to 6 in a one-mile stretch in North Portland, opposite downtown. A growing coalition of community groups has organized to fight the project as wasteful, ineffective and at odds with the region’s climate change and Vision Zero goals.

One of the supposed rationales for the project is that it will “knit together” the fabric of community that was rent asunder by the original construction of the Interstate 5 freeway in the 1960s. (The freeway runs adjacent to areas that then had the largest concentration of African Americans in Portland). The freeway includes several over-sized multi-street overpasses that the project grandly describes as “covers” over the freeway.

While the project touts the so-called covers, it downplays the fact that one element of the project is eliminating the current over-crossing that carries N. Flint Street over I-5. Flint Street is a two-lane, two-way neighborhood street that runs parallel to the busier N. Williams/N. Vancouver couplet that funnels traffic on and off I-5.  The plan for widening the freeway calls for tearing down the Flint Street overpass–and not replacing it.

Disconnecting the street grid

City officials have repeatedly claimed that a key purpose of the project is to knit together a community split apart by freeway construction, in part by improving bike and pedestrian links.  Portland Bureau of Transportation Manager Art Pearce told the Oregonian:

“We see the Rose Quarter project as really reconnecting the central city,” said Art Pearce, the Transportation Bureau’s manager for projects and planning. “It has the potential to reconnect the area, make it more of a destination … and having more of the bike and pedestrian streets people have come to expect in other parts of Portland.”

But this project does nothing of the kind, and if anything, severs an important local street. This point was made strongly by long-time local transportation advocate Jim Howell of AORTA (the Association of Oregon Rail and Transit Advocates), in his testimony to the City Council on November 30, 2017.  Howell noted that “Flint Street is not going to be replaced, it will be lost to the neighborhood.  This is one of the major North-South routes through the neighborhood, it’s been there since it was platted as the City of Albina.” (Howell’s illustration, reproduced below, shows the overpasses to be replaced by covers or lids (white rectangles) and, on the right, the current Flint Street overpass (marked with red x’s), which will be eliminated.)

X-ing out North Flint Street (Courtesy, Jim Howell, AORTA)

And this corridor is one of Portland’s busiest and fastest growing bike routes. In total, about 10,000 bikes per day travel north and south through this project area–a five-fold increase from 2001 levels, according to city bike counts. N. Flint Street, with its two-way traffic, lower volumes and slower speeds, is an attractive route for many of these cyclists. While the city doesn’t have street level counts, it appears that a majority of southbound cyclists use N. Flint Street to cross the freeway and reach the Broadway Bridge across the Willamette River: City of Portland bike counts show that 56 percent of southbound morning peak hour trips on N. Williams Avenue turn on N. Russell (which takes them to N. Flint Street). In addition, N. Flint Street is home to the Harriet Tubman middle school, which, though currently vacant, is scheduled to be re-opened to serve students from North and Northeast Portland. The project is proposing a steeply graded new extension of N. Hancock Street that would run east to west as a partial substitute for Flint Street.

If this project goes forward, Flint Street will dead-end at the new, wider freeway. Rather than “connecting” the community better, the project actually disconnects it. It’s coming to be recognized that a grid of local streets better manages traffic flow and enables pedestrian safety. And this project is a step backwards, concentrating more vehicle movements as well as more bicycles on main arterial streets, and eliminating a slower-speed, local serving street.

Why amputate Flint Street?

Given Portland’s reputation (mostly well-deserved) for progressive policy on transportation, you might think that the city would have a clear rationale for killing Flint Street. But according to the discussion at last week’s Portland City Council meeting, the director of the city’s transportation bureau didn’t know what it was. After Howell and other presenters questioned the claim that the project would “re-connect the neighborhood,” City Commissioner Amanda Fritz asked Portland Bureau of Transportation Director Leah Treat why the city had eliminated Flint Street. Treat didn’t know. Here’s their colloquy (which was repeated for the record due to a glitch in the city’s closed caption hearing system–we report both versions here):

November 30 Meeting at 1:44:11

Commissioner Amanda Fritz: “I was wondering about the taking out of Flint; what was the rationale behind that?”

Portland Bureau of Transportation Director Leah Treat: “I can’t answer that; I’ll have to get back to you on that.”

November 30 Meeting at 1:45:25

Fritz:  “My question was, do you know what the rationale was for taking out Flint?”

Treat: “I don’t; I’ll get back to you and follow up on that.”

Portland transportation bureau director Leah Treat (Portland City Council video)


To put this in context: the Rose Quarter freeway widening project is the largest transportation investment in the central city contemplated in the city’s current land use plan. It’s being sold as somehow reconnecting the community and benefiting cyclists and pedestrians. And yet, when asked the simple question of why a city street is being removed, displacing a major bike thoroughfare, the city’s director of transportation has no idea what the rationale is.

This is a bureau that regularly agonizes over the loss of a handful of parking spaces, and which has detailed and copious justifications for road diets when they are implemented. But in the case of a $450 million freeway widening project, they don’t immediately know, when asked, why they’re amputating a key local street as part of an initiative that is supposedly all about re-connecting the neighborhood. Given the objective and the stakes, Portlanders deserve a clear and compelling answer to that question.

Remember: There’s no such thing as a “Free” way

Congestion pricing is a win-win strategy and the only way to truly reduce traffic congestion

The urban transportation problem is a hardy perennial: no matter how many lanes we add to urban freeways, traffic congestion is just as bad, if not worse than ever. In the face of “free” road travel, induced demand means that supply side strategies like widening freeways simply encourage more people to drive, and congestion gets worse. Houston successively widened its Katy Freeway to 23 lanes, only to find that congestion got worse and travel times got longer.

Fortunately there is a solution: peak-hour road pricing. It’s been successfully implemented in cities around the world, from London to Singapore to Milan to Stockholm. Charging a modest price for the use of roads at the peak hour encourages a small fraction of drivers to change their route or mode, with the result that traffic levels and travel times improve for remaining users. And while there’s always skepticism and resistance to charging for something that everyone pretended was free, once these systems are put in place, they have broad popular support because they actually make travel better and easier. Stockholm’s system was initially opposed by the public, but after six months of operation, was approved in a popular referendum by 51 percent; today more than 70 percent of Stockholm residents support the system.

The economics of this problem are simple and straightforward: as long as we charge a zero price for the use of scarce and valuable peak hour road capacity, more people will use it than it can handle. Ultimately, we end up rationing road space by the degree of road users’ desperation and their tolerance for delays. As we’ve suggested at City Observatory, it’s as if we’re trying to run the urban road system the way Ben & Jerry’s runs their ice cream business the one day of the year that they give cones away for free, and with similar results: people line up around the block for “free” ice cream, which is actually only available to those with the patience to wait long periods in line. There’s no way for Ben & Jerry’s to build enough ice cream stores to meet the demand for free cones–and free ice cream every day would soon bankrupt the company.

This is one instance where charging a price for a “free” good will make things better for everyone. Here are two important things to keep in mind. First, not everyone attaches the same value to a peak hour roadway trip. Second, traffic is non-linear.  Here’s why these things matter.

One working assumption seems to underlie popular discussions about peak hour travel: that somehow everyone who is using the road now “needs” to be on the road at that time. But in fact, a big fraction of travelers have options. They could travel at a different time (a little earlier, or a little later), they could take a different route or mode (like bus or bike), or they could combine or re-route trips. Some people would find it just as convenient to travel at a different time or to a different destination, while others might forgo a trip altogether. Perhaps a majority of those currently traveling at the peak hour would find it difficult or inconvenient to change, but many people could be flexible if they had the incentive to do so. The key question is how many people would have to change their behavior to get a positive result.

That’s the second key fact: only a few people would need to forgo current peak hour car trips to get a big improvement in congestion. A critical feature of traffic congestion is that it’s non-linear. What that means in practice is that roads work really, really well, right up to the point where they get congested. A highway with heavy traffic moving along at 45 to 50 miles per hour carries the most vehicles of any roadway. But add just a few more cars, and the traffic level reaches a tipping point, where things slow down and the road actually loses its ability to move people. The key to making roads work well is to keep them just below this critical tipping point. The good news here is that reducing traffic volumes by just 5 to 10 percent is, in many instances, all that is needed to keep the roads moving.
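The tipping-point behavior can be sketched with the classic Greenshields speed-density model; this is our illustrative choice, not an analysis from any of the studies discussed here, and the parameter values are made up. The shape is what matters: past the critical density, removing a modest share of cars raises both speed and throughput.

```python
# A minimal sketch of congestion's non-linearity, assuming the
# Greenshields model: speed falls linearly with traffic density.
# Free-flow speed (vf) and jam density (kj) below are invented numbers.

def speed(k, vf=60.0, kj=200.0):
    """Speed (mph) at density k (vehicles per lane-mile)."""
    return vf * (1.0 - k / kj)

def flow(k, vf=60.0, kj=200.0):
    """Throughput (vehicles per hour) = density * speed."""
    return k * speed(k, vf, kj)

# The road moves the most vehicles at the critical density kj/2 = 100,
# where it runs at half of free-flow speed. Push density past that
# point and both speed AND throughput fall.
congested = 130            # density past the tipping point
priced = congested * 0.9   # pricing nudges ~10% of drivers away

print(round(speed(congested)), round(flow(congested)))  # → 21 2730
print(round(speed(priced)), round(flow(priced)))        # → 25 2913
```

Removing one car in ten raises speeds for everyone left and actually moves more vehicles per hour, which is the win-win the article describes.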

We’re not talking about big charges, either: in most cases, modest toll levels trigger big changes in commuting patterns. Louisville, Kentucky recently started tolling the I-65 crossing over the Ohio River. Regular commuters pay a toll of $1 per crossing, and tolls have reduced the volume of cars using the I-65 crossing by more than 40 percent. Today, you encounter almost zero traffic at rush hour leaving downtown Louisville. (As it turns out, Louisville wasted roughly a billion dollars widening this bridge, because it doubled its capacity before levying a toll; if it had tolled first, it would have discovered it had plenty of road space.)

So to make our roads work better, all we need are tolls that persuade maybe one in ten travelers to change their behavior. And that’s what peak hour pricing does: sends a signal to those people who have alternatives or who least value traveling at that time on that route, to do something different. The result is that every other traveler gets a big benefit in terms of improved performance.

A well-designed system of road pricing could be an enormous win-win:  It would shorten travel times, and actually improve the carrying capacity of the highway system. It would eliminate the need for expensive, environmentally destructive and usually wasteful road-widening projects. The funds from road pricing could be used–as they are in London–to fund other transportation alternatives. There’s really no such thing as a “freeway,” and a correctly priced road system would be better for all of us.


Renters move up-market

What to make of the high credit scores of new renters in some markets: alarm bell or success signal?

RentCafe–one arm of Yardi Matrix, a real estate data and services firm–has a very interesting new data series on the credit scores of successful and unsuccessful apartment seekers in different cities. RentCafe runs a tenant screening service on behalf of landlords, and a key variable that they report to their clients is a household’s credit score (which ranges from a low of 300 to a high of 800). RentCafe’s research has shown a strong correlation between a strong credit score and a high probability of having your rental application approved. Nationally, if you have a credit score over 750, your chances are 98 percent; if it’s between 500 and 600, you have only about a 67 percent chance of getting the apartment.

In some markets–like Boston, San Francisco and Seattle–successful rental applicants have sterling credit. In these top cities, new renters have credit scores over 700.

To Rent Cafe, the high scores in these cities are a kind of alarm bell. You apparently need really great credit to rent an apartment in these places.

What Credit Score Do You Need to Rent an Apartment? Insanely High, If You’re in Boston or San Francisco

But there’s another possible view: it’s likely that high credit scores signal just how robust the local economies are in these communities, and furthermore, how many households with strong credit are choosing to rent rather than to buy. Given that most of these communities have very expensive home prices, it may be that many credit-worthy households who might otherwise prefer to buy a home end up renting. And because RentCafe is reporting the average credit score, the average renter in these cities simply has better credit.

Conversely, in some other cities, if you have a high credit score, you may find it more affordable to purchase a home than to rent. This selection effect would explain why renters have much higher average credit scores in high-priced markets than in low-priced markets.

It’s also likely that credit scores are rising because the personal financial situation of the typical renter is getting better. RentCafe notes that the credit score of the typical approved applicant has increased about 12 points in the past three years, from 638 in 2014 to 650 this year. This is consistent with other evidence showing steady income gains in most metro economies. We’ve reported on national data at City Observatory. Oregon economist Josh Lehner has a series showing income gains for the Portland area (which ranks 8th in RentCafe’s list of metros with the highest credit scores for renters). Rising incomes are making housing more affordable.


Clearly, the rental market is very competitive in places like Boston and San Francisco, meaning landlords can pick and choose among tenants with a strong financial background; but the very strong credit scores of successful rental applicants in these cities are as much an indication of strong local economies (and prosperous tenants) as of the choosiness of local landlords. In the long run, the growing average creditworthiness of tenants may reflect the limited opportunities for homeownership in these increasingly expensive markets, where households with sterling credit who would be buyers almost anywhere else find themselves renting instead. A new report issued by the Philadelphia Federal Reserve Bank shows a continued decline in the number of first-time home buyers nationally, a trend consistent with rising average credit scores among renters.

You’ll want to take a look at the full report. The RentCafe report has detailed information on average credit scores of successful and unsuccessful apartment seekers in 100 of the nation’s largest cities. As we’ve noted, the data are drawn from RentCafe’s own client work, and so may be influenced by how complete and representative a sample of properties it covers in each market.

Uber and Lyft: A dynamic duo(poly)?

Will two firms produce enough effective competition to benefit consumers?

The use of ride-hailing services continues to grow in the US, and while there are a range of competitors in some markets, like New York, in most places nearly all ride-hailing is dominated by Uber and Lyft. The good news for consumers is that fierce competition for market share between these firms is helping to keep fares low and promote steady innovation and change.

While our view has long been that we want to encourage a range of competitors–”Let a thousand Ubers bloom,” we said–that doesn’t seem to be happening yet. Austin, which both companies temporarily abandoned after a local voter-approved measure enacted some restrictions (such as background checks for drivers), seemed to be a keen example of how new entrants could step in and fill the void. But this summer, after the Texas Legislature pre-empted Austin’s municipal laws, Uber and Lyft re-entered the market with a vengeance, and have apparently recovered most of their lost market share. Local startups are struggling against the bigger firms’ brand recognition and deep pockets. One local startup, RideAustin, saw its business decline 62 percent after Uber and Lyft came back to town.

In the longer term, the burgeoning number of businesses developing autonomous vehicles are potential competitors for Lyft and Uber. Waymo, the Google/Alphabet subsidiary, has just launched its fully driverless cars in Phoenix, and as they scale up, they threaten to be a major disruptor in this market. But that development seems to still be a ways off.  Will two firms in this industry be enough to assure vigorous price competition, and provide consumers with good value? Let’s look to see what’s happened here in the past few months.

Will a duopoly be enough?

Uber has been the dominant firm in the ride-hailing market since its inception, but in recent months there are signs that its dominance is ebbing. The company has experienced a series of widely publicized gaffes, ranging from sexual harassment claims against its executives, to a video of the company CEO disparaging one of its drivers, to the ultimate resignation of founder Travis Kalanick. The bad publicity, coupled with a #deleteUber campaign has had an impact. During most of the past calendar year, rival ride-hailing firm Lyft has grown faster, and picked up market share. Nationally, estimates are that Uber’s market share has fallen from more than 80 percent to less than 75 percent.

But the national statistics obscure a fiercer battle in many cities. Instead of being a weak also-ran, Lyft appears to be becoming a sizable competitor in many markets. Data compiled by credit card analytics firm Second Measure estimate the market share of the two companies in different cities around the country. As reported in Recode, these data show strong improvements in Lyft’s position in many markets. Across the dozen markets shown here, Lyft has recorded gains in share of 10 points or more in three-quarters of them. In general, Lyft has made the biggest inroads in major West Coast markets; in Portland, Lyft has a 45 percent market share, making it a very close rival to Uber.

Numbers on red bars show percentage point increase in Lyft market share from December 2016 to September 2017.

A companion article in Recode examines the connection between the #deleteUber campaign and consumer use of the two services.  According to Second Measure’s data, most ride hailing customers exclusively use just one of the two leading services (71 percent exclusively use Uber, 19 percent exclusively use Lyft).  Only about 10 percent of ride-hailing customers use both. While customer loyalty to a single firm is something that every company no doubt wants to cultivate, the sharp change in the market share of the two companies is good evidence that switching costs are relatively low. Thus, even though only a small fraction of customers are regular comparison shoppers (using both services routinely), those who rely primarily on one or the other can easily switch if they’re dissatisfied. Low switching costs are one of the keys to getting the benefits of competition in a duopoly situation.

An intensely competitive duopoly in ride-hailing is likely to help assure better rates and better services for consumers than would be the case if one company achieved monopoly status. If customers are dissatisfied over prices, the service they receive or the company’s policies, they can vent their displeasure by switching to the alternative. This market share shift, however, suggests that the market valuation attached to Uber–nearly $70 billion last year–may be far too high. While we may not see a thousand flowers blooming, an effective duopoly may offer the next best thing.

Kevin Bacon & musical chairs: How market rate housing increases affordability

Building more market rate housing sets off a chain reaction supply increase that reaches low income neighborhoods

Households moving into new market rate units move out of other, lower cost housing, making it available to other households; the propagation of this effect produces additional housing supply in lower income neighborhoods

There’s a lot of resistance among many people to what seems blatantly obvious to economists: if housing prices are too high, building more housing will help bring prices down. To economists, it’s a simple matter of supply and demand; if you want to reduce displacement, build more housing. To many non-economists, it seems like a non-sequitur or an impossibility: surely building more expensive housing makes housing more expensive.

But there’s a new research paper that offers a powerful explanation of why more market rate housing makes other housing more affordable to lower income households. The paper is by Evan Mast, an economist at the Upjohn Institute. It exploits a rich, detailed new set of data about households and migration and uses some pretty sophisticated techniques to work out its results, and Mast is, after all, an economist; so rather than emphasize the technical findings, let us translate them using two widely known metaphors: musical chairs and Six Degrees of Kevin Bacon.

Housing markets explained with musical chairs and Kevin Bacon; no economics required.

We all know the old children’s game musical chairs, where kids circle a group of chairs (with one fewer chair than children). The kids constantly move while the music plays. But when the music stops, everyone has to sit down. Whoever doesn’t find a seat and is left standing is removed from the game. Tight housing markets are like that: if there aren’t enough houses (chairs), someone ends up on the outside looking in, and in markets it’s not the slowest kid, but the poorest household.

The second game is the Hollywood brain teaser of connecting any actor to Kevin Bacon by way of other actors they worked with in a particular film. So, for example, John Turturro’s “Bacon number” is 2: he and Julianne Moore were both in the cast of The Big Lebowski, and Moore in turn played opposite Bacon in Crazy, Stupid, Love. Six Degrees of Kevin Bacon shows that you can connect almost any two people in show business with fewer than six connections, no matter how widely separated they may be in time or genre. Instead of applying this to movie stars, we can connect houses: when someone moves out of one house, it creates a vacancy in another house, and so on, and this chain of moves produces much more widespread change than we might initially anticipate.

Musical chairs and “Six degrees of Kevin Bacon” shed light on housing markets

Our friends at the Sightline Institute have already applied the musical chairs metaphor to housing markets. Chairs are houses; people are like the kids scrambling to find a seat. It doesn’t matter whether you add fancy overstuffed arm-chairs or simple folding metal chairs to the game: both make it equally likely that, at the end of the day, there will be a closer match between chairs and hind-ends than otherwise.

The second part of our model, which we’ll illustrate with Kevin Bacon’s help, shows how adding chairs or houses produces much wider effects. What’s often hidden from view is the chain of connections from one part of the housing market to another. How does building more units for high income households (fancy arm-chairs) result in more affordable housing for moderate income households? (Some have argued that metropolitan housing markets are somehow rigidly segmented and that building in one tier can have no effect on other tiers.) What our Six Degrees of Kevin Bacon exercise suggests is that movement into new market rate housing by some people ripples through the marketplace in a series of consecutive moves–like the connection of one actor to another to another–in ways that ultimately generate change in neighborhoods and housing types far removed from the price tier or neighborhood in which new construction occurs.

Economists have accepted this more or less as a matter of faith. But the new paper from Evan Mast uses some remarkable big data to actually trace out those connections (just as the handy, on-line Oracle of Bacon lets you connect any actor to almost any other). Mast uses data from the commercial directory service Infutor Data Solutions, which uses private databases to track changes of address by households. Mast used this data to determine the previous residence of households that moved into new housing units, and then looked to see who moved into the housing units those households had vacated, and so on. He tracked seven successive moves and modeled additional rounds of movement. The analysis is complex, and relies on some interpolation to compute the typical characteristics of successive generations of movers, but the end result is a robust estimate of the net effect of successive changes in housing location generated by new market rate construction. Bottom line: building 100 new market rate units is likely to trigger new occupancy in about 60 more affordable units elsewhere in the metropolitan area. These effects extend well outside the area in which the new housing is built, and reach areas with lower levels of income and different racial and ethnic compositions. Mast explains:

One hundred new luxury units create about 60 equivalent units in below-median income tracts. The estimates are also large for areas that are even less similar to high- income areas, with about 30-40 equivalent units created in black and below-median income, bottom quintile income, and heavily rent-burdened areas. Additionally, the effect extends outside of the multifamily market—about 50 EUs are created in high single-family home tracts. Note that equivalent unit numbers in different housing types should be considered separately rather than summed together, because an equivalent unit in one type starts a migration chain that may nest an equivalent unit in another.

Graphically, Mast shows the successive rounds of new housing being occupied in different neighborhoods.

This chart shows how successive rounds of migration lead to additional units opening up in other neighborhoods. Each of the colored lines corresponds to a different neighborhood characteristic. For example, the gray line with square markers shows the fraction of units in neighborhoods with household incomes below the metropolitan median (labeled “<P50 Inc. (Local)”). Initially, nearly all of the newly occupied housing is in the higher income neighborhoods where new housing units are built (round 1), but in successive rounds, more and more of the newly occupied housing is in lower income neighborhoods, and by round 10, the number of newly occupied units in below median income neighborhoods is equal to nearly 60 percent of the number of new market rate units built in round 1.
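The round-by-round logic can be approximated with a toy chain model. The parameters below are illustrative assumptions of ours, not Mast's estimates: each chain of moves is assumed to continue to another round with a fixed probability, and each vacated unit is assumed to sit in a below-median-income tract with a fixed probability. What the sketch shows is how modest per-round probabilities compound into roughly 60 equivalent units per 100 built.

```python
# A toy migration-chain model in the spirit of Mast's exercise.
# The 0.9 continuation probability and 0.1 below-median share are
# invented for illustration, chosen to land near his headline result.

def equivalent_units(new_units=100, rounds=10,
                     continue_prob=0.9, below_median_share=0.1):
    """Expected vacancies opened in below-median-income tracts.

    Each new unit starts a chain of moves; each round, a chain
    continues with probability `continue_prob`, and each newly
    vacated unit is in a below-median tract with probability
    `below_median_share`.
    """
    total = 0.0
    chains = float(new_units)
    for _ in range(rounds):
        chains *= continue_prob          # some chains end each round
        total += chains * below_median_share
    return total

print(round(equivalent_units()))  # → 59: roughly 60 units freed per 100 built
```

In Mast's actual chart the below-median share rises round by round rather than staying fixed, but even this crude version shows why most of the effect accumulates in later rounds, far from where the new units were built.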

The rate at which new housing occupancy ripples out to other neighborhoods varies by city, and Mast’s work shows that this is directly affected by vacancy rates. In cities with low vacancy rates, the effects are much more pronounced (the supply effect ripples out further), while in cities with high vacancy rates, the effect is slower and more muted. This makes sense: when there are lots of vacancies, adding new units doesn’t trigger as much migration as when housing markets are tight. What this also means is that the beneficial effects of building new market rate housing will be greatest in just those markets that are experiencing the biggest housing shortages. Building more market rate housing in, say, San Francisco, will rapidly ripple through the area’s housing stock and free up units in lower income neighborhoods that would otherwise have been occupied by higher income households.

Mast’s findings are roughly comparable to a study undertaken a couple of years ago by U.C. Berkeley’s Karen Chapple and Miriam Zuk. Chapple and Zuk looked at a parallel question: the effect of building market rate and subsidized housing on rates of displacement in California. They found that every two additional market rate units built in a jurisdiction had about the same effect on reducing displacement as building a single unit of affordable housing. Chapple and Zuk’s work confirmed an earlier finding reported by California’s Legislative Analyst’s Office that building more market rate housing reduced displacement.

Policy implications: The cost of inclusionary requirements

What Mast’s paper shows is that the successive knock-on effects of building more housing at the top end of the market have a significant and measurable effect on housing availability in other neighborhoods and at other price points. This underscores the economist’s argument that building more market rate housing is a critical policy for promoting housing affordability for low and moderate income households. Policies that discourage new market rate housing are likely to worsen affordability. That includes policies like inclusionary housing requirements, which drive up the cost and discourage the construction of new market rate units. Mast’s study shows it doesn’t take a large reduction in market rate construction to wipe out all of the supposed gains from requiring developers to build new affordable units:

. . . the results are also important for evaluating affordability and inclusionary zoning requirements, which require developers to fund a certain number of income-restricted units per market rate unit constructed. My estimates imply that market mechanisms create a larger number of affordable units than such requirements. Moreover, they imply that affordability requirements could lead to a net decrease in the stock of available affordable units under relatively small crowd-out rates. Since each market-rate unit creates 0.6 equivalent units in below-median income areas, lost equivalent units in such areas will outnumber the gain in income-restricted units if each income-restricted unit crowds out more than 1.67 market-rate units.
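Mast's break-even figure can be checked with a few lines of arithmetic. The 0.6 equivalent-unit rate comes from the passage above; the function and its names are ours, for illustration only.

```python
# Working out Mast's break-even condition: each market-rate unit
# yields ~0.6 equivalent units (EUs) in below-median-income areas,
# so an inclusionary requirement is a net loss once each
# income-restricted unit crowds out more than 1/0.6 market units.
EU_PER_MARKET_UNIT = 0.6

def net_affordable_gain(restricted_units, crowd_out_rate):
    """Net change in affordable units: restricted units gained,
    minus the EUs lost from market-rate units never built."""
    market_units_lost = restricted_units * crowd_out_rate
    return restricted_units - market_units_lost * EU_PER_MARKET_UNIT

breakeven = 1.0 / EU_PER_MARKET_UNIT           # market units per restricted unit
print(round(breakeven, 2))                     # → 1.67
print(round(net_affordable_gain(10, 1.0), 1))  # → 4.0  (modest crowd-out: net gain)
print(round(net_affordable_gain(10, 2.0), 1))  # → -2.0 (heavy crowd-out: net loss)
```

The worked numbers match the quoted passage: below a crowd-out rate of about 1.67 the requirement still adds affordable units on net; above it, the lost equivalent units outnumber the income-restricted units gained.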

Mast’s study provides some important insights into the mechanisms at work in the housing market. The process by which households move, one after another, from home to home has essentially been hidden from view for lack of data. This paper shines a bright statistical light on that process and definitively disproves the largely uninformed conjectures of supply skeptics and some journalists that what happens in one part of the housing market in a metro area has no effect elsewhere. Housing markets are highly inter-connected, and building more units at the top end creates a powerful chain-reaction that ultimately adds supply in neighborhoods throughout the region.

Note:  This post has been revised to correct spelling errors in Dr. Mast’s name; we regret these errors.


The end of the housing supply debate (maybe)

Slowly, the rhetorical battle is being won, as affordable housing advocates acknowledge more supply matters

There’s been a war of words about what kind of housing policies are needed to address the nation’s affordability problems. Economists (from the White House to academe) argue that increasing housing supply is essential. Low income housing advocates of many stripes push for efforts to pay (via taxes) or to require (via regulations) that more units be built specifically for low and moderate income households.

While everyone agrees that much of the affordability problem is due to national policies that provide massive subsidies to homeownership by the wealthy (mostly through the tax code) and parsimonious, chronically underfunded programs that provide subsidies for the poor (which reach less than a quarter of those technically eligible), a fair share of responsibility lies in the hands of local governments. Is the key relaxing zoning limits to allow more market rate housing, or does it require regulating or subsidizing more affordable units into existence?

The battle cry of the low income housing advocates is “you can’t build your way to affordability.” As Bloomberg’s Noah Smith put it, among these advocates:

 . .  it has become an article of faith that building market-rate housing raises rents, rather than lowers them.  The logic of Econ 101 — that an increase in supply lowers price — is alien to many progressives, both in the Bay Area and around the country.

Sightline Institute has tackled that notion directly. Not only can you build your way to affordable housing; in fact, building more supply may be the only effective way to reduce the pressure that is driving up rents and producing displacement. There’s ample evidence for this position, but there’s still the strong sense that addressing our housing problem by building more high end housing is a cynical and ineffective kind of “trickle down” economics.

Building more market rate housing isn’t so much about “trickle down” as it is about building enough new housing to keep higher income households from moving down-market and bidding up the price of older housing that would otherwise be affordable to moderate and lower income households. When there isn’t enough supply, demand from higher income households floods down to older housing stock, driving up rents and reducing housing options for those with lesser means. Which is why, as we’ve observed, in some markets modest 1950s-era ranch homes are a mainstay of affordability, while in others they cost more than a million bucks.

When supply does catch up to demand, rents tend to fall across the market. Last month, we showed how the completion of thousands of new, market rate apartments in Portland was having the knock-on effect of creating a growing number of vacancies and a flood of “FOR RENT” signs in the city’s older apartment stock. Rent increases, which were measured in the double digits eighteen months ago, have gone negative.

As Sightline Institute’s clever musical chairs metaphor makes clear, it doesn’t matter whether you add fancy overstuffed armchairs or simple folding metal chairs to the game, both make it equally likely at the end of the day that there will be a closer match between chairs and hind-ends than otherwise.

What about “filtering up?”

The latest salvo in the rhetorical battles over the merits of expanding market rate housing supply comes from Miriam Axel-Lute, writing for Shelterforce.  Her latest article “Trickle Up Housing: Filtering does go both ways” makes the case that while building new market rate housing does somewhat ameliorate displacement and affordability issues, building more new low- and moderate-income housing is a more direct and powerful solution. The argument here is that if we build more housing for the poorest among us, that will free up some units for other perhaps slightly less poor households.

She buttresses the case for “trickle-up” housing by citing a study from U. C. Berkeley’s Karen Chapple and Miriam Zuk, that claims that building affordable units is twice as effective in reducing displacement as building more market rate housing. The exact claim, quoted from Chapple and Zuk is:

“At the regional level, both market-rate and subsidized housing reduce displacement pressures, but subsidized housing has over double the impact of market-rate units”

“Double the impact” sounds more like a pitch for a new and improved laundry detergent than a calculated analysis of housing policy options, but it did pique our curiosity.  How did Chapple and Zuk determine the relative effectiveness of these two policies?

As it turns out, their work is a response to a widely cited analysis developed by the California Legislative Analyst’s Office (LAO), which looked at the connection between displacement and housing construction in the Golden State.  The LAO’s conclusion was that building more market rate housing reduced displacement. Chapple and Zuk questioned the LAO report for omitting the possible positive effects of building more subsidized housing. They ran a regression analysis looking at both market rate and subsidized housing and controlling for the impact of a number of other factors, including age of the housing stock, racial/ethnic composition of the population, and education. They find that building more market rate units decreases measured displacement, as they put it:

Consistent with the LAO Report, we find that new market-rate units built from 2000 to 2013 significantly predict a reduction in the displacement indicator . . .

They also find that the construction of more subsidized housing also results in a reduction of the displacement indicator.  The claim of “twice the impact” comes from comparing the coefficients of the two variables, the coefficient on market rate housing is -.002; the coefficient for subsidized housing is -.005.  So it is literally the case that the coefficient of one is more than twice as large as the other.

Two market rate houses reduce displacement as much as one affordable house

But let’s step back and consider what that means in practice. What is measured in each case is the number of new market rate homes and the number of new subsidized homes built in a community over a decade. Put another way, the construction of one, new market rate home has almost half as much impact on measured displacement as building a subsidized home. What this means in practice is that two or three new $600,000 single family homes or condominiums built in the Bay Area in the last decade or so reduced displacement in the region by as much as building a new subsidized unit. On its face, this study puts to rest the old saw that building more market rate housing leads to more displacement: it doesn’t. In fact, building more market rate housing is an effective anti-displacement strategy.

In addition to effectiveness, we also have to consider cost. If governments faced a costless choice between so many market rate units and an equal number of subsidized units, and the only policy objective were reducing displacement, the answer would be clear. But building subsidized housing is hugely expensive for the public sector. As we pointed out last month, in the Bay Area, new subsidized housing in San Francisco, and in the East Bay costs as much as $700,000 per unit.
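A rough back-of-the-envelope sketch makes the tradeoff concrete. The assumptions here are ours, not the study’s: that each coefficient scales linearly per unit built, that the roughly $700,000-per-unit figure cited above is the public cost of a subsidized unit, and that market rate units cost the public sector nothing.

```python
# Back-of-the-envelope sketch of the cost comparison discussed above.
# Assumptions (illustrative, not from the study): coefficients scale
# linearly per unit built; market rate units cost the public nothing.
from fractions import Fraction

MARKET_COEF = Fraction(2, 1000)      # displacement reduction per market rate unit (-.002)
SUBSIDIZED_COEF = Fraction(5, 1000)  # displacement reduction per subsidized unit (-.005)
SUBSIDIZED_COST = 700_000            # assumed public cost per subsidized unit ($)

# Market rate units needed to match one subsidized unit's effect:
units_equivalent = SUBSIDIZED_COEF / MARKET_COEF          # 5/2 = 2.5

# Public cost of matching 1,000 market rate units' effect with subsidies:
subsidized_needed = 1000 * MARKET_COEF / SUBSIDIZED_COEF  # 400 units
public_cost = int(subsidized_needed) * SUBSIDIZED_COST

print(float(units_equivalent))   # 2.5
print(int(subsidized_needed))    # 400
print(f"${public_cost:,}")       # $280,000,000
```

On these assumptions, matching the anti-displacement effect of 1,000 privately financed market rate units would require 400 subsidized units and roughly $280 million in public funds, which is the sense in which "twice the impact" understates the cost difference.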

There’s a further point to be made here:  one of the key reasons that “affordable” housing is so expensive to build in places like San Francisco is that land prices are very high because of zoning restrictions. As a result, the same policies that facilitate market rate housing–more density, fewer parking requirements, clear and certain approval processes–would also make it less expensive to build affordable housing.

So it is literally true that building more subsidized units for lower income households is more powerful in reducing displacement. But it’s tremendously expensive as well. What the data here confirm is what economists–and musical chairs aficionados–have long maintained: increasing supply is critical to solving the housing affordability problem.

 

Using Yelp to track economic growth

We review Yelp’s new index for rating local economies:  It’s a good start

For a long time, the only comprehensive and reliable means we’ve had of tracking and comparing economic activity across state and regional economies has been official government statistics, such as those compiled by the Census Bureau and the Bureau of Labor Statistics. While these data have many virtues, there are often significant lags between the time data is collected and the time it is reported, especially for data with high levels of geographic or industry detail. In addition, as a rule, government data strictly protect the confidentiality of individuals, and so suppress data that might reveal the identity or precise location of a particular household or business.

The advent of big data, in the form of massive databases augmented with crowd-sourced information, adds a new dimension to our ability to track and measure local economies. One of the most exciting sources is Yelp, which tracks and publishes user reviews of millions of businesses. Yelp has just introduced its new “local economic outlook” which rates cities and neighborhoods based on their “economic opportunity.” The rankings are based on Yelp’s extensive data, and are summarized in the form of national rankings of cities (and a parallel ranking of the top 50 neighborhoods).

What color is your city’s dot? (Yelp: Local economic outlook)

Yelp has done its homework. Ed Glaeser, the dean of the nation’s regional economists, has authored a paper with two other economists testing the validity of Yelp’s business count data against the Census Bureau’s County Business Patterns data (CBP).  CBP is generally only available with a lag of a year or more and so isn’t a good guide to what’s happening right now in regional economies.  Glaeser and his co-authors find that Yelp’s data does a good job of predicting future trends in CBP data.  There is some variability: Yelp’s counts tend to be most accurate in dense, well-educated and higher income areas.  Also–unsurprisingly–Yelp’s coverage and accuracy has been steadily improving over time, and more closely agrees now with official statistics than it did just a few years ago.

It’s fantastic to get this data, but at least in its first iteration Yelp provides only rankings and doesn’t publish specific numerical estimates. We don’t know, for example, whether the cities in the top ten are 1 percent, 10 percent or 50 percent better than the median. All we have, in effect, is a top-to-bottom ranking of cities according to Yelp’s economic opportunity index. As we’ve always maintained, simple rankings that omit scalar data tend to generate a lot of heat, but actually shed little light. The Yelp report would be much more useful and interesting if it reported the actual numerical values of their index, and updated these figures on a monthly or quarterly basis. Yelp’s Carl Bialik tells us via email that they are working to extend the opportunity scores for future quarters, including releasing more raw data. We’re looking forward to this.

Because it is different from most conventional measures of economic activity, and because it is so new, it’s still a bit difficult to know exactly what the opportunity index measures, and whether its survival probabilities reflect short-term or more enduring differences in economic climate. It’s apparent from the methodological explanation (below) that they are looking at the opening and closing of business establishments, and using this information to compute survival probabilities in particular areas. But since they don’t publish numerical values for individual cities, or precisely reveal the formula for computing the index, it’s hard to interpret the rankings, or see how the opportunity index squares with other widely used measures of local economic activity. As we see more of the detail from the index, and can calibrate it against other measures, we’ll get a better idea of what the index is signaling and what it means.

The Yelp data is a promising and tantalizing look at how big, crowd-sourced data can help us develop a more timely and nuanced understanding of local economic activity. While their current rankings are a good way to promote awareness of the data, we hope they’ll do even more in the months ahead to publish and regularly update specific metro market indicators that others can use.

 

Winners and losers from rent control

A new study of San Francisco’s rent control shows it raises rents for some

Rent control is a perennially contentious issue. Many housing activists see it as a logical and direct way to make housing more affordable. Economists are almost unanimous that it makes things worse by promoting disinvestment and decreasing supply. Particularly in the US, the very limited experience with actual rent control means that there are relatively few places where it can be studied, and so often we’re forced to try to generalize from the experience of the few cities that have it (New York), or from foreign lessons (we reported on a study of Berlin‘s rent control last year.)

Credit: David Yu, Flickr

San Francisco has had rent control for some time, and a new paper shines a very bright light on how rent control there has affected housing affordability and the local rental market.  The paper is entitled  “The Effects of Rent Control Expansion on Tenants, Landlords, and Inequality: Evidence from San Francisco,” and is by Stanford economists Rebecca Diamond, Tim McQuade and Franklin Qian. From a research design perspective, the paper harnesses a unique aspect of San Francisco’s rent control law.  When originally adopted in 1979, the city rent control ordinance exempted small, “mom and pop” structures of four units or less. That changed in 1994, when a voter initiative extended rent control to smaller units built before 1980.

This discontinuity in San Francisco’s rent control system provides a convenient quasi-experiment for testing various hypotheses about the effects of rent control. As an experimental or treatment group, the study uses those pre-1980 1-4 unit homes that were subjected to rent control by the later 1994 changes to the ordinance. The control group is 1-4 unit homes built after 1980 but before 1992. One to four unit buildings may be the “missing middle” in other cities, but in San Francisco, they’re a major component of the housing stock, making up nearly 30 percent of rental housing in the city in 1990.

Rent control confers benefits for some, but raises rents for others

Those who are lucky enough to have a rent-controlled apartment garner substantial economic benefits, which Diamond, et al estimate as being worth about $2,300 to $6,600 per person annually, with total benefits of nearly $400 million per year. The benefits tend to go disproportionately to older and longer tenured renters. Those who are younger and have lived in an apartment for a shorter period of time are more likely to move out, allowing landlords to adjust rents upward (so-called “vacancy decontrol”).

The standard economic argument against rent control is that it decreases the supply of rental housing: owners of current buildings have strong incentives to convert them to other uses (reducing the housing stock, and thereby pushing up market rents). Proponents of rent control point to some jurisdictions where rent control has had little apparent effect on housing stock, but these tend to be cities that have very weak or lenient systems of rent control: Laws that don’t limit rent increases don’t have adverse housing supply impacts. So what about San Francisco? Rent control obviously lowers rents for those covered by the ordinance, but what has been the effect on housing supply?

Because rent controlled apartments give landlords a lower rate of return on their investment, many are looking for ways to opt out of rent control. Under California law, owners can evict tenants if they plan to occupy units themselves or if they resell the units as condominiums. In addition, landlords can offer tenants a buyout if they agree to vacate. As Diamond et al report, a key effect of San Francisco’s rent control ordinance has been to lower the supply of rental housing:

. . . compared to the control group, there is a 15 percent decline in the number of renters living in these buildings and a 25 percent reduction in the number of renters living in rent-controlled units, relative to 1994 levels.

As the pool of rental housing shrinks due to rent control, lower supply tends to push up rents market-wide. Diamond et al estimate that rent control has diminished rental housing supply by about 6 percent, and had the effect of driving rents up 7 percent. It doesn’t sound like much, but summed over the entire market, the value of the loss in welfare to renters is estimated at $5 billion.

Has rent control backfired?

One of the principal reasons for imposing rent control was to protect the ability of households of limited means to continue to be able to live in the city of San Francisco. And for some renters, it has clearly held down rents. But Diamond, et al point out that the indirect effects on housing supply have had the opposite effect, for several reasons. First, as we’ve related, owners have engaged in condo-conversions and buyouts, reducing the rental housing stock, with the effect that market-wide rents have risen. Landlords can also demolish older buildings, and their replacements aren’t subject to rent control. In addition, because landlords can raise rents to cover the costs of improvements, some landlords have renovated older units, raising their rents beyond the means of existing tenants. All of these actions are disproportionately concentrated in the city’s higher income and better educated neighborhoods. Diamond and her co-authors explain the results:

Taken together, we see rent control increased property investment, demolition and reconstruction of new buildings, conversion to owner occupied housing and a decline of the number of renters per building. All of these responses lead to a housing stock which caters to higher income individuals. Rent control has actually fueled the gentrification of San Francisco, the exact opposite of the policy’s intended goal.

Wonky details: Big data and welfare economics

If you’re looking for a creative application of “big data” this is it. Diamond and her co-authors have assembled an impressive and extraordinarily detailed database to facilitate their analysis. They’ve identified and tracked all the individuals reported living in San Francisco in 1990. Similarly they’ve identified all the apartments in the city, including their size, age, and whether they’ve been renovated or sold. They match this to data that shows condo conversions, track city rent levels, and estimate zip code level rents for the city.

One of the criticisms of rent control levied by economists is that it leads households to stay in apartments which they would be happier moving out of. Because rent control is tied to continued tenancy in a particular apartment, one might face a big rent increase if one moved to a different apartment, say in response to a change in jobs or household membership. Over time this produces a mismatch between the kind and location of apartment a household might like to live in and their rent-controlled apartment. Those for whom the mismatch grows too extreme give up their rent controlled apartment and move elsewhere. But many stay, with their gains from rent control compensating them for the disutility of an apartment that no longer closely matches their preferences. Diamond, et al, examine these welfare effects and find that they are largest for younger households, who are more likely to experience changes in jobs or household circumstances that make remaining in their rent controlled apartment a larger departure from their preferences.

This study offers some of the most reliable and detailed analysis of the effects of rent control yet undertaken. It suggests that while there are some winners from rent control–those who have the good fortune to obtain rent-controlled apartments (especially older, and long-term residents whose housing preferences are unlikely to change), the net effect of rent control is at best mixed, and while some pay lower rents, others will end up paying higher rents.

 


Signs of the times

“For Rent” signs are popping up all over Portland, signaling an easing of the housing crunch and foretelling falling rents

A year ago, at the height of the political season in deep blue Portland (in a county which voted 76 percent for Hillary Clinton) only one thing was rarer than Donald Trump lawn signs:  For Rent signs. Portland was facing a housing shortage.  Vacancy rates had been plummeting, and in early 2016, apartment rents were going up at double digit rates. The housing crisis prompted the City to adopt an ill-advised inclusionary zoning ordinance, and led the state to flirt with authorizing rent control. That was then.

What a difference a year makes. More new apartments are coming on line, and now for rent signs are more plentiful than mushrooms after the first autumn rains.  Take a look:

We photographed these signs in the space of about an hour on a recent sunny afternoon in Portland’s close-in East side neighborhoods. These signs are all in front of existing, older apartments. In addition, the Portland area is seeing an increasing number of new apartments coming on to the market.  Between freshly built new apartments and a profusion of vacancies in existing buildings, the rental market appears to be changing. After lagging well behind growing demand for the past several years, housing supply is catching up. And this is just starting to have an effect on rents.  According to data compiled by Zillow, inflation in Portland area rents fell from a peak of more than 10 percent year over year in 2015 and 2016, to less than zero in August 2017.

The big question going forward is whether rents will decline in earnest.  The current abundance of vacancies, and the 19,000 or so new apartments in the City of Portland’s pipeline, many coming on the market in the next few months, suggest that this is a distinct possibility.

What’s happening here is a good example of how the market works. To be sure, Portland, like a lot of cities, has experienced a temporal mismatch between demand and supply. In the wake of the great recession, demand turned around quickly as more people moved to the region and job growth returned, but new apartment construction has taken several years to rebound from the downturn. For several years, culminating in 2015 and 2016, demand outpaced supply, and pushed down vacancy rates, causing rents to surge. Now it appears the reverse is true–the number of new units being delivered to the market is growing faster than demand for housing–which is producing this bumper crop of for rent signs. The more apartments stand vacant, and the longer they go unfilled, the greater the pressure on landlords to drop prices. Already, some newer apartments are offering move-in bonuses, like a month’s free rent. It’s a sign that the market is turning.

What the signs mean

Ultimately, we think this flowering of “for rent” signs disproves two of the most durable myths about the housing markets.

The first myth is that you can’t make housing affordable by building more of it, particularly if new units are more expensive than existing ones. The surge in vacancies in existing apartments is an indication of the interconnectedness of apartment supply, and an illustration of how construction of new high end, market-rate units lessens the price pressure on the existing housing stock. When you don’t build lots of new apartments, the people who would otherwise rent them bid up the price of existing apartments. The reverse is also true: every household that moves into a new apartment is one fewer household competing for the stock of existing apartments. This is why, as we’ve argued, building more “luxury” apartments helps with affordability.  As our colleagues at the Sightline Institute recently observed, you can build your way to affordable housing. In fact, building more supply is the only effective way to reduce the pressure that is driving up rents.

The second myth is that high rents are somehow the product of an epidemic of greed on the part of landlords. There’s no evidence in Portland that landlords have suddenly had a change of heart, renounced avarice and decided to stop raising rents. Landlords find it difficult to get new tenants if they’re charging higher rents than the apartment down the street.  With so many “for rent” signs, landlords who want to hike the rent are going to have to wait a long time to find tenants. As our colleague Daniel Kay Hertz observed, the reason that housing is more affordable in Phoenix than San Francisco is not because Arizona landlords are somehow uniformly kindlier and more generous, but because the supply of housing is so much more elastic. The most effective check on “greedy landlords” is lots of competition, in the form of more supply.

We’ll watch the situation closely in the next few months, but all the signs point to an improvement in Portland’s housing affordability.

Metro economies pulling away nationally

Unemployment rates are down in cities, especially for those with less education

One of the trends we’ve been following at City Observatory has been the increasing shift of the driving forces of the nation’s economy to large metro areas, and within these areas to cities. The industries that are flourishing in today’s economy are ones that are most competitive and productive in urban environments. As a result, the locus of opportunity in the US is shifting toward cities, and away from rural areas. There are a variety of ways of measuring this shift, and one is to look at trends in the labor market, with a focus on variations in unemployment rates over time between cities and rural areas.

A new study from the Board of Governors of the Federal Reserve System, Labor Market Outcomes in Metropolitan and Non-Metropolitan Areas:  Signs of Growing Disparities, charts the relative progress of urban and rural areas in reducing unemployment. Fed Economist Alison Weingarden notes that as the economic expansion has proceeded, unemployment rates have come down much more sharply in large metro areas than in smaller ones and in rural areas.  In the 1990s and the first decade of the 2000s, unemployment patterns among prime aged workers (those between 25 and 54) were fairly similar, and followed similar paths in large metros, smaller ones and rural areas.  But since the Great Recession, a widening gap has emerged between relatively prosperous metro economies and struggling rural ones.  Overall, unemployment rates in rural areas, which were roughly comparable to those in metro areas during the depths of the recession, are today noticeably higher. The latest data show rural area unemployment rates seem to have bottomed out at 5 percent, small metro areas have settled around 4.5 percent, and large metro areas have overall unemployment rates below 4 percent.

The economic fate of those with lower levels of education also seems to be relatively worse in rural areas.  The Fed report compares the unemployment rate of those with a high school diploma or less in rural areas with those with the same level of education living in metro areas.  Prior to the Great Recession, the two rates were similar, and during the worst of the recession and in the earlier years of the recovery, it was noticeably better for your employment prospects to be in a rural area if you had a limited education. But in the past four years, the unemployment rates for high school graduates in metro and non-metro areas have diverged sharply.  Today, unemployment rates for those with just a high school diploma are nearly 2 percentage points higher in rural areas than in metropolitan ones.

These data are consistent with a continuing shift of economic activity to urban areas, and the limited opportunities, especially for those with less education, living in rural areas. Another implication of this data point is that even those with relatively less education (just a high school diploma) have better prospects of being employed if they live in a metro area than if they live in a rural one.

Weingarden, Alison (2017). “Labor Market Outcomes in Metropolitan and Non-Metropolitan Areas: Signs of Growing Disparities,” FEDS Notes. Washington: Board of Governors of the Federal Reserve System, September 25, 2017, https://doi.org/10.17016/2380-7172.2063

Portland’s Inclusionary Zoning Law: Waiting for the other shoe to drop

Developers stampeded to get grandfathered before new requirements took hold, will the pipeline run dry?

In December, Portland’s City Council adopted one of the nation’s most sweeping inclusionary zoning requirements.  Most new multifamily housing projects will have to set aside 20 percent of their units for families earning less than 80 percent of area median income (or alternatively 10 percent for families earning less than 60 percent). While the ordinance is intended to increase the supply of affordable housing, it creates a major burden on new developments, and may therefore actually reduce the housing supply. We and others will be watching closely to see what happens.

The measure took effect on February 1, 2017, giving developers a narrow window to file land use applications prior to the new rules taking effect. As of February 1, the city had an inventory of development approval requests for nearly 19,000 units of housing, about a 3-4 year supply given the rate at which new multi-family housing has been built in Portland in the past several years. Developers of these projects have until 2020–and if they nurse their projects through the permitting process, even longer–to move forward with their construction plans.

Last week, Portland’s Bureau of Planning and Sustainability released a short report describing some of the results of the first six months of experience under the new inclusionary zoning ordinance. Writing for Portland for Everyone, Michael Andersen has a very upbeat, glass-is-half-full story stressing the 60 or so affordable units that might be attributable to the new ordinance (if they move through the permitting process and actually get the promised incentives). Last week, City Observatory attended a meeting of city staff and other interested parties that reviewed the report.  Here’s what we learned.

So far, between the glut of projects filed just before the new rules took effect, and the uncertainty and cost associated with the new requirements, new private apartment development proposals in Portland have all but disappeared. Since February 1, according to city officials, there have been no new private apartment projects of more than 20 units submitted for land use review. Five publicly sponsored projects are moving forward, and three projects permitted under the old rules have submitted new applications seeking the density bonuses and parking requirement waivers available under the new inclusionary ordinance.

Smaller units and the small building exemption

The new inclusionary zoning rules appear to be creating incentives for developers to “go small” either to avoid the inclusionary zoning requirements entirely or to minimize the cost of compliance. At the city-sponsored meeting to review the first six months of progress under the inclusionary zoning ordinance, several observers pointed out that the projects that seem to be going forward under the new law are coming from developers of “micro-apartments”:  very small studios. The city’s ordinance requires that “inclusionary” units be comparable in size and amenities to a building’s market rate units, so if a developer builds tiny market rate apartments, it faces comparably smaller costs of building its “affordable” ones. Meanwhile, it may be able to qualify for the full value of incentives (parking waivers, floor-area bonuses, and property tax exemptions).  City staff agreed that micro-apartments are more likely to be financially viable under the inclusionary zoning program than are larger apartments.

Some developments may avoid the new inclusionary requirements entirely: the city’s inclusionary zoning ordinance applies only to buildings with 20 or more units. City officials are looking to see whether there’s been an increase in applications for 15- to 19-unit buildings.  Data for the first six months of the inclusionary program (February 1 to August 1, 2017) show that 10 projects of 15 to 19 units were submitted.  In the previous 12 months, the city permitted 16 projects in this size group. This fragmentary data suggests that on an annual basis, the number of such projects permitted has increased 25 percent (from 16 per 12 months to an annualized 20).  Given the time lags in developing projects and seeking permits, however, it’s unlikely that the market has yet had time to react to the under-20-unit exemption. City staff will want to track this statistic closely in the months ahead.
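The annualization behind that 25 percent figure is simple arithmetic; a minimal sketch (assuming, for illustration, that the six-month pace of filings holds for a full year):

```python
# Annualizing the fragmentary permit counts discussed above
# (assumes the six-month filing pace continues for a full year).
prior_year = 16   # 15-19 unit projects permitted in the prior 12 months
first_half = 10   # projects submitted in the first 6 months under the ordinance

annualized = first_half * 2                        # projects per 12 months
pct_change = (annualized - prior_year) / prior_year

print(annualized)            # 20
print(f"{pct_change:.0%}")   # 25%
```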

For the immediate future, the city’s housing supply and rental affordability will benefit from the land rush triggered by the inclusionary housing requirements. It will take two to three years for developers to construct the 19,000 or so units that are now grandfathered under the old rules. As this inventory is gradually liquidated, however, the big question is whether developers will step forward and propose projects under the new requirements.

What to watch for

The key measure of program success will be whether new privately sponsored apartment projects move forward. If Portland doesn’t start seeing proposals for new 20+ unit developments soon, that’s a bad sign. It will mean that developers are being deterred by the cost and uncertainty associated with the inclusionary housing requirements.

A dearth of new apartment projects would be the clearest signal of trouble, but there are a couple of other measures to look at as well:  First, how does what happens in Portland compare to what happens in the rest of the region? Portland’s housing market is regional, and while city locations are in high demand, it’s possible that some development could shift to other locations. In effect, the region’s suburbs represent a kind of control group: none of them has a similar inclusionary housing requirement. If apartment construction proceeds apace or accelerates in nearby suburbs (some of which abut Portland), while stagnating in Portland, that would be a sign that the inclusionary zoning program is pushing development away.  It would be ironic if a program that labels itself “inclusionary” actually has the opposite effect by excluding housing and additional population from the city.

Another indicator of impact is what kind of units get built in Portland.  If the new development projects are disproportionately under the 20 unit threshold, or if new projects skew heavily toward micro-apartments, that would be an indication that the inclusionary zoning requirements are warping the housing market. Incentivizing developers to stay under 20 units may mean that some sites that could accommodate greater density are under-built, which essentially wastes public resources and promotes sprawl. Similarly, encouraging developers to build more micro-apartments may push down the price of studio apartments, but will do little to help the affordability of larger units that can accommodate families.

Finally, if it’s concerned about the impacts on housing supply, the city ought to be carefully monitoring changes in land prices. It’s often argued that inclusionary requirements won’t affect development because developers will simply bid less for land, effectively passing at least some of the cost of compliance on to landowners. If that theory is correct, we ought to observe a decline in the prices paid for land that could accommodate apartments in Portland.

In short, it’s too soon to tell what the effects of the inclusionary housing mandate will be. The negative effects of the ordinance will be concealed and delayed by the big backlog of housing permitted under the old rules. But when that inventory is gone, the real effects of the ordinance will be more apparent.

 

 

Transportation equity, part 2: the Subaru and the Suburban

Flat per vehicle registration fees charge lower rates to wealthier households with more road damaging vehicles

The prospect of shifting from using a combination of vehicle registration fees, fuel taxes and general revenues to pay for roads, to a system of road pricing, which would charge vehicle users for traveling on expensive and congested roadways, has provoked claims that it’s somehow inequitable. Last week, we took a look at the average incomes of people who drive to work at peak hours, compared to those who walk, bike or take transit. Peak hour drivers have average household incomes nearly double those of other commuters. On average, road pricing systems that employ peak hour fees will tend to put more of the burden of paying for roads on those with higher incomes.

One of the assumptions of those who question the “fairness” of road pricing is the notion that somehow our current system of paying for roads is a fair one. But what’s equitable about our current system of paying for roads? Not much, as it turns out. Today, we’ll take a close look at a principal way that many states pay for roads:  vehicle registration fees. In Oregon, for example, vehicle registration fees account for about 35 percent of the roughly $900 million the state collects annually in fees on passenger vehicles and light trucks. In Oregon and many other states, these registration fees or “car tabs” are flat fees that aren’t related to a vehicle’s value, how much it’s driven, or how much wear and tear it causes on public roadways.

Two of the foremost principles of public finance are the “ability to pay” principle and the “benefit” principle.  The ability to pay principle means those with higher levels of income ought to be expected to pay more toward the cost of public services than those with more limited means.  The benefit principle means that costs ought to be allocated to people in proportion to the benefit they receive from public services. Fixed, per vehicle registration fees violate both these tenets of public finance.  Vehicle fees aren’t related at all to how much damage a vehicle does to the roadway (a proxy for repair and maintenance costs), nor are they related to the income or ability to pay of the vehicle owner.

To show how inequitable this can be, let’s look at a particular case in point:  two car-owners in the Portland metro area. One is a City of Portland resident who owns a 15-year old Subaru Outback; the other is a resident of suburban Clackamas County who owns a nearly new Chevrolet Suburban. The Portland resident lives close to most of her common destinations, and has good access to transit and bikeways, and so drives her aging Subaru about 6,000 miles per year. The Suburban-owning suburbanite has to commute a long distance to work, and drive to most common destinations, and ends up putting about 15,000 miles on her vehicle.  (We’ve chosen our two examples to represent some of the most blatant inequities in the current system; as it turns out, Multnomah County’s local fee is primarily to pay for a bridge across the Willamette River that’s used both by residents of Multnomah and Clackamas Counties; but Clackamas County declined to impose a registration fee on its residents).   Here are our two sample vehicles.

2002 Subaru Outback: $4,000
2017 Chevrolet Suburban: $53,000

In Oregon, all vehicles currently pay a flat registration fee.  In most counties, vehicles pay just the state fee; in Multnomah County, where Portland is situated, there’s an additional local registration fee, collected by the state.  We’ve gathered data from current Craigslist advertisements for the two different vehicles shown below. Here’s the annual cost of registering our two vehicles, in 2017.

Vehicle 2002 Subaru Outback 2017 Chevy Suburban
Market Value $4,000 $53,000
County Registered Multnomah Clackamas
Registration Fee (Annualized) $62 $43
Fee/Value 1.6% 0.08%
Miles/Year 6,000 15,000
Fee/Mile $0.010 $0.003

The 2002 Subaru pays a $62 fee per year; its Suburban counterpart pays a $43 fee. How do these fees relate to their owners’ ability to pay and their use of the roadway? First, as a share of each vehicle’s value, there’s a huge disparity.  The Subaru’s annual registration fee works out to about 1.6 percent of its value; the Suburban is taxed at a rate of just 8 one-hundredths of one percent of value–a rate about 20 times lower. In general, there’s a strong relationship between the value of vehicles owned and personal income, so viewed from the perspective of ability to pay, flat vehicle registration fees are highly regressive. If registration fees in Oregon were tied to a vehicle’s value and the Suburban owner paid the same rate as the Subaru owner does today, her registration fee would be over $800, rather than $43.

A second way to look at registration fees is to work out how much each vehicle owner is asked to pay per mile driven.  As a practical matter, the marginal cost of vehicle registration per additional mile driven is zero. But let’s focus on how much money state and local governments are collecting on a per mile basis.  The Subaru driver, who travels 6,000 miles per year, pays a vehicle registration fee that works out to about 1 cent per mile. In contrast, the Suburban driver (living in a suburban county with a lower annual fee), who drives 15,000 miles per year, pays just three-tenths of one cent per mile.  So the net effect of fixed, per vehicle registration fees is to load more of the costs of driving onto those who burden the system least, and effectively subsidize those who drive the most.
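Both ratios in the table are easy to reproduce. Here's a back-of-the-envelope check in Python (vehicle figures from the table above; the dictionary layout is just for illustration):

```python
# Compute fee-to-value and fee-per-mile for the two example vehicles.
vehicles = {
    "2002 Subaru Outback":     {"value": 4_000,  "fee": 62, "miles": 6_000},
    "2017 Chevrolet Suburban": {"value": 53_000, "fee": 43, "miles": 15_000},
}

for name, v in vehicles.items():
    fee_to_value = v["fee"] / v["value"] * 100  # percent of market value
    fee_per_mile = v["fee"] / v["miles"]        # dollars per mile driven
    print(f"{name}: {fee_to_value:.2f}% of value, ${fee_per_mile:.3f}/mile")
```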

When people talk about road financing, we usually focus on the gas tax.  But for those who drive fuel efficient vehicles relatively few miles per year, the vehicle registration fee is actually almost as big a cost as the gas tax.  Oregon currently charges a 30 cent per gallon state gas tax.  Our Subaru driver, going 6,000 miles per year at 25 miles per gallon buys about 240 gallons of gasoline annually, and pays a total state gas tax of $72–only $10 more than her registration fee.
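The gas tax comparison works out the same way (the mileage, fuel economy and 30-cent tax rate are the figures in the text):

```python
# Annual state gas tax for the Subaru example.
miles, mpg, tax_per_gallon = 6_000, 25, 0.30
gallons = miles / mpg                  # gallons of gasoline purchased per year
annual_gas_tax = gallons * tax_per_gallon
print(f"{gallons:.0f} gallons/year, ${annual_gas_tax:.0f} in state gas tax")
```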

The violation of the “benefit” principle by the flat registration fee is amplified by the fact that larger vehicles cause dramatically more wear and tear on pavement than lighter ones. Estimates vary, but a good rule of thumb is that road wear increases with the fourth power of a vehicle’s weight. This means that a 5,600 pound Suburban causes about ten times as much road wear per mile as a 3,100 pound Subaru. (The math:  5,600 raised to the fourth power is about ten times 3,100 raised to the fourth power.) And in our example, since the Suburban is driven two and a half times as many miles as the Subaru, the combination of its greater weight and additional mileage means that on an annual basis it’s doing more than 25 times as much damage to the roadway. (And we’ll completely ignore whether the Suburban has studded tires, which collectively do an estimated $50 million in damage to Oregon roadways each year.)
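The fourth-power rule of thumb can be checked directly; a minimal sketch using the weights and mileages from the example (the rule itself is an approximation, not an exact engineering result):

```python
# Road wear scales roughly with the fourth power of vehicle weight.
subaru_lb, suburban_lb = 3_100, 5_600
per_mile_ratio = (suburban_lb / subaru_lb) ** 4   # wear per mile, Suburban vs. Subaru
annual_ratio = per_mile_ratio * (15_000 / 6_000)  # Suburban also drives 2.5x the miles

print(f"per mile: {per_mile_ratio:.1f}x, per year: {annual_ratio:.0f}x")
```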

Whether we look at it from the standpoint of ability to pay or from the standpoint of the benefit principle, there’s little that’s fair about a system of flat registration fees that bear no relationship to a vehicle’s value, how much it’s driven, and how much wear and tear it causes to the roadway. It’s reasonable to ask whether any newly proposed system of road pricing is equitable; but if we apply the same test to major features of the existing system, we find it wanting to an even greater degree. If we’re really interested in making road finance fairer–especially to those low and moderate income households with older, less valuable cars, who drive shorter distances–we ought to move away from flat per vehicle fees, toward charges that are related to how much people drive, and how much wear and tear their cars cause to the roadway.

 

Transportation equity: Why peak period road pricing is fair

Peak hour car commuters have incomes almost double those who travel by transit, bike and foot

The Oregon Legislature has directed the state’s department of transportation to come up with a value pricing system for interstate freeways in the Portland metropolitan area.  A key idea behind value pricing is that it would charge those who use freeways at congested peak hours a higher toll than at other times; tolls might even be zero in off-peak hours, giving travelers strong incentives to use the system when there was available capacity. One of the concerns that’s been raised about value pricing is that it will be an undue burden on the poor or low wage workers who might have little choice but to travel at the peak hour. Bike Portland‘s Jonathan Maus explains how this came up at a recent Portland City Council meeting, in testimony presented by the Oregon Department of Transportation’s public affairs staffer Shelli Romero:

Surprisingly, she [Romero] also attempted to impugn congestion pricing in general with a strange and unfounded jab. “Several people have brought up the issue of congestion pricing,” she said, “but there has been very little mention about how equity considerations, when you look at congestion pricing on this section of I-5, would be taken into consideration.” This is an odd statement from the staffer of an agency that has a mandate from the legislature to create and implement a congestion pricing program.

Of course, it’s always possible to come up with an anecdote of a struggling minimum wage worker who drives an hour or more each way, always in peak hour traffic, and who would find tolls burdensome.  Although, strangely, this argument seldom seems to be made about those who pay a bus fare that’s the same amount regardless of income.

Rather than rely on dueling anecdotes, we thought we’d take a look at the data.  How do the household incomes of those who drive to work compare to those who don’t? The American Community Survey gives us a window into the commuting patterns of the nation’s workers, and lets us look at variations in family income. We’ve used the indispensable IPUMS website to extract data on commuting choices and family income for the Portland metropolitan area.  The data are drawn from the five-year 2011-15 sample, and include all adults (persons aged 18 and older).

First, let’s compare the family incomes of those who travel to work by car with those of people who aren’t in the labor force (non-workers). On average, adults who aren’t working (including those not in the labor force, the unemployed, retirees and students) live in households with incomes of just under $40,000 per year; those who commute to work by car have incomes about 75 percent higher ($73,600). Similarly, those who take transit (median household income of just under $45,000) and those who walk or bike to work (median incomes of just over $42,000) have much lower incomes than those who drive to work.  Overall, the median income of those who commute to work by car is more than 50 percent greater than that of those who aren’t workers or who travel by other modes.

Peak hour drivers have higher incomes

A second thing to keep in mind is that congestion pricing programs invariably charge higher fees to those who travel at peak travel times. But many workers, especially those with lower wages, don’t work a regular 9 to 5 schedule, and as a result, don’t travel during peak hours.

The American Community Survey provides a useful glimpse of these daily travel patterns. One of the questions it asks is what time workers usually depart on their journey to work.  (Unfortunately, the Census doesn’t ask what time workers usually leave work to travel home.)  Nonetheless, the ACS gives us a good picture of which workers are traveling to work in the morning rush hour, so we use this data to divide workers into “peak” and “non-peak” travelers, treating those who routinely leave for work between 7:02 AM and 8:02 AM as “peak” travelers, and everyone else as “non-peak.”
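The peak/non-peak split described above amounts to a simple bucketing rule. A minimal sketch (the 7:02–8:02 AM window is the one used in the text; the function name is ours):

```python
# Classify a worker's usual departure time as peak or non-peak.
PEAK_START = 7 * 60 + 2  # 7:02 AM, in minutes after midnight
PEAK_END = 8 * 60 + 2    # 8:02 AM

def is_peak(departure_minutes):
    """True if the usual departure time falls in the morning peak window."""
    return PEAK_START <= departure_minutes <= PEAK_END

print(is_peak(7 * 60 + 30), is_peak(9 * 60))  # 7:30 AM vs. 9:00 AM departures
```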

For this analysis, we examine only those who commute to work by car — our objective here being to sort out the relative incomes of peak hour car commuters compared to those who drive to work at non-peak hours.

The following chart shows the results:

The median family income of those who drive at the peak hour (estimated from the morning peak) is nearly $83,000, about 20 percent higher than for those who drive to work at other hours.

These data suggest that peak hour road pricing predominantly affects those with the highest incomes. Those who don’t work, who travel to work by walking, cycling or transit, and those who commute to work in off-peak hours have incomes that are significantly lower than peak hour road users.

It’s also important to note that accommodating peak hour drivers is the most expensive component of the transportation system:  One of the most unfair aspects of our current system of paying for roads is that it charges everyone the same amount, regardless of whether they use the road when it’s congested or when few other people are on it.  A system that shifts more of the cost of the road system to peak hour users is fairer and more progressive than one that ignores mode and time of travel, as today’s road finance system largely does. Far from being inequitable, peak hour pricing asks those who place the greatest demand on the transportation system and have the highest ability to pay to take financial responsibility.  That, more or less, is the definition of fairness.

Many thanks to the IPUMS team at the University of Minnesota:

Steven Ruggles, Katie Genadek, Ronald Goeken, Josiah Grover, and Matthew Sobek. Integrated Public Use Microdata Series: Version 6.0 [dataset]. Minneapolis: University of Minnesota, 2015. http://doi.org/10.18128/D010.V6.0.

Racial wealth disparities: How housing widens the gap

The wealth of black families lags far behind whites, and housing markets play a key role

There’s a great article from The New York Times’ Emily Badger about a new study that shows just how much Americans (especially white Americans) underestimate the gap in the economic circumstances between black and white families. The study also makes the point that we tend to greatly overestimate the amount of progress that’s been made in closing that gap.

The Times’s story is based on research by Yale’s Michael Kraus, Julian Rucker and Jennifer Richeson, entitled “Americans misperceive racial economic equality.” Their paper compares a series of surveys about perceptions of earnings, income and wealth gaps between blacks and whites with data gathered by the Census Bureau. The headline finding is that the average respondent thinks that black wealth is about 80 percent that of whites, whereas Census data suggest that black wealth is about 5 percent that of whites.

Let’s zero in for a moment on the question of the wealth disparity. While we have multiple measures of income, we actually have relatively few measures of the wealth of American households. One survey conducted by the Census Bureau (the Survey of Income and Program Participation, SIPP) asks questions about financial holdings and debts. The other survey is undertaken on a triennial basis by the Federal Reserve Board (the Survey of Consumer Finance, SCF). The SCF asks more detailed questions about investments, banking, credit, automobile and home ownership and related issues. There’s actually a terrific analysis by the Federal Reserve’s Jeffrey Thompson and Gustavo Suarez, entitled “Exploring the Racial Wealth Gap Using the Survey of Consumer Finances.”

We’ve plotted data from the Thompson & Suarez report for the period 1989 through 2013 to chart the median net worth of black and non-Hispanic white households. The data are shown in 2013 dollars. The red line corresponds to the net worth of black households, the blue line to non-Hispanic white households (values on the left axis), and the gray bars show the median net worth of black households as a percentage of that of non-Hispanic white households (measured on the right axis).

A couple of observations: First, as of 2013, the net worth of the typical household hadn’t rebounded to pre-recession levels. This was true for white and black households alike. But the decline for black households was proportionately greater than for whites. The median net worth of black families fell 42 percent, from $19,200 in 2007 (on the eve of the Great Recession) to $11,100 in 2013.  The median net worth of white families declined as well, but by only 27 percent, from $183,500 in 2007 to $134,100 in 2013.

Second, looking back at the longer historical record, it’s quite clear that during the 1990s in particular, black households were actually closing the wealth gap with their white counterparts.  In 1989, the typical black household had a net worth that was only 5.6 percent of the typical white household’s.  By 1998, black households’ net worth was 16.3 percent of that of whites. Black households treaded water during the early years after 2000, and have clearly lost ground relative to whites in the wake of the Great Recession.  Today median black wealth stands at just 8.3 percent that of whites.  (This figure is slightly higher than the 5 percent reported in The New York Times story; excluding the value of owner-occupied homes, the SIPP reports that black wealth was about 5.3 percent that of white households in 2013.)
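The percentages quoted above follow directly from the Thompson & Suarez medians (all in 2013 dollars):

```python
# Median net worth, black and non-Hispanic white households (2013 dollars).
black_2007, black_2013 = 19_200, 11_100
white_2007, white_2013 = 183_500, 134_100

black_decline = (black_2007 - black_2013) / black_2007 * 100  # ~42 percent
white_decline = (white_2007 - white_2013) / white_2007 * 100  # ~27 percent
gap_2013 = black_2013 / white_2013 * 100                      # ~8.3 percent

print(f"declines: black {black_decline:.0f}%, white {white_decline:.0f}%; "
      f"black/white ratio in 2013: {gap_2013:.1f}%")
```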

So what’s the explanation?

A lot of this has to do with housing markets, housing policy and the housing cycle. Households with good access to credit prior to the housing bubble were in the best position to profit from the run-up in house prices (and note that white net worth outpaced black net worth from 2001 onward). As we’ve explained at City Observatory, low income households generally, and households of color in particular, tend to suffer from bad market timing: buying a home later in the housing cycle (when prices were higher) exposed them to more risk when housing markets collapsed. Moreover, housing is a larger fraction of the net worth of low income households and households of color, so when housing prices went down, they were harder hit than the typical white household (which had a much more diversified wealth portfolio).

There’s also an important spatial bias in black household homeownership. Black households tend to buy and own homes in neighborhoods with greater price volatility, especially on the downside. As Zillow demonstrated, the housing bust produced sharper and more sustained declines in home prices for households of color than for whites.

The takeaway–while it’s certainly true that white households have a huge (and widely under-estimated) edge in wealth, it’s not the case that we have not made progress. The decade of the 1990s stands out as a period in which the wealth of black households increased significantly relative to their white counterparts. What’s remarkable is that the housing bubble and the Great Recession essentially erased all of the relative gains in black household wealth from the 1990s. The lesson of the last twenty years seems to be that encouraging greater homeownership is not just ineffective in reducing the racial wealth gap, but is actually counterproductive.

And there’s a post-script here:  As startling as the wealth gap is between blacks and whites, it’s even sharper between owners and renters. According to the Census Bureau, the median net worth of a homeowner in the United States was $199,600.  The median net worth of renters is $2,200, barely one percent of that amount.  This disparity speaks strongly to the subsidies and tax preferences for housing as an investment. But it also shows that we have little if anything to offer in the way of a wealth-building strategy to the roughly one-third to 40 percent of the nation’s households who rent their homes. Given the financial perils of encouraging homeownership for those with modest incomes, we ought to be devoting more attention to mechanisms to help families build wealth without having to go long in the real estate market.

 

Cities lead national income growth, again

Average household income in cities is increasing twice as fast as in their suburbs

Earlier this week, the Census Bureau released its latest estimates of national income based on the annual Current Population Survey. The data show some good news: a continued improvement in household incomes and a reduction in poverty. Median household income increased 3.2 percent to $59,039.  Significantly, poverty rates declined in 2016, by almost a full percentage point, from 13.5 percent to 12.7 percent.

But drilling down more deeply into these data shows that city economic growth, as measured by changes in average income, has been especially strong. Median household income in “principal cities”–the most populous municipality in a metro area–was up 5.4 percent over a year earlier. Meanwhile, incomes in areas outside the principal city but still inside a metro area rose only about half as much: 2.1 percent. Although using municipal boundaries and principal cities to demarcate “city” and “suburb” is imperfect, these data suggest that income growth in cities is significantly outstripping that of suburbs.

This performance echoes a similar strong gain in city incomes that we reported on last year.  In 2015, city incomes out-paced suburban incomes according to Census Bureau tabulations 7.2 percent to 4.0 percent.

While the Census data show a clear pattern of city incomes outpacing suburban ones, they shed little light on the exact causes.  Is it the incomes of existing city residents that are increasing faster than the incomes of existing suburban residents, or is it the product of migration?  For example, if high income households are moving from suburbs to cities, and low income households are moving in the opposite direction, that would tend to accentuate city income growth and retard suburban income growth. The fraction of the population that moves in any one year is so small, however, that it’s unlikely to be the major cause here.

And for those who are worried about a so-called “Great Inversion”–the idea that cities are now dominated by the rich while suburbs are populated largely by the poor–that’s clearly not the case.  Even after a couple of years of much faster income growth in cities, average household incomes in cities are still, in the aggregate, much lower than in the suburbs.  The median income of households in principal cities is $54,800, which is about 17 percent lower than that of households living outside these principal cities in metropolitan areas ($66,300).  In 2016, cities closed the income gap with suburbs by about 2.5 percentage points.  In addition, poverty rates in cities (15.9 percent) are still noticeably higher than in suburbs (10.0 percent).
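A quick check of the city-suburb gap from the CPS medians quoted above:

```python
# Median household income, principal cities vs. the rest of metro areas.
city, suburb = 54_800, 66_300
gap_pct = (suburb - city) / suburb * 100
print(f"city median is {gap_pct:.0f}% below the suburban median")
```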

This is yet more evidence of the growing strength of city economies. As we’ve pointed out at City Observatory, job growth in urban centers has been robust in this economic cycle, outpacing that of suburbs in many metro areas around the country. The demand for urban living is also fueling city income and economic growth.

Cognitive dissonance on the Potomac

How can a city be named the first “LEED Platinum” city and be building freeways in its suburbs?

Submitted for your approval: two recent news items from our nation’s capital.  In the first, Washington DC proudly announced that it has been proclaimed the world’s first LEED Platinum city–based on the number of LEED-certified buildings it’s built in the past decade.  Here’s a story from Washington’s WAMU:

 

The second news item is about the region’s progress toward building a $2.3 billion, 22-mile long I-66 freeway widening project to fuel ever longer commutes from its most distant, and still sprawling suburbs.  To be fair, several of the lanes will be high occupancy toll lanes, and will price travel and encourage carpooling; but the net effect of the project is to greatly expand the capacity for car travel. Meanwhile, massive maintenance and repair problems plague the DC Metro subway system, and its financial health seems to be in even worse shape.

To us, this raises a big question:  How can you be a LEED Platinum city if you are spending billions widening freeways and your public transit system is in physical and financial disarray?

The LEED Platinum City status is handed out by the US Green Building Council, and while it’s a step forward from the structure-by-structure approach to sustainability, it looks a lot like a lifetime achievement award for getting (and ponying up for) LEED certification for new buildings.  And keep in mind, there are real questions as to whether LEED certified buildings are, on average, any more energy efficient than other new buildings. Washington DC has more than 120 LEED certified projects according to the Green Building Council, but that’s still a tiny fraction of the city’s entire building stock of tens of thousands of commercial, office, industrial and residential properties.  As related by WAMU, erecting lots of LEED certified buildings seems to be a key reason Washington earned platinum status:

On the green building front in particular, the District leads the way within the United States. The city has more LEED-certified projects per capita than any state. Many of those facilities are part of the D.C. public school system, including Brookland Middle School, which recently became the third D.C. Public School facility to receive a LEED Platinum certification.

Like many other US city leaders, DC Mayor Muriel Bowser makes a point of declaring the city’s commitment to following the Paris Climate Accords. But a rhetorical commitment to fighting climate disruption requires something more than slightly more efficient buildings.  In most cities, transportation is a principal source of greenhouse gas emissions.  No matter how many LEED buildings you build inside the beltway, if your regional transportation system is built on ever-expanding freeways, it’s hard to see how you can think of your city as “sustainable.”

We seem to be stuck in this unfortunate world of cognitive dissonance, where mayors and architects proudly attach LEED plaques to relative handfuls of green buildings in one part of town, while at the same time, just down the road, massive amounts of public resources are subsidizing auto travel and sprawl.

As we suggested when it came to the solar powered 1,400 space parking garage put up by the National Renewable Energy Laboratory, and a Zero Net Energy Home with a sub-50 walk score, it takes a particular kind of tunnel vision to look only at the energy used by particular buildings, and to completely ignore the energy and pollution intrinsically associated with the urban (or suburban) landscape and transportation system of which they are a part: A kind of tunnel vision that puts you squarely in the twilight zone.

An affogato theory of transportation

Coffee and ice cream and jam (or traffic jams)

Just once, we are going to sugar-coat our commentary.

Affogato (1912Pike.com)

At City Observatory, we know that a lot of what we present is highly technical, especially when it comes to understanding the complex and dynamic problems of transportation. But when it comes to transportation policy, two of the most important lessons you can remember can be summarized in an analogy to a single beverage: the affogato.

The affogato, as you know, consists of two parts:  gelato and espresso.  You pour a freshly brewed espresso over a healthy dollop of gelato, and there you go.  To us, each of the ingredients of the affogato ties directly to a simple story about how transportation systems work.

Gelato:  The Ben and Jerry’s Theory of Traffic Congestion

The first part of our affogato theory is the gelato, or to American tastes, ice cream.  Here we have what we call our “Ben and Jerry’s Theory of Traffic Congestion,” which is inspired by this national ice cream chain’s annual practice of “Free Cone Day.” One day a year, the company gives away ice cream for free.  When they do, people are lined up around the block at Ben and Jerry’s stores.

Long queues for “free” ice cream cones are exactly the same as traffic congestion. The reason people are all waiting in line is that what they’re about to consume is way, way under-priced. Peak hour road delays occur because we don’t send price signals to consumers that it’s really expensive to provide road capacity at rush hour. And the Ben and Jerry’s model also tells us why state and city transportation agencies are chronically short of money: if you don’t charge people for your product, especially your most expensive product, you’re going to go broke pretty quickly. Free cone day only works because Ben and Jerry’s holds it one day a year:  your local highway department is trying to run free cone day every single day, with predictable results for traffic and their budget.

Espresso:  The Cappuccino Congestion Index

The second part of the affogato is the espresso, which in this case symbolizes our Cappuccino Congestion Index. Let me explain. Periodically, you’ll read scary sounding estimates of the economic costs of traffic congestion. Typically, they’re arrived at by computing how much longer a rush hour trip takes than one taken at say, 2 am, and then multiplying additional minutes of travel by some high value of travel time, and voila, you’re into the billions pretty quickly.

The troubles with this exercise, as we’ve pointed out before, are manifold, and we could point you to our 40 page critique of this research. But it’s simpler, and more memorable, we think, to point to the second part of our affogato analogy:  The Cappuccino Congestion Index.

If you’re like most Americans, you periodically, and often daily, patronize your local coffee shop for an espresso, a latte, or a cappuccino.  If you show up at, say, 8:30 am or 10 am, it’s a virtual certainty you’re going to have to wait in line behind others who are also looking for their morning coffee fix. Using the same techniques that underlie the traffic congestion cost estimates, we were able to compute how much this cappuccino delay costs American coffee drinkers–it, too, runs into the billions of dollars per year.

But just as with traffic congestion, there’s no feasible way that Starbucks (and your local coffee shops) will hire enough baristas and buy enough coffee machines and rent large enough storefronts that you’ll have a zero wait time during peak coffee consuming hours. And as a consumer you know that if you want your coffee at the same time as everyone else, you can expect to wait a couple of minutes.

It’s possible to compute how many minutes of time we spend waiting in lines (whether in our cars or not), and multiply that by a value of time and get very large numbers. But that doesn’t mean that there’s any feasible way to build enough capacity that we never encounter delays.
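Here’s what that back-of-the-envelope arithmetic looks like, whether the queue is for freeway lanes or lattes. This is a minimal sketch; every input below is a hypothetical placeholder, not a measured value:

```python
# Back-of-the-envelope "congestion cost" arithmetic used by both the
# traffic studies and our tongue-in-cheek Cappuccino Congestion Index.
# All inputs are hypothetical placeholders, not measured values.

def annual_delay_cost(daily_trips, delay_minutes, value_of_time_per_hour,
                      days_per_year=250):
    """Total annual 'cost' of waiting: trips x delay x value of time."""
    hours_lost_per_day = daily_trips * delay_minutes / 60
    return hours_lost_per_day * value_of_time_per_hour * days_per_year

# Hypothetical: 50 million daily coffee purchases, a 3-minute average wait,
# time valued at $15/hour, 250 "commute" days a year.
cost = annual_delay_cost(50_000_000, 3, 15.0)
print(f"${cost / 1e9:.1f} billion per year")  # → $9.4 billion per year
```

The point of the exercise is that any ubiquitous, short delay multiplied across millions of people and a plausible value of time lands in the billions, regardless of whether eliminating the delay is feasible or worthwhile.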

Transportation is a complex topic. But a couple of simple parables, one about free ice cream and the other about the every morning line at the coffee shop, tell us much of what we need to know about the economics of traffic congestion. Think about that the next time you order an affogato; you’ll probably have to stand in line for a while, but it’s worth the wait.

 

 

Inequality in three charts: Piketty, the picket fence and Branko’s elephant

Rising inequality in the US isn’t new; declining inequality globally is.

Scratch just beneath the surface of many daily problems, and you’ll find income inequality is a contributing factor, if not the chief culprit. Whether it’s concentrated poverty, soaring housing costs, or disparities in educational attainment and public services, or the nation’s political divide, it all seems related to growing inequality.

US Inequality: The picture is getting sharper, but the trend has been evident for 25 years

A new chart published earlier this month in the New York Times brings the magnitude of the inequality problem into sharp focus. Based on work by Thomas Piketty and his colleagues, it shows how much incomes have changed at every point in the income distribution. As the chart makes plain, income gains in the US have been highly concentrated in the top 1 percent of the population (and within that group, within the top 0.001 percent). The chart is from an excellent analysis published by Vox, which explains Piketty’s research in more detail. What the chart shows is that for those in the bottom quintile of the income distribution (the lowest-earning 20 percent), gains in real income, prior to taxes, were negative between 1980 and 2014; only the net effect of tax changes brought their income change up to zero.

But while sharply detailed and shocking, this isn’t really new. A quarter of a century ago (1992), the then relatively obscure though outspoken academic economist Paul Krugman wrote an article (“The rich, the right, and the facts”) for The American Prospect contrasting what he called “the picket fence and the staircase.” Though much cruder than the measures generated by Piketty, Krugman’s computations show the same trend. During the growth period from 1945 to 1973, the growth in US incomes resembled a picket fence (each quintile of income seeing roughly the same percentage increase); after 1980, the growth in incomes resembled a steep staircase (with income growing fastest for those with the highest incomes).

Paul Krugman’s Picket Fence & Staircase

It’s easy to imagine that the same processes are at work globally. It’s still the case that global income is highly unequally distributed, and those in Europe and North America have a disproportionate share of world income. But while still highly unequal, the global distribution of income has become much less skewed in the past couple of decades. And to set alongside Piketty’s curve and Krugman’s picket fence and staircase, we have Branko’s elephant. Drawn by economist Branko Milanovic, this chart depicts the global distribution of income in exactly the way that the other charts depict US income gains. The horizontal axis corresponds to average income and the vertical axis measures the percentage increase in income between 1988 and 2008.

The elephant shape of the curve shows low gains among the very poorest of the global poor (at the far left), sizable gains for those in the 10th through 70th percentiles of the global distribution of income, and a sharp falling off of gains for those in the 75th through 90th percentiles, with actual declines in income for some near the 80th percentile. The elephant’s soaring trunk (C) represents the income gains of the top 1% and 5% of the global distribution.

The global richest are getting richer, but so are the bottom two-thirds of the world’s population

Branko’s elephant captures the startling improvements in measured income led by growth in Asia, and particularly the development of India and China. Those in the middle of the global income distribution actually recorded the fastest income gains. The residents of these countries have seen their real incomes increase substantially in the past two decades. And as Milanovic points out, this development has produced the first decline in global income inequality since the industrial revolution began. The gains at the very top (the tip of the trunk) reflect the superstar rich throughout the world. And the falloff around the 85th percentile (the drooping bottom of the trunk at B) reflects the plight of those in advanced economies who haven’t benefited from globalization and technological change. These include the bottom half of the income distribution in many “developed” economies, and many modestly educated workers in the US.

So when it comes to inequality, there’s bad news and good news here. The concentration of income among the top 1 percent is real and growing, both in the US and globally. But it’s also the case that a huge fraction of the world’s population enjoys a higher income level today than two decades ago, and as a result, the global distribution of income is less skewed (at least for the bottom 95 percent of the population). And it’s important to note that the inequality problem isn’t playing out the same way everywhere: some nations have experienced much less of an increase in inequality than the US. Differences in national policies and institutions play a critical role in determining income distributions.

 

What Dallas, Houston, Louisville & Rochester can teach us about widening freeways: Don’t!

Portland is thinking about widening freeways; other cities show that doesn’t work

Once upon a time, Portland held itself out as a national example of how to build cities that didn’t revolve (so much) around the private automobile. Back in the 1960s and 1970s, it recognized that building more freeways just generated more traffic, and it tore out one downtown freeway, and cancelled another, and instead took the bold step of investing in transit and encouraging greater urban density.

But now the region is confronted with proposals to spend upwards of a billion dollars on three freeway widening projects. The idea that widening freeways will reduce congestion has been thoroughly debunked. Economists now talk about the “Fundamental Law of Road Congestion“–each incremental increase in highway capacity generates a proportionate increase in traffic, with the effect that congestion quickly rebounds to previous levels–accompanied by more sprawl, longer trips and increased pollution. As it contemplates spending upwards of a billion dollars on three proposed freeway-widening projects, Portland might want to spend a little time looking at what’s been learned in other cities around the country.  The experiences of four cities confirm the lessons that Portland thought it learned four decades ago.

Houston

Add as many lanes as you like, you’ll just get more traffic and congestion

Adding lanes doesn’t end congestion. (Houston Chronicle)

America’s largest freeway is Houston’s 23-lane Katy Freeway. It’s been widened many times, always, ostensibly, with the idea of eliminating congestion. But no matter how wide it gets, added capacity just induces further-flung development and more peak hour driving, with the result that the freeway is even slower today than it was before it was widened just a few years ago. Texas spent $2.3 billion to widen the road, but just three years after it opened, the morning commute had increased by 25 minutes (or 30 percent) and the afternoon commute had increased by 23 minutes (or 55 percent). It’s just one of many examples of how expanding freeway capacity to fight congestion is simply futile.

Dallas

Even in the Lone Star State, they’re willing to cancel big road projects

In Dallas: A park instead of a highway.

For decades, Portland has prided itself on its 1970s-era decision to tear out one freeway (Harbor Drive) and to forego building another (the Mount Hood Freeway). Meanwhile, in much of the Sunbelt, cities like Houston built more and wider freeways. But even in Texas, the tide is turning. Just this month, the City of Dallas junked decades-old plans to build a six-lane tollway to relieve downtown traffic congestion. Called the Trinity Parkway, the billion-dollar road would have been built in the floodway of the long-neglected Trinity River that flows in and near downtown Dallas. For years, the project moved forward with a steady and familiar refrain:

Supporters of the road have long said it is crucial to relieving daily congestion on the knot of highways surrounding downtown.

But earlier this month, the Dallas City Council voted 13-2 to cancel the tollway.  Instead, the Trinity River floodplain will be developed as a park. Kinda like what Portland did with its waterfront four decades ago.

Louisville

If you widen first, and toll later, you’ll waste millions or billions

One aspect of Louisville, Kentucky’s transportation system looks a lot like Portland’s. Louisville lies just south of the Ohio River, and every day, tens of thousands of suburban Hoosiers use the interstate freeway to commute to jobs in Louisville, mostly on the I-65 bridge. (In Portland, it’s tens of thousands of Washingtonians crossing the Columbia River, principally on Interstate 5, to jobs in Oregon.) Until a couple of years ago, the I-65 river crossing, like I-5, consisted of six travel lanes. Six months ago, Kentucky and Indiana completed a billion-dollar freeway widening project that expands I-65 to twelve lanes (by twinning the existing Ohio River bridge). To help pay for the new bridge, the states started charging a toll that averages about $2 (with big discounts for regular commuters). The result: despite the doubling of capacity, the number of people using the I-65 crossing has fallen by almost half. Now the new super-sized river crossing is grossly under-used, even at rush hour.

This is rush hour on I-65 in downtown Louisville, with tolls (and a billion dollars of unneeded freeway).

If Louisville had tolled the river crossing before committing to constructing additional capacity, it would have realized it didn’t need anything like 12 lanes over the Ohio River–the existing bridges would have sufficed.

In Oregon’s case, the Legislature has directed the Oregon Department of Transportation to get federal permission to toll Interstate 5 and a parallel route (I-205). Given Kentucky and Indiana’s experience, it would be wise to implement tolls first, before making any additions to existing freeway capacity. The overwhelming evidence is that tolling could reduce, delay or even eliminate the need for costly freeway widening.

Rochester

Tearing out a freeway makes a better city.

Going, going . . . (Stantec, via CNU)

Rochester, New York is in the process of removing and filling in a depressed (and depressing) urban freeway it built in the 1960s.  Removing the “Inner Loop” freeway is reconnecting downtown neighborhoods, and helping stimulate development.  The city has just approved a new mixed use development on former freeway land that includes 120 units of housing. More housing and fewer roads are the cornerstones of revitalizing the city’s downtown, according to the Congress for the New Urbanism.

Lessons learned?

Looking at the experience of other cities should tell Portland’s leaders that freeway widening projects like the proposed Rose Quarter expansion are ineffective, costly, unnecessary, and out of date. Houston’s experience shows that adding more lanes doesn’t reduce congestion; it just induces more traffic. Louisville shows that if you’re going to toll freeways, you can expect a big drop in traffic that will likely obviate any perceived need for more lanes. And Dallas shows that even in traditionally auto-dominated cities, it’s possible to simply walk away from outdated freeway expansion plans. For those who are really serious about reclaiming valuable urban space for people, it makes sense to tear out freeways, as Rochester is currently doing, rather than building more. Portland was once a leader in rethinking how to reduce auto-dependence; today, there are valuable lessons it can learn from other cities.

Uber’s Movement: A peek at ride-hailing data

Uber’s lifting the veil–just a little–to provide data on urban transportation performance

Uber’s new Movement tool provides a valuable new source of data about travel times in urban environments. We’ve gotten an early look at Movement, and think it’s something you’ll want to investigate if you’re interested in urban transportation.

Uber likes to bill itself as a technology company, rather than a transportation company: technically, it’s the independent driver-owners of vehicles that provide the transportation service; Uber uses an array of information technology to arrange, monitor, finance, and evaluate the transaction.  In the process, Uber generates a huge amount of data about the trips that people take and the level and speed of traffic in cities. Access to this ride data has been hotly debated for a number of reasons. Customers, rightly, are interested in protecting their privacy. Ride-hailing companies naturally are seeking to keep this valuable market information from their competitors.

Ride-hailing companies have also been reluctant to share this data with public authorities. New York has managed to force disclosure of some information (which served as the basis of Bruce Schaller’s report showing that ride-hailing is having a material impact on New York travel speeds). San Francisco, working with IT experts from Northeastern University, figured out how to scrape information about ride-hailing trips within the city from the company’s public-facing web sites. Now Uber has stepped forward and started making at least some of its data directly available to everyone.

Movement: a portal to Uber’s travel time data

Uber’s made its new Movement data analysis tool open to the public this week.  Initially it’s just providing data for a handful of cities including Boston, Washington, Manila and Singapore, but the company promises to add more cities as time goes by.

The Movement interface is straightforward and simple to use.  Its greatest utility is the ability to easily generate data on actual travel times for a given route over a number of different dates. This kind of simple time-series analysis tool can help identify where travel times are increasing or decreasing compared to some base period.  This can be extremely useful for diagnosing the effect of transportation investments or observing the effects of system disruptions (like the Atlanta Freeway collapse).

An Example: How has a typical Washington DC commute changed in the past year?

Suppose you live in Bethesda, Maryland, and commute by car to the Brookings Institution near Dupont Circle in Washington. How has your commute changed in the past year? We used the Movement tool to select an address in central Bethesda and 1775 Massachusetts Avenue NW as our origin and destination, respectively. We chose two time periods (the first quarter of 2017 and the first quarter of 2016), and restricted our search to weekdays and the AM peak period (from 7 am to 10 am). The results are shown below:

 

On average, this morning commute takes about 31 minutes and 4 seconds, down almost 3 minutes from the time required in the previous year (33 minutes 57 seconds). The map’s color coding shows that most commute destinations from Bethesda are quicker trips (shaded green) than they were in the previous year. Helpfully, the interface also shows the range of travel times for trips taken during these periods; this range reflects the geometric standard deviation about the mean of the travel time data. Morning commutes on this route ranged from 23.5 to 41.0 minutes in the first quarter of 2017, compared to a range of 26.5 to 43.5 minutes in the prior year. So while the mean commute is down nearly three minutes, the range is broadly the same as it was in the prior year.
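One plausible reading of how Movement’s range relates to its reported mean is that the lower and upper bounds are roughly the mean divided and multiplied by a geometric standard deviation; this is our interpretation, not Uber’s documented formula. A quick sketch working backwards from the Q1 2017 numbers:

```python
import math

def range_from_geometric_sd(mean_minutes, gsd):
    """Lower/upper travel-time bounds: mean divided and multiplied by the GSD."""
    return mean_minutes / gsd, mean_minutes * gsd

# Work backwards from Movement's reported Q1 2017 range (23.5-41.0 minutes).
gsd = math.sqrt(41.0 / 23.5)                 # implied geometric standard deviation
geo_mean = math.sqrt(23.5 * 41.0)            # ~31.0 min, close to the reported 31:04
lo, hi = range_from_geometric_sd(geo_mean, gsd)
print(round(lo, 1), round(hi, 1))            # 23.5 41.0: recovers the reported range
```

The geometric mean implied by the reported range comes out to about 31 minutes, which lines up closely with the 31:04 average Movement displays.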

There are some important limitations to this data. The Movement interface reveals trip times only for origin-destination pairs that have a sufficient number of trips (undertaken by Uber drivers) to enable them to calculate average trip times. While this is not a problem in the dense, urban environments which are the richest market for ride-hailing companies, data are sparse in lower density areas, and don’t appear at all for some suburb-to-suburb trips. While this is understandable (Uber can’t generate data for trips that no one buys from it), it’s important to keep this in mind when looking at the data. Fortunately, Uber has disclosed the threshold it uses for presenting data for any set of origin-destination pairs: in general, there have to be at least 5 trips between the origin and destination during the time period examined, and for privacy purposes, the trips have to be made by at least 3 different customers. In addition, Uber filters out origin-destination pairs that have fewer than 30 observations in a given month. (And for those concerned about privacy, the origins and destinations of actual Uber trips aren’t disclosed in the Movement interface, just the estimates of how much time a trip would take based on the average of all trips recorded by Uber along these routes.)
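To make those disclosure thresholds concrete, here’s a small sketch of how they might be applied to a single origin-destination pair. The data layout (rider-id/month tuples) is our own invention for illustration; Uber’s actual pipeline and schema aren’t public.

```python
from collections import Counter

def publishable(trips):
    """Apply Movement-style disclosure thresholds to one origin-destination pair.

    trips: list of (rider_id, month) tuples, one per recorded trip.
    """
    if len(trips) < 5:                              # at least 5 trips overall
        return False
    if len({rider for rider, _ in trips}) < 3:      # at least 3 distinct riders
        return False
    monthly = Counter(month for _, month in trips)  # observations per month
    return all(n >= 30 for n in monthly.values())   # >= 30 in each month

# A pair with 30 trips by 3 riders in one month clears every threshold;
# a pair with only a handful of trips does not.
busy = [(i % 3, "2017-01") for i in range(30)]
sparse = busy[:4]
print(publishable(busy), publishable(sparse))       # True False
```

The effect of filters like these is exactly the urban skew described above: dense markets with lots of trips clear the bar, while suburb-to-suburb pairs drop out.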

As a result of its service patterns and these filtering provisions, Uber’s data has a heavily urban focus. Its data for the Washington DC area covers the entire area within the Beltway. (Areas shaded blue, green and yellow are reported in Movement; areas shaded gray are not.)

It’s also worth remembering that Movement data tell us a lot about traffic speed, but essentially nothing about traffic volumes. Uber vehicles are a sample of vehicles traveling at different times, but Uber lacks data about how many other vehicles are on the road. So we’ll still have to rely mostly on old-school traffic counting technology for vehicle counts.

Keeping it smart: Transparent and consistent

Going forward, we hope Uber extends its Movement tool to all the major markets it serves. It’s a great example of how “big data” can be made easily available to ordinary citizens, and it’s a terrific public service for Uber to share this. That said, we have a couple of pieces of advice for Uber.

First, in order to be useful, especially for time series analysis, the data has to be consistent. For now, the data in Movement goes back to 2015, but not earlier. Future data availability hinges in part on the company’s continued existence, but another risk is that methodology changes and “series breaks” may make it difficult to track change accurately over time. Much as we appreciate Uber’s civic-mindedness in sharing this data, we’re also aware of how vulnerable this makes us. For several years, Inrix, another major provider of real-time travel data similarly derived from vehicle-based GPS measurements, published monthly data on travel times in major US markets. But then, abruptly, in 2014, the company simply discontinued publication of its city-level data. Since then, the company has produced a series of reports decrying the congestion problem, but not presenting data consistent with its earlier methodologies, making it impossible to independently verify its claims. There’s little doubt that the performance data generated by Uber and other ride-hailing companies will be central to public policy debates about transportation and the impact of ride-hailing; we hope they’ll be willing to provide this data on an ongoing basis in an open format, using consistent methodologies.

Second, and relatedly, the definitions and methodologies used to produce the data need to be as transparent as possible, allowing for appropriate concerns about customer privacy and the competitive value of this data. In our beta testing of Movement, Uber did a terrific job of answering our questions. You can download the data from their website for use in other programs, and as noted above, the site reports the range of observed travel times, as well as averages, so that users can get a sense of the variance in travel times as well. All these details make the data more useful for meaningful analysis.

How do I get access to the data?

Access to Uber’s Movement data is available to anyone with a free Uber account. If you already have an account, navigate to movement.uber.com.

Editors note: Uber provided City Observatory with the opportunity to be a beta tester of the Movement data and interface. City Observatory was not compensated for this testing.

 

Housing Policy Lessons from Vienna, Part II

Allowing multi-family housing in all residential zones, and aggressively promoting private bidding lowers housing costs

We’re pleased to welcome a guest commentary from Mike Eliason of Seattle. Mike is a passivhaus designer with Patano Studio who is interested in baugruppen, mass timber, ultra low energy buildings, and social housing.  Vienna is often mentioned as a model for how American cities might do a better job of providing more widespread affordability. While tantalizing, many of the descriptions of the secrets of its reported success are cryptic and incomplete. A couple of weeks back, Mike shared with us a number of interesting insights about what’s behind Vienna’s policy, and at our invitation, he’s presenting them at City Observatory. You can follow Mike and his musings on urbanism on twitter (https://twitter.com/bruteforceblog).

Part II:  Zoning and development in Vienna

Previously, I discussed demographic and funding allocations of social housing in Vienna, compared to Seattle. (Part I of this series is here.) This segment will look at the zoning and development policies, compared to Seattle.

Urban planning in Vienna is incredibly comprehensive, and the city undergoes a rigorous citizen engagement and planning exercise when developing or redeveloping areas. The Bauträgerwettbewerbe (developer competitions) are a critical component of achieving high-quality, dense, and livable housing cost-effectively. Teams compete to develop and receive subsidies for individual projects, and are judged by a diverse panel on the economics of the project, the architecture, the ecology of the building, and the social mix. The city has effectively leveraged its purse to push the price of construction down, making developers compete on the merits and economics. This results in buildings that are incredibly innovative across the board – you can see past winners here, and there’s even a 24-story cross-laminated timber tower underway in Seestadt Aspern.

Seattle’s planning process is less comprehensive. Much is left to the market to decide where things should go. Large developments are the exception, rather than the norm. And while Seattle has attempted to stimulate innovation, those efforts have largely fallen flat. Not only does Seattle thwart innovation, but we also allow homeowners to slow and kill housing units through excessive design review and other forms of predatory delay.

The amount of land zoned exclusively for single family houses in Vienna is zero. Just 9% of the dwelling units in Vienna are single family homes. In Seattle, 44% of dwelling units are single family homes and almost 75% of non-industrial parcels are reserved for this least dense, least sustainable form of housing. We’re constantly digging out of a hole, and until we start thinking more holistically and at a drastically larger scale, we’ll never get out.

2010 Vienna population density map, source: Stadt Wien

Vienna has neighborhoods, but density isn’t limited to just a few urban villages. The density of Vienna is largely centralized and relentless. Seattle’s urban villages held promise, but many are severely gerrymandered, and the zoning within them is largely deferential to single family zoning inside and outside the demarcated areas. Vienna’s zoning is broad and deep – generally 6-8 stories over several blocks – whereas Seattle’s density usually starts well under 8 stories and steps down to single family zoning quickly, making for weak, car-dominated urbanism and high housing prices. Vienna has walksheds that are nearly three-quarters of a mile in new developments. We hope that Seattle gets to a ½-mile walkshed one day.

Additionally, Vienna isn’t afraid to convert former industrial lands to housing, a topic that is taboo in Seattle. Vienna recently redeveloped one of its airports, just minutes from the city, into the Seestadt Aspern district, which will house 20,000 residents and 20,000 jobs on less than 600 acres – plus parks, schools, stores, and more. Likewise, the former railyards in the Nordbahnhof district are also being redeveloped into dense urban housing for 25,000 – with parks, schools, institutions, and jobs. Seattle could be doing this, but tragically, we have more land zoned for industry than we do for multifamily housing. We do redevelop a fair amount of industrial land – but mainly for different types of jobs (e.g., offices instead of manufacturing). And while we’ll likely never turn Boeing Field into a larger version of Seestadt Aspern, we should be looking more critically at how and where we can easily accommodate more housing.

Crosscut’s Joe Copeland wrote in his piece describing Vienna’s housing policies that much of the new social housing in Vienna is low-rise – and while projects in the outer districts do trend toward lower buildings, much of the social and market housing built today would not be considered low-rise by Seattle residents. Most projects are greater than 4 stories, and though they are dense, they provide amenities unheard of in Seattle – especially for non-luxury housing. To get a glimpse of recent housing projects in Vienna, go here.

Vienna’s history of extensive density allows for another innovative form of gentle urban renewal: rooftop additions (Aufstockungen). Rooftop additions are everywhere in Vienna, and in some places the city is allowing market-rate units to be built to offset rents for the remainder of the building. In others, the city itself is adding rooftop units as it rehabs existing buildings. While several hundred are underway each year, it is estimated that 25,000 rooftop housing units could be added atop existing buildings. We would be hard pressed to easily accommodate 25,000 new units *over* existing single family homes in Seattle.

Tenure in Austria is wholly different from the U.S. For one, housing contracts in Austria are primarily indefinite, versus one-year contracts. This reduces the stress of constantly having to find new housing or accept rent increases. As an added security, because Vienna’s social housing is intended to result in economically diverse communities, there is an income limit only upon starting a tenancy, and increased wages do not result in households being pushed into market-rate rentals. Additionally, depending on the type of unit, some can be passed on to family members. This ensures that there are no neighborhoods that are overwhelmingly wealthy or poor, but rather a diverse mix. To this point, some of the most expensive single family houses in Vienna sit directly across the street from Gemeindebauten. Seattle’s rampant exclusionary zoning prevents this sort of mixing from happening. Instead of slowing gentrification, our restrictive zoning accelerates it.

Vienna, like Seattle, is also majority renter. This means the city does not have land use policies that are easily co-opted or dominated by homeowners. The city encourages very broad participation in planning. There are no policies preserving single family homeowners’ views, street parking, and the like. And unlike Seattle, most parks are surrounded by dense, multifamily neighborhoods. Fifty percent of Vienna is green space, and going forward, the city is aiming to keep that ratio despite adding new housing. Such a goal would be extremely difficult in Seattle, given the plethora of land essentially dedicated to sprawl. Vienna’s planning documents are a thing of beauty.

Vienna could be an ideal model for Seattle. It’s dominated by left-leaning politics. It’s majority renter. It builds more social housing than market-rate housing. But our zoning, our lack of vision and leadership, our lack of comprehensive planning, our lack of innovation, and most importantly, our lack of funding make such a model difficult to attain. Vienna is doing almost everything right. Perhaps it is time for Seattle to do the same.

A Nobel Prize with a solution for climate change

Let’s put a price on using the atmosphere as a garbage dump for carbon

Earlier this week, Yale economist William Nordhaus was announced as this year’s co-recipient of the Nobel Prize in Economics (along with Paul Romer, whom we profiled yesterday). Nordhaus is a pioneer in environmental economics, and his research laid the foundation for using a carbon tax to counter climate change. While Romer’s research deals with knowledge as a “public good,” Nordhaus has explored climate as a public good. His views are summarized by global advocates for carbon pricing:

“Climate change is a member of a special kind of economic activity known as global public goods.” To solve this problem, “At a minimum, all countries should agree to penalize carbon and other GHG emissions by the agreed upon minimum price.”

Charging a fee for using the atmosphere as a garbage dump for carbon would create incentives to cut down on damaging emissions, to invest in cleaner sources of energy and transportation, and to more quickly come up with ideas and technology for fighting global warming. While it seems like a heavy lift, a carbon tax would be the most subtle and systematic way to inform the decisions of producers, consumers and investors in a way that would lead to lower carbon emissions.

A simple analogy: a fee for disposable bags

For some time, Chicago has been charging shoppers a 7-cent fee for using disposable grocery bags. Rather than banning the bags outright, the city settled on the fee as a way to preserve consumer choice and yet encourage less use of plastic bags. Those who don’t bring their own bags to the store pay the 7-cent fee, which is itemized on their receipt; grocers keep 2 cents for their trouble, and the nickel per bag goes to the city. (Other places have enacted similar fees: in the UK, shoppers pay 5 pence for plastic bags.)

If we’re willing to charge 7 cents for this, why not 2 cents per pound of carbon?

Initial reports offer a good news/bad news story about the impact of the fee. It’s generating substantially less revenue than the city had hoped, but that’s because, with just this small financial incentive, shoppers have quickly changed their behavior. Consumers are bringing their own reusable bags to the store, and plastic bag consumption is down by more than 40 percent. The Chicago Sun-Times quotes one of the researchers studying the impact of the bag fee on consumers:

They recognize “that this bag is something that was free, and now it’s not,” Palmer said. “Every time customers go to a grocery store, they see that 7-cents-a-bag tax on their receipt.”

The relative ease and simplicity of the bag fee got us thinking about how we might apply the same idea to another, somewhat more serious environmental problem: climate change. What would happen if we asked consumers to pay, say, 2 cents for every pound of carbon they emitted into the atmosphere? If consumers got some small signal that dumping carbon into the atmosphere wasn’t “free,” then they’d have a strong incentive to change their behavior.

Of course, this actually isn’t a new idea. But how we package it is important. Carbon tax advocates have always talked about pricing in “dollars per ton” but that puts it a little bit out of the reach of daily life and the average consumer. Talking about pounds of carbon makes it a little bit more comprehensible, and puts it in the same context as the plastic bag fee. Is it unreasonable to ask everyone to pay, for example,  just two cents for every pound of carbon they emit?

And two cents is pretty darn close to the correct number. While there are various recommendations for the appropriate level for a carbon tax, currently a number of experts are suggesting something like a tax of $40 to $80 per ton.  Divide that number by roughly 2,000 (we’ll just ignore whether the experts want that tax for a metric or an imperial ton) and a $40 per ton tax on carbon works out to about a 2 cents per pound tax.

Just to put that in perspective with our shopping bag, recognize that a typical polyethylene shopping bag weighs about five or six grams. So Chicago is charging consumers about 1 cent per gram for their shopping bag.  That’s roughly 200 times more than a 2 cent per pound tax on carbon (about 450 grams in a pound, so our 2 cents per pound tax works out to less than .005 cents per gram).

What does that mean in practice? Consider our most common form of carbon emissions: driving a gas-powered car.  If your car gets 20 miles per gallon, it produces about one pound of carbon per mile. There are slightly more than twenty pounds of carbon generated by burning a gallon of gas, so a five-mile round-trip to the store would generate about five pounds of carbon, which would cost you a dime with our proposed  2 cents per pound carbon fee. So in this scenario, if you’re buying two bags of groceries, your bag fee (in Chicago) would be 14 cents and your carbon emission fee would be 10 cents. Although if instead of driving your car, you rode your bike, and brought your own bags, you could save almost a quarter.

As a result, the fee we’re talking about to save the planet is not out of line with what we’re perfectly willing to ask consumers to pay to discourage the visible, but largely nuisance effects, of plastic bags.

A small fee, say 2 cents a pound on carbon would send consumers small, but pervasive signals about the effects of their buying choices and travel behavior on the environment. Sometimes–just as when you forget to bring your own bag, you might be willing to spend the 7 cents to have the convenience of a plastic bag, you pay for the privilege (and in the case of a carbon fee, generate revenue that could be used to reduce our dependence on fossil fuel, and offset the regressive effects of the tax). But overall, the fee would bias consumer (and investor) decisions in favor of all kinds of things that resulted in lower carbon emissions. It would make solar energy, and electric cars, and walkable urban places more economical, and make fossil fuel, gas-powered cars, and sprawl even less attractive than they are today. It would automatically reward businesses, inventors, and investors who came up with lower carbon ways to get all of the goods and services we value. It would gradually, but powerfully push us in the direction of lower carbon emissions and greater sustainability.

Shopping bags are a visible, annoying form of pollution. The are a regular feature of litter almost everywhere in the world. And while they’re a blight, and an unnecessary one, the fact is we’re willing (at least in Chicago) to make consumers pay a fee that reflects the environmental damage they cause, and to give a gentle nudge to their behavior in a direction that is better for the environment. And it’s working–plastic bag use in Chicago has dropped 42 percent already.

So why aren’t we willing to do the same with carbon? Perhaps its as simple as this: Carbon dioxide (the most common form of carbon pollution) is invisible.  We can’t see it.  If you’re car exuded fist-sized lumps of carbon at the rate of one per mile and they cluttered the roadway, we’d probably acknowledge the problem and agree to do this almost instantly. But the carbon evanesces into an invisible–and global–atmosphere, slowly, but surely raising global carbon levels and steadily raising the planet’s temperature. Plastic shopping bags aren’t an existential threat to the planet, so why are we willing to charge consumers 200 times as much (per pound) for these bags as we would charge for carbon emissions?

Is a couple of pennies a pound for carbon pollution too much to ask? The work of Nobel Laureaute William Nordhaus suggests that it would be the cheapest and most effective way to make a difference on climate change.

 

 

 

Climate Change: A 2-cent solution

Let’s put a price on using the atmosphere as a garbage dump for carbon

It works for plastic bags; let’s use the same idea for carbon

Consider the plastic bag: It’s a highly visible environmental problem, one that we all encounter. Around the world, retailers routinely provide shoppers with “free” plastic bags to carry home their purchases. The bags show up as refuse in municipal disposal systems (when they’re properly binned), are a major litter nuisance, and turn out to be a threat to wildlife.

For economists, the solution to the plastic bag problem is straightforward: Charge consumers for taking them. And in Britain, where this pricing policy has been in place for some time, the results are compelling. Large retailers in the UK are required to charge 5p (about 7 cents) for plastic bags. The fee has dramatically reduced plastic bag consumption, as the BBC reports:

Asda, Marks and Spencer, Morrisons, Sainsbury’s, the Co-op, Tesco and Waitrose sold 549 million single-use plastic bags in 2018-19, down from one billion in the previous year.

Since 2015, when a 5p charge was introduced to tackle plastic pollution, the number being used is down by 90%.

Customers now buy, on average, 10 bags a year compared to 140 bags in 2014.

Similarly, Chicago has been charging shoppers a 7-cent fee for disposable grocery bags. Rather than banning the bags outright, the city settled on the fee as a way to preserve consumer choice while encouraging less use of plastic bags. Those who don’t bring their own bags to the store pay the 7-cent fee, which is itemized on their receipt; the grocer keeps 2 cents for its trouble, and the nickel per bag goes to the city.

Initial reports offer a good news/bad news story about the impact of Chicago’s fee. It’s generating substantially less revenue than the city had hoped, but that’s because, with just this small financial incentive, shoppers have quickly changed their behavior. Consumers are bringing their own reusable bags to the store, and plastic bag consumption is down by more than 40 percent. The Chicago Sun-Times quotes one of the researchers studying the fee’s impact on consumers:

They recognize “that this bag is something that was free, and now it’s not,” Palmer said. “Every time customers go to a grocery store, they see that 7-cents-a-bag tax on their receipt.”

If we’re willing to charge 7 cents for this, why not 2 cents per pound of carbon?

 

What if we did the same thing for carbon pollution?

The relative ease and simplicity of the bag fee got us thinking about how we might apply the same idea to another, somewhat more serious environmental problem: climate change. What would happen if we asked consumers to pay, say, 2 cents for every pound of carbon that they emitted into the atmosphere? If consumers got even a small signal that dumping carbon into the atmosphere wasn’t “free,” they’d have a strong incentive to change their behavior.

Of course, this actually isn’t a new idea. The world’s leading economists of every political stripe are in broad agreement that a carbon tax is a foundation for any effective climate change policy. But how we package it is important. Carbon tax advocates have always talked about pricing in “dollars per ton,” but that puts it a bit outside the reach of daily life and the average consumer. Talking about pounds of carbon makes it more comprehensible, and puts it in the same context as the plastic bag fee. Is it unreasonable to ask everyone to pay, for example, just two cents for every pound of carbon they emit?

And two cents is pretty darn close to the correct number. While there are various recommendations for the appropriate level for a carbon tax, a number of experts currently suggest a tax of something like $40 to $80 per ton. Divide that number by roughly 2,000 (we’ll just ignore the difference between a metric tonne and a short ton) and a $40 per ton tax on carbon works out to about a 2-cents-per-pound tax.
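The conversion is easy to check. Here is a minimal sketch in Python (the helper function is hypothetical; the $40-to-$80 range comes from the expert proposals mentioned above, and we use the 2,000-pound short ton):

```python
# Convert a carbon tax quoted in dollars per (short) ton to cents per pound.
CENTS_PER_DOLLAR = 100
POUNDS_PER_SHORT_TON = 2_000

def tax_cents_per_pound(dollars_per_ton: float) -> float:
    """Hypothetical helper: per-ton tax in dollars -> per-pound tax in cents."""
    return dollars_per_ton * CENTS_PER_DOLLAR / POUNDS_PER_SHORT_TON

print(tax_cents_per_pound(40))  # 2.0 -- the low end of the expert range
print(tax_cents_per_pound(80))  # 4.0 -- the high end
```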

Just to put that in perspective with our shopping bag, recognize that a typical polyethylene shopping bag weighs about five or six grams. So Chicago is charging consumers about 1 cent per gram for their shopping bag. That’s roughly 200 times more than a 2-cents-per-pound tax on carbon (there are about 450 grams in a pound, so our 2-cents-per-pound tax works out to less than 0.005 cents per gram).
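That comparison can be verified directly. A sketch, using the text’s own rounding of 7 cents for a five-to-six-gram bag to about 1 cent per gram:

```python
# Compare the per-gram cost of Chicago's bag fee with a 2-cents-per-pound carbon tax.
bag_cents_per_gram = 1.0                      # ~7 cents / ~6 grams, rounded as in the text
GRAMS_PER_POUND = 454                         # the text's approximation ("about 450")
carbon_cents_per_gram = 2 / GRAMS_PER_POUND   # about 0.0044 cents per gram

ratio = bag_cents_per_gram / carbon_cents_per_gram
print(round(ratio))  # 227 -- "roughly 200 times" more per gram
```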

What does that mean in practice? Consider our most common source of carbon emissions: driving a gas-powered car. Burning a gallon of gas generates slightly more than twenty pounds of carbon dioxide, so if your car gets 20 miles per gallon, it produces about one pound per mile. A five-mile round trip to the store would generate about five pounds, which would cost you a dime under our proposed 2-cents-per-pound carbon fee. So in this scenario, if you’re buying two bags of groceries, your bag fee (in Chicago) would be 14 cents and your carbon emission fee would be 10 cents. If, instead of driving, you rode your bike and brought your own bags, you could save almost a quarter.
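The grocery-run arithmetic, spelled out as a sketch using the round numbers from the paragraph above:

```python
# Cost of a five-mile grocery run under Chicago's bag fee plus a 2-cents-per-pound carbon fee.
MPG = 20                      # miles per gallon
CO2_LBS_PER_GALLON = 20       # "slightly more than twenty pounds" per gallon, rounded down
CARBON_FEE_CENTS_PER_LB = 2
BAG_FEE_CENTS = 7

trip_miles = 5
carbon_lbs = trip_miles / MPG * CO2_LBS_PER_GALLON    # 5.0 pounds of CO2
carbon_fee = carbon_lbs * CARBON_FEE_CENTS_PER_LB     # 10.0 cents
bag_fee = 2 * BAG_FEE_CENTS                           # 14 cents for two bags

print(carbon_fee + bag_fee)  # 24.0 cents -- bike with your own bags and save almost a quarter
```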

As a result, the fee we’re talking about to save the planet is not out of line with what we’re perfectly willing to ask consumers to pay to discourage the visible but largely nuisance-level effects of plastic bags.

A small fee, say 2 cents a pound on carbon, would send consumers small but pervasive signals about the effects of their buying choices and travel behavior on the environment. Sometimes, just as when you forget to bring your own bag and spend 7 cents for the convenience of a plastic one, you’d simply pay for the privilege (and, in the case of a carbon fee, generate revenue that could be used to reduce our dependence on fossil fuels and offset the regressive effects of the tax). But overall, the fee would bias consumer (and investor) decisions in favor of all kinds of things that result in lower carbon emissions. It would make solar energy, electric cars, and walkable urban places more economical, and make fossil fuels, gas-powered cars, and sprawl even less attractive than they are today. It would automatically reward businesses, inventors, and investors who came up with lower-carbon ways to deliver all of the goods and services we value. It would gradually, but powerfully, push us toward lower carbon emissions and greater sustainability.

Shopping bags are a visible, annoying form of pollution. They are a regular feature of litter almost everywhere in the world. And while they’re a blight, and an unnecessary one, the fact is we’re willing (at least in Chicago) to make consumers pay a fee that reflects the environmental damage they cause, and to give a gentle nudge to their behavior in a direction that is better for the environment. And it’s working: plastic bag use in Chicago dropped 42 percent in six months, and Britain reduced plastic bag use by nearly 90 percent over five years.

So why aren’t we willing to do the same with carbon? Perhaps it’s as simple as this: carbon dioxide (the most common form of carbon pollution) is invisible. We can’t see it. If your car exuded fist-sized lumps of carbon at the rate of one per mile and they cluttered the roadway, we’d probably acknowledge the problem and agree to act almost instantly. But the carbon evanesces into an invisible (and global) atmosphere, slowly but surely raising global carbon levels and steadily raising the planet’s temperature. Plastic shopping bags aren’t an existential threat to the planet, so why are we willing to charge consumers 200 times as much (per pound) for these bags as we would charge for carbon emissions?

Is a couple of pennies a pound for carbon pollution too much to ask? The work of Nobel laureate William Nordhaus suggests that it would be the cheapest and most effective way to make a difference on climate change.

 

 

 


Note:  Reader @mateodechicago points out that Chicago’s bag fee applies to paper as well as plastic bags.

 

 

Pity the poor Super Commuter

About 2 percent of all car commuters travel 90 minutes to work, same as a decade ago.

We’ve always been clear about our views on mega commuters, those traveling an hour and a half or more to work daily. As we said last year, mega commuting is a non-big, non-growing non-problem. But the loneliness of the long distance commuter is an evergreen topic for journalists: these poor victims of modern life, consigned to spend long periods of time in their cars, isolated from family and friends.

The latest installment in this long-running saga comes courtesy of Curbed. In an article entitled “Supercommuters, skyrocketing commutes, and America’s affordable housing crisis,” Patrick Sisson marshals various scary-sounding data points about the increase in the number of people traveling more than 90 minutes to work each day. (Never mind that the increases reported since 2010 largely reflect the effects of a growing economy; the number of people commuting increased by 11 million between 2010 and 2015.) There are obligatory quotes from the Texas Transportation Institute’s researchers, who are always good for a statistic-laced lamentation of traffic problems (never mind that their numbers are wildly exaggerated). Sisson’s story draws heavily on a recent data analysis by the Pew Charitable Trusts’ Stateline in a story titled “In Most States, a Spike in ‘Super Commuting.’”

Don’t get us wrong: there’s nothing pleasant about long commutes. Quite the opposite, we (and most people) recognize that they’re toxic. But let’s put a few facts in order.

Very few people have long commutes. Among car commuters in the US, only about 2 percent have a commute of ninety minutes or more. Ninety-eight percent of us manage to get to work in less time.

Contrary to what you may have heard, long duration commutes are not growing as a share of all commuting. We’ve dug up 15 years worth of Census surveys on car commuting behavior from the data compiled by the Integrated Public Use Microdata Series (IPUMS). These data let us slice and dice commuting patterns by mode and by year. Here’s a chart showing the share of automobile commuters who reported traveling 90 minutes or more in each year between 2001 and 2015 (the latest year for the American Community Survey).

There was a brief period of growth in long commutes, from about 1.6 percent in 2001-02 to about 2.0 percent in 2006, then a slight decline, and then an uptick back to 2.1 percent in 2014. The record of the last decade or so hardly qualifies as “skyrocketing.” It’s pretty much flat.

If we are concerned about super commuters, there’s a good argument to be made that our attention ought to be directed to those folks who travel via public transit. The American Community Survey reports on usual mode of travel to work, so we’ve estimated separately the number (and share) of super commuters among those who travel to work by bus, streetcar, subway or railroad. While just 2 percent of car commuters endure long commutes, more than 10 percent of transit commuters do. Statistically, transit riders are five times as likely to have super commutes as car drivers. So, if we view this “problem” from a transportation perspective, maybe we should be talking about improving the speed and coverage of transit systems.

But it’s a fair question to ask whether those with long commutes are really the helpless victims they’re portrayed to be in these stories. We know from abundant anecdotes–usually related in the very articles about the plight of the super commuter–that they’ve chosen to live in a neighborhood far removed from their place of work because it offered them a bigger house, a larger yard, better schools, or simply lower prices than housing that is less than ninety minutes from work. In a very real sense, those who choose longer commutes are rewarded in terms of these amenities. A long commute is essentially “sweat equity”: buying more real estate not with cash from a paycheck, but by putting in more time behind the wheel to get what you want. Sisson acknowledges as much in his article, quoting the Texas Transportation Institute’s Phil Lasley:

Homeowners are now being priced out of many U.S. markets, he says, and are willing to sacrifice transportation time for the neighborhood or lifestyle they want. When it comes down to making a decision, most sacrifice a shorter drive for a lawn, marble countertops, and a good school district.

The recent blip in super commuting in 2015 tends to confirm this sweat equity story. In mid-2014, gas prices fell by about 40 percent, effectively lowering the cost of commuting. The number of car super commuters jumped 8.9 percent in 2015 over 2014, compared to just a 4.1 percent increase the previous year.

In the end, the super commuting story tells us more about housing and personal preferences than it does about transportation and public policy. Some people, maybe as many as 2 percent of us, will want to live some place that requires a 90 minute commute to a job (at least for a while in our lifetime). For those people who truly can’t afford any housing closer to their jobs (as opposed to those who are willing to trade off a longer commute for better amenities), it’s a message that we need to build a range of housing types in all parts of a metro area, and be sure that housing supply expands in line with housing demand.

 

 

Prices Matter: Parking and Ride Hailing

Pricing parking drives demand for ride hailing services

Ride-hailing companies like Uber and Lyft have been highly reluctant to share data about their services with cities. In California, the state Public Utilities Commission has pre-empted municipal access to ride-hail data (and isn’t sharing it with anyone). As Bruce Schaller’s recent study of New York (one place where the city government has compelled access to ride data) shows, the growth of ride-hailing is having a material impact on traffic congestion.

San Francisco’s County Transportation Authority (SFCTA) figured out a clever work-around for accessing ride hailing data. The principal operators like Uber and Lyft rely on a public-facing Application Programming Interface (API) that tracks the location of vehicles and their availability. Researchers at Northeastern University scraped this data in real time for six weeks for trips beginning and ending in San Francisco, and used it to create a database of ride-hailed trips in the city.

Combining it with other city data on traffic volumes, SFCTA was able to compute what share of all trips in various parts of the city were taken using hailed rides at various times of day. Overall, the study showed that there were nearly 170,000 ride-hailed trips with both ends in the city on a typical weekday last fall. (Ride hailing varies by weekday and peaks on Friday and Saturday evenings.)

While ride hailing services covered the entire city of San Francisco–and, it’s fair to say, provided more rides in outlying parts of the city than did conventional taxis–ride hailing trips were highly concentrated in the city’s densest urban neighborhoods. In the downtown area, ride-hailed trips accounted for as much as 20 percent of all traffic.

In past months, we’ve looked at the penetration of ride hailing services in different markets (using data developed by the Brookings Institution) and we’ve correlated that with a rough proxy of metro area parking prices. We found that ride hailing had the highest market penetration in those metro areas with the highest parking prices. We think that makes a lot of sense: hailing a Lyft or an Uber can save you time and money if parking near your origin or destination is hard to find or expensive. But if parking is free or abundant, it may be cheaper, easier and faster to drive your own vehicle.

The new San Francisco ride-hailing data give us a more refined way of looking at the parking price-ride hailing connection. Instead of looking at aggregate metropolitan data, we can now look at neighborhood level data and see how variations in parking prices among neighborhoods correlate with ride-hailing use.

Our data on parking rates come from the website Parkopedia, which tracks the location, number of parking spaces, and advertised hourly and daily rates for off-street parking across the nation. We aggregated Parkopedia’s data on San Francisco parking lots and garages by zip code, and computed the median price of an hour of parking. Parking ranges from around $15 per hour in the densest zip codes downtown, to $2 or less per hour in the city’s least dense residential neighborhoods. Several zip codes had no entries in the Parkopedia database, suggesting that there are few, if any, paid off-street parking lots in those neighborhoods.
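The aggregation step we describe is simple enough to sketch. Here is a minimal Python illustration, with made-up lot records standing in for the Parkopedia listings (the zip codes and rates below are purely illustrative):

```python
from collections import defaultdict
from statistics import median

# (zip_code, advertised_hourly_rate) pairs -- hypothetical stand-ins for Parkopedia records
lots = [
    ("94105", 15.0), ("94105", 14.0), ("94105", 16.0),   # dense downtown zip
    ("94122", 2.0), ("94122", 1.5),                      # outlying residential zip
]

# Group the per-lot hourly rates by zip code...
rates_by_zip = defaultdict(list)
for zip_code, rate in lots:
    rates_by_zip[zip_code].append(rate)

# ...then take the median rate within each zip code.
median_by_zip = {z: median(r) for z, r in rates_by_zip.items()}
print(median_by_zip)  # {'94105': 15.0, '94122': 1.75}
```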

While we haven’t done a statistical correlation (the SFCTA data is aggregated by traffic analysis zones and the parking data is by zip code) a quick visual comparison shows that the highest levels of ride-hailing activity are in the same parts of the city that have the highest parking prices.

This is more evidence suggesting that pricing plays an important role in shaping transportation behavior. The biggest market potential for ride-hailing services is where there’s a density of prospective customers who face relatively high prices for storing their vehicles. While that (coupled with surge pricing) makes downtown streets at peak hours a lucrative place for ride-hailing vehicles to ply their trade, it also means that they are contributing to traffic congestion. And unless cities take the step of pricing the use of their limited peak hour street capacity, they’re likely to be overwhelmed by this demand (and see the ride hailing business capture the economic rents associated with the use of the public right of way). Cities everywhere should be closely following the work being done in New York and San Francisco; as ride-hailing grows, and with the likely advent of fleets of autonomous hailed vehicles, these same issues will appear elsewhere.

Sisyphus meets Bob the Builder

Why traffic engineers really aren’t interested in reducing traffic congestion

We now know with certainty that investments in additional highway capacity in dense urban environments simply trigger additional travel, what we call “induced demand.” The phenomenon is so well-documented that a recent article called it “The Fundamental Law of Traffic Congestion.”

In a sense, this ought to be profoundly depressing to the traffic engineering profession. It implies that their work has been, and continues to be, the labor of Sisyphus, the mythical king condemned by the gods to push a huge boulder up a steep hill each day, only to have it come tumbling back down and need to be pushed up again the next day, and every day after that.

 

But traffic engineers hardly seem fazed by the experience. If the reports of the industry’s trade associations–the American Society of Civil Engineers (ASCE) and AASHTO (the American Association of State Highway and Transportation Officials)–are any indication, they’re positively delighted to tell you what a horrible job they’ve done in building insufficient capacity: there are lots and lots more rocks needing to be pushed uphill. This Sisyphean philosophy of highway engineering was perfectly, if unwittingly, captured in this Washington Post headline, from its traffic columnist, “Dr. Gridlock”:

[Screenshot: Washington Post “Dr. Gridlock” headline]

Given all this, you might think that the highway engineering profession would be composed of dour, deeply depressed individuals, frustrated by the collective failure–after decades of trying and hundreds of billions of dollars of spending–to reduce the scourge of traffic congestion. But no. These engineers view their jobs with the child-like enthusiasm of Bob the Builder (motto: “Can we build it? Yes we can!”). It doesn’t much matter what the “it” is; it’s building stuff that counts. Viewed in this light, induced demand isn’t so much a depressing, unsolvable problem as a never-ending excuse for operating heavy equipment, pouring concrete and building stuff. Cool!

 

 

Maybe it’s just that highway engineers have their own perverse spin on that mantra that “It’s about the journey, not the destination”—especially when it comes to building more roads. The inevitability of induced demand in urban settings means that trying to reduce congestion by widening highways means you’ll end up chasing your tail, forever. Which to them is a feature, not a bug—if you’re in the asphalt or concrete business, or are a highway engineer, that’s not a bad thing—it’s a guarantee of lifetime full employment. So little wonder that these asphalt socialists are really indifferent to whether multi-billion dollar highway projects have any effect on congestion at all.

So, for example, consider Houston’s Katy Freeway, at 23 lanes, America’s biggest. Texas has spent billions successively widening the road, each time producing more and more traffic and congestion. The latest multi-billion dollar project to widen the roadway, trumpeted by AASHTO as an example of how new construction could eliminate bottlenecks, generated so much additional traffic that travel times actually increased after it was opened.

The converse is also true. State departments of transportation, which will shed crocodile tears over the time lost to traffic congestion, routinely turn a blind eye to actual policies and investments that reduce congestion, like tolling. Take, for example, the ongoing travel behavior experiment being carried out in Louisville, Kentucky. There, Kentucky and Indiana have dropped a cool billion dollars on doubling the capacity of I-65 as it crosses the Ohio River near downtown Louisville. They widened I-65 from six lanes to twelve to reduce congestion. But what really reduced congestion was charging a toll: cars pay between $1 and $4 per crossing. What they’ve stumbled on, perhaps inadvertently, is the most powerful congestion-reducing technology any traffic engineer has ever seen: the toll gantry.

Actual, un-retouched photograph of 5 PM peak hour traffic on I-65 in downtown Louisville. Tolling the roadway eliminated traffic congestion.

Since implementing the tolls, traffic on the I-65 bridges has fallen by nearly half, from about 122,000 vehicles per day to just 66,000.  Even at rush hour, there are very few cars using the new tolled freeway.  Now you’d think that Louisville area engineers would be falling all over themselves telling the rest of the world how they’ve finally figured out a real live solution to traffic congestion. (They haven’t).

The truth seems to be that highway departments really only care about building stuff. They don’t care about traffic congestion, except as it provides a convenient excuse to demand more money to expand roads. Look carefully at the prominent icons of any highway department office:  you’ll see pictures of roads and bridges and overpasses, and awards from the concrete association, steel fabricators and asphalt contractors. As long as transportation spending is guided chiefly by this “Bob the Builder” mentality, we’ll just get more expensive roads, more traffic and more congestion.


Cultural appropriation: Theft or Smorgasbord?

If it weren’t for cultural appropriation, would America have any culture at all?

In Portland, two women opened a food cart business–Kooks Burritos–selling burritos based on ones they’d seen and tasted during a trip to Puerto Nuevo, Mexico. They were frank, telling reporters that they’d hung out watching local vendors prepare tortillas, to see if they could glean the tricks of the trade. Returning stateside, after some trial and error, they came up with a version they thought matched the original, and opened their business. What quickly ensued was a web-based war of words that lambasted the two Portland women for cultural appropriation–essentially profiting by stealing the knowledge of Mexican chefs. The storm of controversy, and death threats, prompted the women to close their business.

This isn’t an isolated issue: There’s even a controversy over the cultural ownership of Poutine. Quebecois are furious that it’s being rebranded as a “Canadian” dish, because it hails strictly from Quebec. “Poutine is a Québécois creation, not a Canadian one,  and suggesting otherwise ignores that poutine ‘has been used as a form of stigma against a minority group that is still at risk of cultural absorption.'”

The transnational appropriation of food has a long history. Marco Polo is generally credited with stealing the idea of noodles during his visit to China. (And apparently Japanese ramen is another cross-cultural noodle appropriation.) Thanks to cultural appropriation, some foods have gone from local oddities to global standards in just a generation or two. Prior to World War II, pizza was essentially unknown outside of Naples. Returning GIs brought it back to the US, and it spread globally (as did America’s faux-German “hamburger”). Why, if we’re concerned about cultural appropriation, isn’t someone insisting that Domino’s and Pizza Hut pay royalties to the Neapolitans? Maybe it has something to do with the fact that pizza itself is an amalgam of New World tomatoes and Old World ingredients. So apparently, we simultaneously have cultural appropriation and cultural imperialism: stealing pizza from the Neapolitans and foisting it off on the rest of the world.

Other key examples of cultural appropriation and adaptation abound in the food business:

  • Starbucks traces its inspiration to Howard Schultz’s trip to Italy in the early 1980s. He cribbed the café formula and even the job title “barista” from Italy’s coffee shops.
  • Oregon’s microbrew industry was led by pioneering firms like Widmer Brewing. Kurt Widmer studied the brewing arts in Germany and based the company’s signature Hefeweizen on what he learned there. (And now Widmer sells its beers in Europe.)

Imitation is the sincerest form of plagiarism

The website Uproxx has a summary of the web furor about the purloined burritos and an interesting roundtable conversation among four food journalists. In some respects, the criticisms of cultural appropriation from the foodie press are a bit rich: it’s an industry that consists in no small part of celebrating novelty and fashion, and elevating food heroes (who all borrow heavily from established chefs and cuisines). There’s a lot of back and forth here; to give you a flavor of the conversation, here’s food writer Zach Johnston:

Saying that people can’t cook another culture’s food that they adore and bring that food home to open it up to a wider audience is the same as saying Joe Rogan or Vince Mancini are culturally appropriating Brazilian culture because they practice Jiu Jitsu. Martial Arts — like cooking and eating — is a unifier, not a divider. And we can’t dismiss logistical reality. 70 percent of Americans are white. Cooking is a trainable and malleable endeavor. White people are going to make samosas, tacos, and bratwurst in America. And American food culture is better for it.

Portland’s alt-weekly Willamette Week has its own forum with local chefs representing a range of ethnic cuisines and backgrounds.  Chef Anh Luu, who runs a Vietnamese-Cajun restaurant in Portland, argues:

If you’re cooking Thai food outside of Thailand—even in Myanmar or China—it’s not gonna be authentic. All food travels around the world, and every culture has their own version. It’s all getting blown way out of proportion, and people are taking it too seriously. It’s food. If it’s good, eat it.

In that vein, Bloomberg View columnist Noah Smith is an unabashed advocate of cultural appropriation. He argues that it benefits both the appropriators and those from whom culture is appropriated. The appropriators get access to a wider variety of goods and services, beneficial mutations to their own products, and the technological change that cultural borrowing often triggers. Those whose culture is appropriated benefit from broader demand for their products, more jobs for immigrants, and greater cultural empathy.

And without cultural appropriation, it’s possible to get stuck in a very bad equilibrium. According to Paul Krugman, the reason that English food was so bad for so long (and has gotten better in recent decades) was because the country suffered from too little cultural appropriation.  But in recent years, the flood (at least pre-Brexit) of immigrants to the United Kingdom, and the holiday travels of the English exposed them (and their taste buds) to a wider range of choices, and as a result, English food has improved dramatically.

Plus, two women operating a food cart part-time is (forgive us) really small potatoes when it comes to cultural appropriation. It’s hard to see how one can get terribly upset with a couple of women imitating the food they saw and tasted on a trip (and tried to faithfully copy) while shrugging at a giant corporation bastardizing an entire nation’s cuisine (we’re looking at you, Taco Bell and Olive Garden).  If this is a problem at all, isn’t the real issue the huge imbalance of power between corporations and solo entrepreneurs? Here’s another example. The Dominguez family, based in Hood River, Oregon, manufactures an extremely popular brand of tortilla chips called “Juanita’s” (trust us:  they’re excellent), which is distributed in the Pacific Northwest.  A couple of years ago, an entirely new brand started showing up on store shelves in Oregon and Washington:  Josefina’s, with a similar red and green bag. Though it didn’t say so on the packaging, the Josefina’s chips were manufactured by the nation’s largest snack chip company (Frito-Lay).

Cities and the Cultural Re-Mix

Arguably, cultural appropriation and remixing is at the heart of what cities do. As Jane Jacobs wrote, cities bring together people with different backgrounds and ideas, and mix them serendipitously in ways that create the “new work” that drives progress. The sheer variety of different and interesting things that are available in cities is one of the chief attractions of urban living. The ever-changing smorgasbord of consumption choices–which borrow ideas from all over the world–are what make cities interesting and dynamic places to live. For those with a taste for variety, it turns out that the cost of living in big cities is actually lower than in other places.

Ultimately, a lot of the argument is over the ownership of ideas. We’re strong believers in a knowledge economy: the continuous development of new and better ideas (from microchips to drugs to better ways to sew a shirt or make a cup of coffee) makes us better off and propels economic growth. That said, who owns and who profits from any particular piece of knowledge is an unsettled and contentious area. For some things we grant very strong legal rights (patents, copyrights) and have private businesses that aggressively exploit their value (drug companies, for instance). In other areas, and food is one, it’s almost impossible to control intellectual property like recipes, and imitation and learning produce widespread spillovers.

One of the good things about knowledge as a factor of production is that it is, as Paul Romer has observed, non-rival. You and I can both make use of the same idea without diminishing its utility to either of us. And, for what it’s worth, we practice what we preach at City Observatory:  all of our work is published under a Creative Commons Attribution (CC-A) license, so anyone is free to copy, republish and reuse our content, subject only to the proviso that they acknowledge our original work.  So please, appropriate away.


Your college degree pays off more if you live in a city

The more education you have, the bigger the payoff to living in a city

It’s a well-understood fact that education is a critical determinant of earnings. On average, the more education you’ve attained, the higher your earnings. This holds both for individuals and, as we’ve shown, for metropolitan areas. But the payoffs to education also depend on where you live. As it turns out, for knowledge workers, pay levels are much higher in cities than in rural areas: on average, someone with a BA or graduate degree makes 25 to 35 percent more if they live in a city than if they live in a rural area. But for those with just a high school diploma, the urban earnings premium is much smaller, and for those who didn’t finish high school, there’s no gain at all.

The much higher returns to education in cities reflect the productivity of urban, knowledge driven economies, and explain why talented workers, especially mobile young ones, are increasingly moving to urban locations.

Let’s take a closer look at the data. The US Department of Agriculture’s Economic Research Service has analyzed data from the Census Bureau on the comparative earnings of urban and rural residents by educational level. The definition of “rural” in this case is the approximately one in seven Americans who live outside the nation’s metropolitan areas.  “Urban” is everything else.

The data show the regular stair-step relationship between educational attainment and average earnings: the more education you have, the higher your earnings, on average. This holds for both rural and urban areas.  But for those who live in urban areas, the returns to education are significantly higher. In 2015, those with a graduate or professional degree living in an urban area earned about $18,000 more than their rural counterparts; urban workers with a bachelor’s degree earned about $10,000 more than those with a BA in rural communities.  But at the lower levels of educational attainment, the differences almost disappear. High school dropouts earn almost exactly the same whether they live in an urban area or not.

The importance of urban location to capitalizing on educational attainment is even more apparent when we compute proportionately how much more urban workers earn than their similarly educated rural counterparts. Those with a professional or graduate degree earn one-third more than their rural counterparts. Those with just a BA earn 25 percent more. Those with more modest educations also earn a premium if they live in urban areas, but it declines to zero for those who haven’t finished high school.

Percent by which average urban earnings exceed rural earnings

It’s also the case that the return to education is higher within urban areas than within rural ones.  For example, in rural areas, a four-year college graduate makes about 50 percent more than a high school graduate.  But in urban areas, a four-year college graduate makes 75 percent more than a high school graduate. So the economic incentives (and rewards) for an education are relatively higher in urban areas than in rural ones. It’s also likely that these economic returns to education are more obvious to urban residents than to rural ones.
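Both comparisons, the urban premium at each education level and the return to a degree within each area, are simple ratio calculations. Here is a minimal sketch; the earnings figures are hypothetical, chosen only to be roughly consistent with the percentages cited above, and are not the actual ERS estimates:

```python
# Hypothetical average annual earnings, picked to match the cited
# percentages (not the actual USDA/ERS figures).
earnings = {
    ("rural", "hs"): 26_700,   # high school graduate, rural
    ("urban", "hs"): 28_600,   # high school graduate, urban
    ("rural", "ba"): 40_000,   # four-year degree, rural
    ("urban", "ba"): 50_000,   # four-year degree, urban
}

def urban_premium(level):
    """Percent by which urban earnings exceed rural earnings at one education level."""
    return earnings[("urban", level)] / earnings[("rural", level)] - 1

def return_to_ba(area):
    """Percent by which BA holders out-earn high school graduates within one area."""
    return earnings[(area, "ba")] / earnings[(area, "hs")] - 1

print(f"urban premium, BA: {urban_premium('ba'):.0%}")      # ~25%
print(f"urban premium, HS: {urban_premium('hs'):.0%}")      # ~7%
print(f"return to BA, rural: {return_to_ba('rural'):.0%}")  # ~50%
print(f"return to BA, urban: {return_to_ba('urban'):.0%}")  # ~75%
```

The point of the sketch is that the same underlying earnings table produces both patterns at once: a large urban premium for degree holders, a small one for high school graduates, and a steeper education gradient inside cities than outside them.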

This consistent set of relationships between urban-ness and the economic return to education has at least two important implications.

First, it shows that cities increase the productivity of workers. Cities tend to better match workers to jobs that capitalize on their skills and interests. Cities also help workers acquire new skills. And cities facilitate agglomeration economies, particularly by enabling and incentivizing the innovation that makes workers, firms and the entire economy more productive.

Second, it helps explain why well-educated workers tend to live in cities. If you can earn more living in a city, you are likely to do so. Since this relationship is stronger for more highly educated workers, they have a larger incentive to live in cities. Those with a high school diploma may only see a modest increase, and therefore have fewer incentives to move to (or stay in) cities. This may be affected by cost of living differences, too. If rural areas are cheaper, as they are for many things, particularly housing, this may make them more attractive to less educated workers (who generally spend a higher share of their income on housing, and are therefore relatively more sensitive to housing cost differentials). It also suggests that it will be very difficult to encourage talented workers to move to (or stay in) rural areas.

Cities and the returns to education

The more education you have, the bigger the payoff to living in a city

A recent Wall Street Journal article painted the nation’s rural areas as its new inner cities, with high rates of poverty, limited economic opportunity and a range of social problems.  While the aggregate data mask enormous variation among the nation’s non-metro areas, it got us thinking about the relationship between the payoffs to education and living in cities.


Integration and social interaction: Evidence from Intermarriage

Reducing segregation does seem to result in much more social interaction, as intermarriage patterns demonstrate

Change doesn’t happen fast, but it happens more frequently and more quickly when we have integrated communities

One of the regular critiques of urban integration is that while we might get people from different backgrounds to live in the same neighborhood, that doesn’t necessarily mean that they interact socially on a regular basis.  Earlier, for example, we took a close and critical look at Derek Hyra’s claim that mixed-income, mixed-race communities fell short of improving the lot of the disadvantaged because of the persistence of what he called “micro-segregation.”  Even though they might live in the same neighborhood, people from these different groups still associated primarily with other people with similar backgrounds. We thought there were a lot of problems with this argument (most notably, that the data show a strong positive impact of mixed income neighborhoods for the lifetime prospects of poor kids, notwithstanding micro-segregation).

Ruth Negga and Joel Edgerton star in “Loving,” the story of a couple that challenged the constitutionality of a Virginia ban on interracial marriage.

While we have some useful measures of residential segregation (compiled from Census data), it’s harder to come by data that illustrate the extent to which people from different racial and ethnic groups spend time associating with each other. A new report from the Pew Research Center sheds an interesting light on the most personal inter-group interaction: racial/ethnic intermarriage.  It has been half a century since the Supreme Court struck down state bans on interracial marriage in Loving v. Virginia. The data show that intermarriage has increased more than five-fold, from 3 percent of newlyweds in the 1960s to about 17 percent today. The trend has been propelled in part by the nation’s growing diversity, and in part by changing attitudes about intermarriage.  Pew used data from the most recent American Community Survey to calculate the rate of intermarriage between people from different racial and ethnic groups in each of the nation’s metropolitan areas. Pew’s ranking shows that intermarriage is much more common in some metros than in others. In the West, for example, intermarriage rates tend to be much higher than they are in the South.  (Urban areas have higher intermarriage rates than rural ones, as well.)

In part, these differences reflect regional variation in attitudes toward intermarriage. But the opportunities for intermarriage also hinge directly on the racial and ethnic composition of a metropolitan area.  More diverse areas tend to offer greater opportunity for intermarriage than more homogeneous ones. The University of Maryland’s Philip Cohen took the Pew data, compared it to the racial and ethnic diversity of each metropolitan area, and computed an adjusted intermarriage score for each metro:  given an area’s racial and ethnic composition, how much intermarriage did it exhibit? This ranking gives us a much clearer idea of where intermarriage is common and apparently socially acceptable, and where different racial and ethnic groups are, in practice, mixing. (See Cohen’s blog for full details.)

We thought we’d use these data to look at the correlation between metropolitan segregation and intermarriage. Given an area’s racial and ethnic diversity, are people from different groups more (or less) likely to intermarry depending on the segregation of the metro area? The following chart shows the white-non-white segregation index for each metro area (on the horizontal axis) compared to the demographically adjusted intermarriage rate (from Philip Cohen).  Higher values on the white-non-white segregation index correspond to higher levels of segregation; the index shows the percent of persons in a region who would have to move to a different census tract so that each tract would have the same white/non-white balance as the metropolitan area of which it was a part. (We extracted the segregation index data from an excellent commentary on housing diversity by Trulia’s Cheryl Young.)
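The dissimilarity index described above has a simple closed form: half the sum, over census tracts, of the absolute difference between each tract’s share of the region’s white population and its share of the non-white population. A minimal sketch, with made-up tract counts for illustration (real inputs would be tract-level census counts for a metro area):

```python
def dissimilarity(tracts):
    """White/non-white dissimilarity index.

    tracts: list of (white, nonwhite) population counts, one pair per census tract.
    Returns the share of either group that would have to move to a different
    tract so that every tract matched the metro-wide white/non-white balance.
    """
    total_white = sum(w for w, n in tracts)
    total_nonwhite = sum(n for w, n in tracts)
    return 0.5 * sum(
        abs(w / total_white - n / total_nonwhite) for w, n in tracts
    )

# A completely segregated two-tract metro scores 1.0;
# a metro where every tract mirrors the regional balance scores 0.0.
print(dissimilarity([(1000, 0), (0, 1000)]))    # 1.0
print(dissimilarity([(500, 500), (500, 500)]))  # 0.0
```

On this 0-to-1 scale, the Philadelphia and Austin figures cited below (.65 and .38) fall between the two extremes, with Philadelphia much closer to the segregated end.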

These data show a strong negative correlation between segregation and intermarriage. People who live in highly segregated metropolitan areas are much less likely to marry someone from a different racial and ethnic group than those who live in the least segregated areas. Compare, for example, Philadelphia and Austin.  Philadelphia is one of the most segregated large cities (dissimilarity index .65); Austin one of the least segregated (.38). Philadelphia’s intermarriage rate is about half that of Austin’s (.16 vs. .32).

It’s possible to imagine that the correlation between segregation and intermarriage reflects both personal opportunities and social values. In less segregated communities, people from different racial and ethnic groups are–by definition–more likely to come into close proximity to one another. But segregation may also reflect (or influence) broader social attitudes about whether interracial relationships are tolerated. These data are very consistent with the notion that greater physical integration of people from different racial and ethnic groups is associated with greater inter-personal interaction.

Of course, the usual caveats about correlation not proving causation apply to this analysis. But it is nonetheless striking that after controlling for the diversity of metropolitan population, intermarriage is much more common in places with low levels of segregation than in places that are more highly segregated. This evidence is highly consistent with the thesis that social interaction among people from different racial and ethnic groups is enhanced by greater integration.


Socioeconomic mixing is essential to closing the Kumbaya gap

Integrated neighborhoods produce more mixing, but don’t automatically generate universal social interaction. What should we make of that?

Our recent report, America’s Most Diverse, Mixed Income Neighborhoods identifies those places where people from different racial and ethnic backgrounds and from different income strata all live in close proximity to one another. We’ve counted more than 1,300 neighborhoods with nearly 7 million residents that have high levels of racial/ethnic and income integration. In these places, at least, people from different backgrounds share a common neighborhood.  But is that enough? Some critics complain that while people may live close to one another in such places, there may be little meaningful interaction. Today we consider this issue.

In one idealized view of the world, economically integrated neighborhoods would have widespread and deep social interactions among people from different backgrounds. We’d tend to be color-blind and class-blind, and no more (or less) likely to interact with people from different groups than with people similar to ourselves. In practice, even in neighborhoods with a high degree of racial or income diversity, people still tend to associate primarily with people like themselves. Even in the most integrated neighborhoods, there’s a “kumbaya” gap. Should we regard that as a sign of failure?

That’s the argument that Derek Hyra makes about gentrifying neighborhoods, like U Street in Washington, DC. Blacks and whites, rich and poor, live in close proximity to one another, but primarily associate only with people like themselves in daily life. Last week’s CityLab article interviewing Hyra is entitled “Gentrification doesn’t mean diversity.” The article’s URL is “gentrifying neighborhoods aren’t really diverse.”

The point Hyra actually makes isn’t that the neighborhoods aren’t diverse, per se, but that within the neighborhoods, people still associate primarily with people with similar demographic characteristics. We may have alleviated segregation at one level, but in personal interactions, there’s still “micro-segregation.”

CityLab’s Tanvi Misra interviews Hyra about his new book, Race, Class, and Politics in the Cappuccino City.  Hyra observes that Washington, DC’s U Street neighborhood is now more racially and economically diverse, but notes that it’s still the case that people mostly associate with others of similar backgrounds in places like churches, stores and coffee shops. His argument seems to be: sure, it’s great that so-called gentrifying neighborhoods are more integrated, but since people of different races and classes aren’t socializing directly, it’s basically a failure.  From the interview:

Elaborate on what’s positive and what’s problematic about this change, and with this perception of the neighborhood.

We have been so segregated in the United States and that now that whites are attracted and willing to move into what was formerly a low-income African-American neighborhood does symbolize some progress, in terms of race relations in the United States. That we have mixed-income, mixed-race neighborhoods, I think, is a very positive thing.

But that diversity not necessarily benefiting the former residents. Most of the mechanisms by which low-income people would benefit from this change are related to social interaction—that low-, middle-, and upper-income people would start to talk to one another. They would problem solve with one another. They would all get involved civically together to bolster their political power. But what we’re really seeing is a micro-level segregation. You see diversity along race, class, sexual orientation overall, but when you get into the civic institutions—the churches, the recreation centers, the restaurants, the clubs, the coffee shops—most of them are segregated. So you’re not getting a meaningful interaction across race, class, and difference. If we think that mixed-income, mixed-race communities are the panacea for poverty, they’re not.

Is the failure to reach maximum kumbaya really an indication that more socioeconomic mixing isn’t a good thing?  We don’t think so, for several reasons  First, unless you first get mixed income, mixed race neighborhoods, you have almost no chance having the opportunity for regular  social interactions. When we live in neighborhoods widely segregated by race and/or income its even more difficult to establish these boundary-crossing personal relationships. Socioeconomic mixing is necessary, even if it alone isn’t sufficient–especially immediately–to produce deeper interactions.

Second, “kumbaya” integration is probably an unrealistic goal: even within our neighborhoods (and socioeconomic groups) we do spend our personal time disproportionately with people who share our own peculiar interests. That’s true even within economically homogenous neighborhoods: people tend to spend much more time and develop stronger relationships with people most like them.

Third:  The evidence of overwhelming that mixed income neighborhoods (kumbaya or not) have big benefits, especially for lower income kids.  They get more resources, can access stronger networks, find better partners and career paths, etc.  The evidence from the Equality of Opportunity project, led by Raj Chetty, the research of Patrick Sharkey, and Eric Chyn’s study of Chicago Housing Authority residents all confirm that simply moving to a more mixed income neighborhood materially improves the life outcomes of poor kids. In addition, an important aspect of the socioeconomic mixing in the civic commons is promoting the kind of interactions that help us develop an awareness–imperfect and incomplete as it may be–that there are real people who have very different lives and expectations than we do.

Fourth, we know what happens when people don’t have this kind of first hand familiarity with a more diverse population. It shows up plainly in the results of the last election.  People who lived in communities with limited exposure to immigrants, or in neighborhoods that were predominantly white, segregated enclaves were much more likely to vote for Donald Trump than Hillary Clinton, even after controlling for other characteristics (party affiliation, age, and income) than others.  After sifting through national polling and demographic data Gallup’s  Jonathan Rothwell concludes:

“The analysis provides clear evidence that those who view Trump favorably are disproportionately living in racially and culturally isolated zip codes and commuting zones. Holding other factors, constant support for Trump is highly elevated in areas with few college graduates and in neighborhoods that standout within the larger commuting zone for being white, segregated enclaves, with little exposure to blacks, Asians, and Hispanics.”

The more separated we are from one another, the more likely we are to not support broad-based policies that promote equality and opportunity.  In the absence of more U Streets, we get policies that produce more and more segregated suburbs and neighborhoods of concentrated poverty.  We shouldn’t fixate on the failure of U Street to achieve some imaginary ideal; instead we should recognize that its essential to do many more “U Streets”  just to offset the scale of the segregation everywhere else. Fighting segregation comes first; Kumbaya will come, if it comes at all, later.

If we set impossibly high expectations about the nature of integration, and when we’re provided with anecdotes that recent and long-time residents in a community don’t associate much with one another, it’s tempting–but wrong–to conclude the whole thing was an epic fail.  As with so much in this field, that makes the perfect the enemy of the good, or at least the somewhat better.

 

Integration and the Kumbaya gap

Gentrifying neighborhoods produce more mixing, but don’t automatically generate universal social interaction. What should we make of that?

In one idealized view of the world, economically integrated neighborhoods would have widespread and deep social interactions among people from different backgrounds. We’d tend to be color-blind and class-blind, and no more (or less) likely to interact with people from different groups than with people similar to ourselves. In practice, even in neighborhoods with a high degree of racial or income diversity, people still tend to associate primarily with people like themselves. Even in the most integrated neighborhoods, there’s a “kumbaya” gap. Should we regard that as a sign of failure?

That’s the argument that Derek Hyra makes about gentrifying neighborhoods, like U Street in Washington, DC. Blacks and whites, rich and poor live in close proximity to one another, but primarily associate only with people like themselves in daily life. Last week’s CityLab article interviewing Hyra is titled “Gentrification doesn’t mean diversity.” The article’s URL is “gentrifying neighborhoods aren’t really diverse.”

The point Hyra actually makes isn’t that the neighborhoods aren’t diverse, per se, but that within the neighborhoods, people still associate primarily with people with similar demographic characteristics. We may have alleviated segregation at one level, but in personal interactions, there’s still “micro-segregation.”

CityLab’s Tanvi Misri interviews Hyra about his new book, Race, Class and Politics in the Cappuccino City. Hyra observes that Washington, DC’s U Street neighborhood is now more racially and economically diverse, but notes that it’s still the case that people mostly associate with others of similar backgrounds in places like churches, stores and coffee shops. His argument seems to be: sure, it’s great that so-called gentrifying neighborhoods are more integrated, but since people of different races and classes aren’t socializing directly, it’s basically a failure. From the interview:

Elaborate on what’s positive and what’s problematic about this change, and with this perception of the neighborhood.

We have been so segregated in the United States, and that now whites are attracted and willing to move into what was formerly a low-income African-American neighborhood does symbolize some progress in terms of race relations in the United States. That we have mixed-income, mixed-race neighborhoods, I think, is a very positive thing.

But that diversity is not necessarily benefiting the former residents. Most of the mechanisms by which low-income people would benefit from this change are related to social interaction—that low-, middle-, and upper-income people would start to talk to one another. They would problem solve with one another. They would all get involved civically together to bolster their political power. But what we’re really seeing is a micro-level segregation. You see diversity along race, class, sexual orientation overall, but when you get into the civic institutions—the churches, the recreation centers, the restaurants, the clubs, the coffee shops—most of them are segregated. So you’re not getting a meaningful interaction across race, class, and difference. If we think that mixed-income, mixed-race communities are the panacea for poverty, they’re not.

Is the failure to reach maximum kumbaya really an indication that more socioeconomic mixing isn’t a good thing? We don’t think so, for several reasons. First, unless you first get mixed-income, mixed-race neighborhoods, you have almost no chance of having the opportunity for regular social interactions. When we live in neighborhoods widely segregated by race and/or income, it’s even more difficult to establish these boundary-crossing personal relationships. Socioeconomic mixing is necessary, even if it alone isn’t sufficient, especially immediately, to produce deeper interactions.

Second, “kumbaya” integration is probably an unrealistic goal: even within our neighborhoods (and socioeconomic groups) we spend our personal time disproportionately with people who share our own peculiar interests. That’s true even within economically homogeneous neighborhoods: people tend to spend much more time, and develop stronger relationships, with people most like them.

Third, the evidence is overwhelming that mixed-income neighborhoods (kumbaya or not) have big benefits, especially for lower-income kids. They get more resources, can access stronger networks, find better partners and career paths, and so on. The evidence from the Equality of Opportunity Project, led by Raj Chetty, the research of Patrick Sharkey, and Eric Chyn’s study of Chicago Housing Authority residents all confirm that simply moving to a more mixed-income neighborhood materially improves the life outcomes of poor kids. In addition, an important aspect of socioeconomic mixing in the civic commons is promoting the kind of interactions that help us develop an awareness, imperfect and incomplete as it may be, that there are real people who have very different lives and expectations than we do.

Fourth, we know what happens when people don’t have this kind of firsthand familiarity with a more diverse population. It shows up plainly in the results of the last election. People who lived in communities with limited exposure to immigrants, or in neighborhoods that were predominantly white, segregated enclaves, were much more likely to vote for Donald Trump than Hillary Clinton, even after controlling for other characteristics such as party affiliation, age, and income. After sifting through national polling and demographic data, Gallup’s Jonathan Rothwell concludes:

“The analysis provides clear evidence that those who view Trump favorably are disproportionately living in racially and culturally isolated zip codes and commuting zones. Holding other factors constant, support for Trump is highly elevated in areas with few college graduates and in neighborhoods that stand out within the larger commuting zone for being white, segregated enclaves, with little exposure to blacks, Asians, and Hispanics.”

The more separated we are from one another, the less likely we are to support broad-based policies that promote equality and opportunity. In the absence of more U Streets, we get policies that produce more and more segregated suburbs and neighborhoods of concentrated poverty. We shouldn’t fixate on the failure of U Street to achieve some imaginary ideal; instead we should recognize that it’s essential to create many more “U Streets” just to offset the scale of the segregation everywhere else. Fighting segregation comes first; kumbaya will come, if it comes at all, later.

If we set impossibly high expectations about the nature of integration, then when we’re provided with anecdotes that recent and long-time residents in a community don’t associate much with one another, it’s tempting, but wrong, to conclude the whole thing was an epic fail. As with so much in this field, that makes the perfect the enemy of the good, or at least the somewhat better.

 

Just ahead: Road pricing?

Trump’s infrastructure package would let states pursue road pricing

A trillion dollars for infrastructure. That’s been the headline talking point for months about the Trump Administration’s policy agenda, but the details have been murky at best. A short white paper prepared for the campaign by now-Commerce Secretary Wilbur Ross sketched out a plan for tax credits leveraging private investments, but that approach appears to be morphing into something different.

If the road ahead were priced correctly, it wouldn’t be congested.

A new budget fact sheet released by the Office of Management and Budget earlier this month is starting to sharpen the focus on some of the potential details of the administration’s upcoming infrastructure package. It now appears that Transportation Secretary Elaine Chao is taking the lead here. She said that the administration plans to implement long-run changes in how “projects are regulated, funded, delivered and maintained.” That appears to be code for a sharply reduced federal role in funding things like highways and transit, shifting more of the responsibility to state and local governments and the private sector. Apparently, general fund support for the highway trust fund will be pared back, reducing money for new roads and essentially eliminating future “new starts” transit projects. The popular TIGER grant program is also on the chopping block. The plan calls for incentivizing non-federal funding, which would include additional state and local funding as well as privately financed projects.

In the realm of transportation, one of the key “incentives” is easing the current restrictions on tolling interstate freeways. Currently, it’s difficult to impose tolls on federally financed roads. The proposal calls both for expanding an existing policy that allows states to explore road pricing on a pilot basis, and for more generally eliminating the ban on tolling existing roads. From the fact sheet:

  • Incentivize Innovative Approaches to Congestion Mitigation. The Urban Partnership Agreement Program – and its successor, the Congestion Reduction Demonstration Program – provided competitive grants to urbanized areas that were willing to institute a suite of solutions to congestion, including congestion pricing, enhanced transit services, increased telecommuting and flex scheduling, and deployment of advanced technology. Similar programs could provide valuable incentives for localities to think outside of the box in solving long-standing congestion challenges.
  • Liberalize Tolling Policy and Allow Private Investment in Rest Areas. Tolling is generally restricted on interstate highways. This restriction prevents public and private investment in such facilities. We should reduce this restriction and allow the States to assess their transportation needs and weigh the relative merits of tolling assets. The Administration also supports allowing the private sector to construct, operate, and maintain interstate rest areas, which are often overburdened and inadequately maintained.

According to Politico, these proposals have already prompted an outcry from truckers and truck stop operators. Truckers do not like paying tolls, and existing truck stop operators recognize that privatized highway rest areas would likely be formidable competitors. But beyond the objections of these two groups, there are actually good reasons to welcome a federal policy that would allow states to implement tolls more broadly.

Despite the widespread mythology that cars and trucks pay for the roads they drive on, the truth is that gas taxes and vehicle registration fees don’t come close to paying the construction and maintenance costs of roads, much less the social and environmental costs they create. The Congressional Budget Office (the same group that scores health care initiatives) estimates that over-the-road trucks alone are subsidized to the tune of between $57 and $128 billion per year.

Implementing road pricing more widely is one way to assure that those who benefit from the road system pay the costs of constructing and maintaining it. Just as importantly, pricing is a way to manage demand and reduce congestion. If we priced peak-hour travel more accurately, we would likely diminish traffic and avoid the need to spend additional billions on road capacity, an expensive lesson that Louisville is learning.
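To see why pricing manages demand, here’s a stylized sketch in Python. All of the numbers below are invented for illustration, not drawn from any actual road; the point is simply the mechanism, not the magnitudes:

```python
# Stylized sketch: with a linear demand curve for peak-hour trips,
# a modest toll can trim demand to the road's capacity, keeping
# traffic flowing instead of requiring billions for new lanes.
# All numbers below are invented for illustration.

def peak_demand(toll, free_demand=5_000, sensitivity=250):
    """Trips demanded per hour at a given toll; demand falls by
    `sensitivity` trips for each dollar of toll."""
    return max(free_demand - sensitivity * toll, 0)

capacity = 4_000  # vehicles per hour the road carries at free-flow speed

# At a zero toll, demand (5,000) exceeds capacity (4,000): congestion.
# The toll that trims demand exactly to capacity:
clearing_toll = (peak_demand(0) - capacity) / 250

print(f"Clearing toll: ${clearing_toll:.2f}")                    # $4.00
print(f"Volume at that toll: {peak_demand(clearing_toll):.0f}")  # 4000
```

Real-world demand curves are messier than this, but the logic is the same: a price, unlike a new lane, reduces peak demand directly.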

While there’s a lot to be concerned about in the Trump infrastructure proposal, particularly its vague and open-ended suggestions about eliminating the kinds of environmental review standards that have helped block some of the worst boondoggles, when it comes to road pricing, urbanists might well be advised to heed the advice of Kim-Mai Cutler: when life hands me lemons, I make tarte au citron.

 

 

Back at the ranch

What the ranch house teaches us about house prices and filtering.

Back in the heyday of the post-war housing boom, when the baby boomers were babies, America was building ranch houses, millions of them. In its prime, the ranch house was the embodiment of the middle-class dream, and newly built suburbs across the nation were full of them. The relative modesty and similarity of the ranch home is a good reflection of the nature of the housing market in those years. And, as we’ll see, what’s happened to the ranch house since then is an important reflection of the way housing markets work, or don’t work, today.

Sunset Magazine helped popularize this California style when it published a book full of Cliff May’s ranch home designs in 1946, and the idea spread, with modest regional adaptations, to the entire country and was undoubtedly the dominant housing type of the 1950s. By 1950, the ranch style accounted for nine out of every ten new homes built in the country. Even Levittown had its own “Rancher” model, a 2-bedroom, one-bath starter home with room for expansion, that sold for as little as $8,900 in 1953.  The kind of wide regional variations in housing prices we see today were much more muted in the 1950s. Contemporary home price listings show that in the mid to late 1950s ranch homes sold for prices in the mid-teens all over the country from Mansfield, Ohio ($17,900 in 1954), to Pittsfield, Massachusetts ($14,300 in 1957) to Appleton, Wisconsin ($16,500 in 1959).

The American Dream, circa 1955. (www.midcenturyhometyle.com)

Even with the regional variations in materials and finishes, the ranch homes and their close variants shared the same modest proportions: two or three bedrooms, all constructed on a single level, with one or occasionally two bathrooms, and averaging 1,000 or perhaps 1,200 square feet. They usually had a single-car garage or carport. It was the idealized, suburban middle-class lifestyle that skid row shop clerk Audrey (Ellen Greene) dreamed of in Little Shop of Horrors.

More than a half century later, the ranch house has fallen out of style. American homebuyers now want bigger houses, with multiple levels, fancier finishes and architectural details that would never fit on the simple ranch house. But while we’re not building any new ones, most of the old ranches are still around, although they now occupy a considerably different place in the housing spectrum than they did when they were new.  The median owner-occupied home in the United States was built just before 1980, meaning that almost everywhere, ranch homes are now much older than the average house on the market.

If housing were simply about shelter, then you would expect all of these houses that commanded roughly the same price when they were built 60 or more years ago to have experienced roughly the same amount of depreciation over time. But that’s far from the case. To get an idea of how much variation there is across markets in the contemporary value of ranch homes, we combed through Zillow home listings to find some 1950s era smaller homes that are for sale now.  (Our selections were haphazard and don’t represent a random or scientific sample, but were chosen to represent how much variation there is across markets). Here’s what we found:

City          Listing
Cleveland     $25,000
Kansas City   $25,000
Atlanta       $60,000
Houston       $145,000
Portland      $275,000
Denver        $325,000
Menlo Park    $1,650,000

Depending on where you are, today your ranch home could cost anywhere from as little as $25,000, in some neighborhoods in Cleveland and Kansas City, to as much as $1.65 million in one of the tonier suburbs of Silicon Valley (Menlo Park). The reason for the difference has little to do with the houses themselves (though there’s no doubt the Menlo Park model is much more nicely painted and maintained than its low-priced cousins). It has everything to do, instead, with the housing markets they’re situated in, and the way in which those markets are regulated.

In markets with weak demand for housing, because of slow economic growth or decline, or simply the hollowing out of first-tier suburbs because of sprawl (Cleveland, Kansas City), the humble ranch home is very cheap. The ranch home is also pretty darn cheap in places with a highly elastic housing supply (Atlanta, Houston), where it’s relatively easy to build new houses, or bigger ones in place of smaller ones. But the higher the demand in a housing market, and the greater the constraints on building new units, the higher the price the ranch home commands, despite its advanced age, limited adornment, and cramped rooms. With its short commute to Silicon Valley, the tiny old ranch home commands a price that would easily buy a palatial new McMansion in any other part of the country.
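To put a number on that gap, here’s a quick sketch using the haphazard, non-scientific listings from the table above:

```python
# The spread in asking prices for near-identical 1950s ranch houses,
# using the haphazard Zillow listings from the table above.

listings = {
    "Cleveland": 25_000,
    "Kansas City": 25_000,
    "Atlanta": 60_000,
    "Houston": 145_000,
    "Portland": 275_000,
    "Denver": 325_000,
    "Menlo Park": 1_650_000,
}

cheapest = min(listings, key=listings.get)
priciest = max(listings, key=listings.get)
spread = listings[priciest] / listings[cheapest]

print(f"{priciest} vs. {cheapest}: {spread:.0f}x price gap")
```

The gap works out to a 66-fold difference between essentially identical bundles of building materials, which is the heart of the argument: the price resides in the market and its regulation, not the structure.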

As a result, to varying degrees, in most parts of the country, these aging ranch homes are a primary source of affordable housing opportunities. If you are a first-time home buyer, or have a limited income, the ranch home may look like a reasonably priced fixer-upper that enables you to enjoy the benefits of homeownership. But in those places where the housing supply is constrained, households that would otherwise be buying big new homes find themselves bidding up the price of little old ranch houses. So instead of “filtering” down market as they age, our ranch home continues to be occupied by upper income households. (It may even move up-market).

The lesson here is that houses are not merely bundles of sticks, and the affordability of housing isn’t determined by the historic sunk costs of the materials that went into building them. Judged by a physical assessment of the bill of materials that went into their construction (so many yards of concrete, bricks, board feet of two-by-fours, square feet of plywood, lengths of pipe and wire, and number of windows), the seven surviving ranch houses here are nearly indistinguishable from one another. What makes some affordable, and others not, is the policy decisions made over the subsequent decades about how easy or difficult it would be to build additional housing nearby.

Dirt cheap.

Why we’re very skeptical about urban farming.

At City Observatory, we don’t tend to have a lot of content about agriculture. Farming is not an urban activity. But every so often, we read techno-optimistic stories about how a new era of hyper-local food, grown in your neighborhood or very nearby, is just around the corner or coming soon to a city near you. The latest of these appeared at Fast Company last week, in an article asking: “Has this Silicon Valley Startup finally nailed the indoor farming model?”

Adele Peters writes about a company called Plenty, which is working on a vertical farming system. They’ve raised $26 million in venture capital to develop technologies they’re using to grow kale and a range of herbs in a mixture of ground-up plastic bottles in clear vertical tubes, aided by hyper-efficient LED lights that speed up the growth process. Plenty’s business model calls for growing products closer to customers, so that produce is fresher, better tasting and healthier. And, at least in theory, shipping product shorter distances allows them to grow more fragile or more perishable varieties than crops that have to travel for days or thousands of miles.

There are a lot of cool aspects to this product. Peters extols its tastiness, regaling us with stories about how celebrity chef Rick Bayless pronounced the product surprisingly good, and shares Plenty’s claim that vertical farming is 350 times more space-efficient than conventional dirt-based agriculture. But while the story is replete with VC-pitch talking points about the efficiency of some aspects of the indoor farming model (purportedly 1 percent as much water use as field crops; 30 miles to the consumer, not 3,000; plus LED lights that are 64 times more cost-effective than those available a few years ago), one fact is conspicuously missing from the narrative: how much will consumers be asked to pay for indoor-grown kale and basil? As a comparative price point, it’s worth noting that it’s already possible to buy live butter lettuce (root ball attached) for $3.89, and live basil plants (Trader Joe’s sells them nationally for $3-$4).

There are good reasons to be dubious of the economics of urban agriculture. Plenty’s technology requires that they pay urban or retail prices for things like water, energy, growing substrate and space. With conventional farming, water is cheap or even free (rainfall, or water that landowners get by right), and the energy content of food comes directly from the sun (again at zero price, bundled with the land). And rural land is orders of magnitude cheaper than urban land. The average value of an acre of cropland in the US is about $4,000, or roughly 10 cents a square foot. Meanwhile, typical prices for bare land in urban areas are frequently on the order of $10 to $20 per square foot (and higher in big cities). So just for starters, vertical farming has to be at least 100 times more efficient than conventional farming to offset land costs (and that’s without considering the capital costs of building indoors). To economists, the high price of urban land isn’t so much an obstacle to urban or indoor farming as it is an indicator telling us that we have much better uses for our limited supply of urban land than growing food. Every acre of city land not available for housing, for example, pushes residents further out (often onto farm land), and entails more driving for all trips.
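The land-cost arithmetic in that paragraph can be sketched out explicitly. The figures below are the rough ones cited above (approximations, not precise data):

```python
# Back-of-the-envelope comparison of land costs for conventional
# vs. urban farming, using the rough figures cited above.

SQFT_PER_ACRE = 43_560

cropland_per_acre = 4_000  # approximate US average, dollars per acre
cropland_per_sqft = cropland_per_acre / SQFT_PER_ACRE  # roughly $0.09

urban_low, urban_high = 10, 20  # typical bare urban land, dollars/sq ft

ratio_low = urban_low / cropland_per_sqft
ratio_high = urban_high / cropland_per_sqft

print(f"Cropland: about ${cropland_per_sqft:.2f} per square foot")
print(f"Urban land: roughly {ratio_low:.0f}x to {ratio_high:.0f}x dearer")
```

Run the numbers and urban land comes out on the order of 100 to 200 times more expensive per square foot, which is why the "100 times more efficient" hurdle is, if anything, conservative.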

It’s also worth noting that the technology for quickly moving fragile and perishable plant-based products across the globe is fully mature, and surprisingly inexpensive. For example, we get tulips from the Netherlands and raspberries from Chile at prices that are competitive with US-grown product.

Little surprise, then, as Peters mentions in the article’s tenth paragraph, that several startups that have tried indoor farming (including Farmed Here, Local Food, and Podponics) have gone belly up, taking millions of dollars of investors’ funds with them. Whether Plenty will succeed where others have failed most likely hinges on the unasked question of what price it will charge for its produce.

The headline of this Fast Company article is a classic instance of what we call “Hertz’s law,”* named for our City Observatory colleague Daniel Kay Hertz, who observed that when a story or blog post title is framed as a yes/no question, the best answer to that question is almost always “no.” “Has this Silicon Valley Startup finally nailed the indoor farming model?” In this case, we believe Hertz’s law applies. But we’ll stay tuned to see whether Plenty can make the math work. We just hope the next business reporter who writes a story about this company asks a hard question or two about the retail price of the end product.

By the way, our pessimism isn’t because we’re somehow not enamored of fresh greens. Here at City Observatory’s headquarters in Portland, a small patch of arugula grows year-round. It is picked minutes before it’s served, and travels about 75 feet to where it’s consumed.

Urban arugula.

While we agree there’s undoubtedly a market for premium-grade produce, it’s probably not a very large one, and competition in the high end of the food business is increasingly fierce (as Whole Foods has learned). Indoor farming is likely to be a niche technology for some specialized products, but as long as dirt is cheap and sunlight is free, it’s unlikely to be much of a challenge to open-air, soil-based farming. While we believe cities do most things better than anywhere else, efficiently growing enough food to feed several billion people is the thing they’ll probably continue to do less well.

* Several commenters have alerted us to the much longer lineage of this observation, which goes under the name “Betteridge’s Law.”

Let’s use a marketing campaign to solve traffic congestion

Here’s a thought: let’s fight traffic congestion using the same techniques DOTs use to promote safety.

Let’s have costumed superheroes weigh in against congestion, and spend billions on safety, instead of the other way around.

Why don’t we insist that drivers take responsibility for the length of their commutes?

Today marks the first day of “National Pedestrian Safety Month,” and there’s a new federally funded marketing campaign to raise awareness. Tragically, as Streetsblog’s Kea Wilson documents, the campaign is rife with myths and centers on victim-blaming, while ignoring the deep-seated and systemic reasons why our road system regularly kills and injures thousands of pedestrians.

But this PR approach to public safety betrays an even deeper bias in the way we approach two different aspects of transportation: safety and congestion. There’s a profound disconnect in transportation policy: when it comes to fighting congestion, highway departments spare no expense. They’ll write billion-dollar checks to widen a mile or three of urban freeway in the name of speeding traffic (if only a little bit, and for a short time). But when it comes to safety, those same highway departments turn stingy, and manage only to find a few pennies, mostly for sanctimonious and ineffectual public service announcements.

Vision Zero is just the latest iteration of a long string of campaigns to promote greater safety. And if you’ll notice, the emphasis is on “campaign,” as in communication campaign, marketing campaign or public awareness campaign. It regularly manifests itself in victim-blaming, victim-shaming ideas like “distracted walking,” and blatantly ignores the evidence that the surge in road fatalities since 2014 is directly correlated with the uptick in driving from lower gas prices. No one is more privileged in American society than the driver of a motor vehicle. Drivers are regularly excused from any consequences when the machines they operate kill other human beings.

So here’s today’s modest proposal:  Let’s turn those priorities around.  Let’s have highway departments use marketing campaigns to fight congestion, and spend real dollars on improving safety.  Let’s have a potent message of personal responsibility when it comes to commutes and congestion, and make moving the needle on traffic deaths and injuries the arbiter of where we spend transportation system dollars.

Why do we always fight safety problems with P. R. campaigns?

The death toll for pedestrians is rising. But as far as highway departments and auto advocates are concerned, the appropriate way to tackle this problem is through communication and public education. New York City has spent $2.5 million on a dozen billboards and several hundred bus placards proclaiming (among other things) “Car crashes are not accidents.  Your choices behind the wheel matter.” Los Angeles has spent $2 million to commission eight community groups and artists organizations to teach their friends and neighbors about the role everyone can play in reducing traffic deaths. Some cities have colorful characters to stir the public’s imagination.

San Diego’s safety efforts are led by Captain Vision Zero. Why don’t we have cartoon characters fight congestion, and spend real money on safety, instead of the other way ’round?

It’s cute and eye-catching, but it betrays the trivialization of the problem.  Writing about one such effort, Curbed’s Alissa Walker tweeted:

This Wednesday is #NationalWalktoSchoolDay (it’s also #CleanAirDayCA). Come show your support for students who have to use streets so dangerous they require superpowers to cross them.

They shouldn’t need superheroes. They need elected officials who care.

And too often, so-called safety campaigns are thinly-veiled messages that blame pedestrians for car crashes, like a now-cancelled ad produced in Portland.

If you have a strong sense of deja vu when it comes to public relations-centered traffic safety efforts, there’s a good reason for that.  Every few years we come out with a new set of slogans and a marketing campaign to match. As Jim Wilcox wrote on the Oregon Transportation Forum in 2015:

In 2008, following increased concerns about pedestrian and bicycle safety, leaders in Portland and Eugene got behind a safety campaign called “Eye to Eye“. Partnering with AAA, funds were procured, PR was purchased, graphics were created, media was bought, and local safe transportation advocates stood with civic leaders to roll out the campaign as the media clicked away.

However, within a year or two, the campaign lost steam as already limited funds dried up. But people felt like they did something,  helping to assuage the sense of responsibility for continued pedestrian deaths.

. . . Our long history of failing to achieve significant increases in pedestrian safety leaves a lot to be said and done. I think we should start thinking of the next campaign because there will be another after more pedestrians are killed. I suggest it be called “Seeing Red”.

So be it.  But if a good marketing campaign is the right approach to achieving road safety, then let’s use the same approach to traffic congestion. Instead of building new roads–which just subsidize bad behavior and encourage more trip making, more pollution and more sprawl–let’s educate drivers on how to avoid congestion.

Where’s “personal responsibility” when we talk about traffic congestion?

One of the favored themes of these safety PR campaigns is “responsibility.” Everyone has to take greater responsibility for the impacts of their actions on the safety of other road users.  Here’s the boilerplate from a typical campaign from North Carolina:

Why don’t we have highway agencies that tell people that the length of their commute is their personal responsibility? They decide where to live, they choose which jobs they’re going to apply for and take, and they decide how long a commute, by what means, and at what time they’re going to travel. Let’s have some public service ads encouraging commuters to own up to the consequences of their choices, rather than just assuming we ought to throw more money at freeways to (maybe) make their commute a little easier.

Ironically, most Americans have already figured out how to minimize lengthy commutes and their exposure to traffic congestion. Fully 58 million American commuters spend less than 20 minutes in their journey to work, according to the American Community Survey.  They’ve made choices about where to live, and found jobs close enough to home that they are only minimally exposed to traffic congestion.

But there’s much we can do as individuals to reduce traffic and our travel times.  We can travel at off-peak hours, and negotiate flexible work schedules that let us travel at times when roads (and transit services) are less congested.  We can telecommute more.

We can walk, ride bikes, take transit and carpool instead of driving our private cars by ourselves. All of these modes of travel let us carve out valuable time for healthy exercise (biking and walking), time to read, rest or listen to music or audiobooks (transit), or socialize with friends and colleagues (carpooling). Studies have shown that nothing has bigger negative effects on personal happiness than long solo commutes.

And we can encourage and support others who pursue these kinds of alternatives.

Finally, when we move or look for a new job, we can put a premium on housing or employment options that have a short commute. A short commute is like a permanent, tax-free raise: You get to spend less time engaged with work-related tasks.
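The arithmetic behind that “raise” is easy to sketch. Here’s a minimal illustration; all of the commute lengths and workday counts below are hypothetical assumptions, not figures from this post:

```python
# Hypothetical illustration: the annual time value of a shorter commute.
# The 30- and 15-minute commutes and 250 workdays are assumed numbers.

def annual_commute_hours(one_way_minutes, workdays_per_year=250):
    """Total hours per year spent commuting, round trip."""
    return one_way_minutes * 2 * workdays_per_year / 60

long_commute = annual_commute_hours(30)   # 30-minute one-way commute
short_commute = annual_commute_hours(15)  # 15-minute one-way commute

saved = long_commute - short_commute
print(f"Hours per year reclaimed: {saved:.0f}")  # 125 hours
```

Halving a half-hour commute frees up about three standard work-weeks of time every year, with no change in pay.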

It’s all a matter of personal responsibility: you can make choices that expose yourself to congestion—and make congestion worse for everyone else—or you can make choices that give you more free time, a healthier lifestyle, and greater happiness.

If we devoted a fraction of the effort and expense that now goes to pseudo-scientific quantification of the supposed “costs of congestion,” we’d have plenty of money for an aggressive, no-holds-barred public relations campaign telling people to change their commuting behavior to reduce their exposure to traffic congestion.

What’s particularly ironic of course, is the extent to which truly global, social problems, ones which will only be solved (if they ever are) by large scale collective action—like climate change—have been readily re-packaged as a series of personal tips and tricks:  Ten things you can do to save the planet:  un-plug your computer when you’re not using it, switch to compact fluorescent or LED light bulbs, drive a bit slower, cut down on food waste, recycle, use a revolving door whenever possible, turn the water off when you’re brushing your teeth.

So when it comes to congestion, it’s time to roll out the kind of PR-only effort that we’ve long applied to safety. So start devising slogans, printing bumper stickers, renting billboards and commissioning PSAs: I’m sure that if we can just get everyone to pitch in, change their attitudes, take more responsibility, and behave better, traffic will improve.

And actually, we have some tangible real-world evidence that it works: Consider Seattle’s repeated experiences with Carmageddon. The city closed a downtown freeway that had carried about 90,000 cars per day. The closure was preceded by widespread media coverage and a campaign to encourage people to ride transit, bike and walk, and avoid the area at rush hour. The city feared Carmageddon again when it started tolling a tunnel built to replace the closed freeway. Both times, rather than being much worse than usual, traffic conditions were actually somewhat better–because some commuters made different choices.

And, maybe, just maybe, when it comes to talking about traffic congestion, state  transportation officials can adopt the same passive, fatalistic and blame-shifting rhetorical approach that they’ve so adeptly applied to traffic deaths. Congestion, like “accidents,” is just one of those things that happen; it’s too bad, but that’s just the way things are, you know.

A little more cavalier neglect of whining commuters, coupled with a public relations approach to tackling congestion, might actually free up some real money to save lives by making roads and streets safer. And a PR campaign won’t be any less effective in reducing congestion than widening roads.

 

Hagiometry: Fawning flatterers with an economic model

It’s no longer fashionable to get an unrealistically flattering portrait painted, but you can get an economist to do it with numbers.

You’ve no doubt heard the term “hagiography”: an unduly flattering biography or other written treatment designed to burnish the public image of some person. The term is derived from the Greek words for “holy” and “writing”; in this combination it means, essentially, “writing that makes something seem holy.” The dictionary definition is:

A very admiring book about someone, or a description of someone that represents the person as perfect or much better than they really are, or the activity of writing about someone in this way.

 In short, it’s an academic’s $20 word for what working journalists would call a puff piece.

Well, today, at City Observatory, we’re adding our own coinage to this realm: hagiometry. If hagiography is writing that flatters the ego, hagiometry is flattery with numbers. And it’s the stock in trade of a coterie of consultants who will, for a fee, tell you that your convention center, sports stadium, industry, tourist attraction or highway overpass is quantitatively all kinds of good for the local economy. While there’s little market for long-form fawning biographies (you can write your own autobiography and take care of that) and large-format portraiture has gone out of style, we’re increasingly a data-driven society, so it’s little surprise that the largest and most lucrative field for highly compensated ego-polishing involves the manipulation of statistics.

So what does hagiometry look like?  Well, for starters, if you’ve ever read an economic impact study, then chances are you’ve put your hands on real life hagiometry. We were reminded of just how obvious this is when we read about a recent report designed to show the economic impact of a spring training ballpark in Dunedin, Florida. Let’s turn the microphone over to Deadspin’s Kevin Draper, who wrote a story last week “Florida’s Go-To Stadium Economist is a Hack, A Shill, and also not an Economist.” Draper tells the story of Mark Bonn, a Florida State University professor of recreation who has a small cottage industry of writing economic impact statements for ballparks. He produced a report for the Toronto Blue Jays showing that their spring training site produced $70 million in annual economic impacts, as part of the team’s effort to convince state and local taxpayers to shell out $65 million in subsidies for the stadium. Bonn’s work was so egregiously overstated that even his client couldn’t countenance the exaggerations. The consultant fought to keep the inflated numbers and suggested his client suppress the methodology to avoid any embarrassing questions. (Bonn also produced a report for the New York Yankees showing that their spring training games generated nearly $10 million per game in economic impacts). Even more damning: the whole train of inflated numbers, questionable assumptions and bad math was unearthed by reporters from the local television station WTSP. (Not a medium noted for its interest in or ability to critique any data driven story).

While stadiums are obvious examples of this oeuvre, these charlatans ply their trade in all manner of fields: port impact studies, university impact studies, tourism impact studies, toll revenue forecasts. The concept of an economic multiplier is a godsend to this group; coupled with generous assumptions and a penchant for double- and triple-counting things, they can make anything appear of outsized importance. Archimedes famously said, “Give me a long enough lever and I can move the earth.” The economic impact study’s version: give me a strong enough assumption, and I can produce any benefit-cost ratio you like.

Hagiography is a time-honored practice, and has even produced great art. Take, for example, the 21-panel life of Marie de Medici by Peter Paul Rubens that hangs (appropriately) in the Medici Gallery at the Louvre. It depicts major events in her life, all larger than life and with the hand of God clearly shown at every step. The series begins with God smiling approvingly at her birth and ends with Marie ascending to His side in heaven:
Old School Hagiography: The Life of Marie de Medici

A field guide to hagiometry

Hagiometry has a number of obvious hallmarks. Most importantly, hagiometry is all about selling: the message is that one should either generously subsidize some activity or forbear from unduly burdening it with taxes and/or regulations. And it is almost always public entities with cash to spend (or who might levy taxes or rules) who are the intended audience for these glowing statistical portraits. More specifically, you know it’s hagiometry if you see these things:
  1. It’s used to sell, not to choose among alternatives or decide “how much” is the right amount to spend on a particular project.
  2. They never talk about opportunity costs or the negative multiplier associated with moving money from other activities.
  3. All activity is assumed to be additive, with no displacement.  Sporting events like the Super Bowl have been shown to “crowd out” other economic activity, so the net effect is smaller than the activity generated by the event itself.
  4. Added costs get counted as benefits:  if people drive more, they spend more on gasoline, and cars, and even though their total cost of living has risen, it all gets counted as a net gain.
  5. There’s little or no mention of who loses.  Economic impact studies focus exclusively on benefits, and almost never on costs. If these businesses get more money, then won’t consumers spend less at other businesses? That Cabelas might chalk up millions in sales, but it may suck the life out of other local retailers, a negative that’s almost never reflected in impact studies.
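To see how much these omissions matter, here’s a toy calculation of the gap between a study’s gross “impact” and the net effect once displacement and the diverted dollars’ lost multiplier are counted. Every number below is made up for illustration; none comes from any actual impact study:

```python
# Toy illustration (entirely hypothetical figures) of how ignoring
# displacement inflates an "economic impact" number.

def gross_impact(direct_spending, multiplier):
    # What the hagiometric study reports: every dollar counted as new.
    return direct_spending * multiplier

def net_impact(direct_spending, multiplier, displacement_share,
               diverted_multiplier):
    # Count only genuinely new spending, and subtract the activity the
    # diverted dollars would have generated at other local businesses.
    new_spending = direct_spending * (1 - displacement_share)
    diverted = direct_spending * displacement_share
    return new_spending * multiplier - diverted * diverted_multiplier

# Assumptions: $50M direct spending, a generous 2.0 multiplier, half of
# it displaced from other local businesses with their own 1.5 multiplier.
reported = gross_impact(50e6, 2.0)
actual = net_impact(50e6, 2.0, 0.5, 1.5)
print(f"Reported: ${reported / 1e6:.1f}M  Net: ${actual / 1e6:.1f}M")
```

Under these assumed numbers the headline figure overstates the net local effect roughly eightfold, and a slightly higher displacement share pushes the net effect negative.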

 Gresham’s Law & the market for economic impact studies

Just like fawning biographies and larger-than-life portraits, economic impact studies are invariably commissioned not by someone with a detached interest in the pursuit of truth (or beauty) but by someone with a pretty clear axe to grind. That creates a selection environment which is virtually guaranteed to produce exaggerated results. Artists who produce ugly, if truthful, portraits don’t get many commissions. The same principle applies to hagiometry: consultants who produce conservative estimates are much less likely to find favor than those who produce generous ones. The result is a kind of Gresham’s law at work in the industry: firms that produce generous estimates are in high demand and can earn high fees; those who are less glowing find they have less work, and either change their methods or find other lines of business to pursue. This unsurprising race to the bottom produces the kind of work Mark Bonn did for the Blue Jays.

Several years ago, I was part of a group opposing something called the “Columbia River Crossing,” a multi-billion dollar freeway widening project in Portland. The State DOT hired a company with a track record of repeatedly over-estimating traffic and toll revenue (they were in part responsible for the SH 130 toll road estimates in Austin, which subsequently went into default). For a moment, I was puzzled as to why our DOT would choose someone who routinely over-estimates revenue; but then it occurred to me that for the highway advocates, excessive optimism was not a bug, but an essential feature. They didn’t want someone to tell them it would be impractical or risky to build their project.
The best advice we can offer the public and policy makers in countering hagiometry is simply to be very skeptical of all such economic impact studies. Ordinarily, we might invoke the Latin admonition “caveat emptor,” or “let the buyer beware,” but in this case, the economic impact study is invariably bought and paid for by its flattered client. When presented with an economic impact study that expensively describes some organization or project’s accomplishments and importance, one might consider another Latin expression: “Timeo Danaos et dona ferentes,” which is generally translated to mean, “Beware of Greeks bearing gifts.”

Happy Earth Day, Oregon! Let’s Waste Billions Widening Freeways!

If you’re serious about dealing with climate change, the last thing you should do is spend billions widening freeways.

The Oregon Department of Transportation is hell-bent on widening freeways and destroying the planet

April 22 is Earth Day, and to celebrate, Oregon is moving forward with plans to pour billions of dollars into three Portland area freeway widening projects. It isn’t so much Earth Day as a three-weeks-late “April Fools’ Day.”

Yesterday’s New York Times story asked the question, “Can Portland be a climate leader without reducing driving?” As our colleague and long-time City Observatory contributor Daniel Hertz often observed, if a story’s headline is framed as a question, the answer is always “No.”  And this is no exception.

The New York Times, April 22, 2022

The Oregon Department of Transportation’s plans to squander billions of dollars widening area highways plainly undermine state, regional, and city commitments to reducing greenhouse gas emissions. Driving is the single largest source of climate pollution in Portland, and it has grown 20 percent, by more than a million tons per year, in the past five years.

Betraying Portland’s Legacy of Environmental Leadership

Nearly five decades after the city earned national recognition for tearing out a downtown freeway, it gets ready to build more. Back in the day, Portland built its environmental cred by tearing out one downtown freeway, and cancelling another–and then taking the money it saved to build the first leg of its light rail system. In place of pavement and pollution, it put up parks. Downtown Portland’s Willamette riverfront used to look like this:

Now the riverfront looks like this:

For decades, city and state political leaders have celebrated this legacy, proudly touting our environmental leadership based on these bold and far-sighted steps. It is bitterly ironic, and tragic, that half a century after proving that removing freeways promotes livability, economic vitality and thriving cities, Oregon is now embarking on an unprecedented expansion of highway capacity, at exactly the time the climate crisis has come plainly into view.

Oregon DOT:  Celebrating Earth Day 2022 by Destroying the Planet

The environmental legacy of freeway removal is not merely forgotten, it’s being actively demolished by a transportation department that is hell-bent on building wider highways and increasing traffic and greenhouse gas emissions. Between the $5 billion Interstate Bridge Replacement project, the $1.45 billion I-5 Rose Quarter Project, and a plan to rebuild and widen the I-205 Abernethy Bridge at $500 million, ODOT is embarked on a multi-billion dollar highway building spree. And that’s just the beginning, because these projects have almost invariably gone over budget, and more expansions (a wider I-205 on either side of the Abernethy Bridge, and plans to widen the I-5 Boone Bridge) will generate even more debt and traffic.

ODOT’s plans fly in the face of the state’s legally adopted requirement to reduce greenhouse gases. The state’s Greenhouse Gas Commission (of course, Oregon has one) reported that the state is way off track in achieving its statutorily mandated goal of reducing greenhouse gases by 10 percent from their 1990 levels by 2020. The commission’s finding:

Key Takeaway: Rising transportation emissions are driving increases in statewide emissions.

As the updated greenhouse gas inventory data clearly indicate, Oregon’s emissions had been declining or holding relatively steady through 2014 but recorded a non-trivial increase between 2014 and 2015. The majority of this increase (60%) was due to increased emissions from the transportation sector, specifically the use of gasoline and diesel. The reversal of the recent trend in emissions declines, both in the transportation sector and statewide, likely means that Oregon will not meet its 2020 emission reduction goal. More action is needed, particularly in the transportation sector, if the state is to meet our longer-term GHG reduction goals.

And the independent induced travel calculator, calibrated to the latest peer-reviewed scientific research on induced demand, shows that widening freeways will likely add tens of thousands of tons of greenhouse gas emissions. In Oregon, as in many states, transportation is now the largest source of greenhouse gas emissions, and cheaper gas is now prompting more travel. The decline in gasoline prices in mid-2014 prompted an increase in driving, and with it, an increase in crashes and carbon pollution. Oregon’s vehicle miles traveled, which had been declining steadily, ticked up in 2015, as did its fatality rate. Building more freeway capacity–which will trigger more traffic–flies in the face of the state’s stated and legislated commitment to reducing greenhouse gases.
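For readers who want to see the mechanics, induced travel calculators of this kind apply a simple elasticity: a given percentage increase in lane-miles yields roughly the same percentage increase in driving. Here’s a minimal sketch; the elasticity of about 1.0 reflects the induced demand literature, but the corridor VMT and per-mile emissions figures are illustrative assumptions, not ODOT data or output from any actual calculator:

```python
# Simplified sketch of the induced-travel arithmetic. The ~1.0
# elasticity follows the "Fundamental Law of Road Congestion" research;
# the traffic and emissions numbers below are assumed for illustration.

def induced_vmt(base_vmt, lanemile_pct_increase, elasticity=1.0):
    """Added annual vehicle miles traveled from a capacity expansion."""
    return base_vmt * (lanemile_pct_increase / 100) * elasticity

def added_co2_tons(annual_vmt, grams_per_mile=400):
    """Convert added VMT to metric tons of CO2 (~400 g/mile assumed)."""
    return annual_vmt * grams_per_mile / 1_000_000

# Assume a corridor carrying 500 million VMT per year gets 10 percent
# more lane-miles.
new_vmt = induced_vmt(500e6, 10)
print(f"Induced VMT/year: {new_vmt:.0f}")  # 50 million added miles
print(f"Added CO2/year: {added_co2_tons(new_vmt):,.0f} metric tons")
```

Even with these modest assumed inputs, the added driving works out to roughly 20,000 metric tons of CO2 per year, which is how “tens of thousands of tons” emerges from a single widening project.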

Building more capacity doesn’t solve congestion, it just increases traffic (and emissions)

The new word of the day is bottleneck: Supposedly, adding a lane or two in a few key locations will magically remedy traffic congestion. But the evidence is always that when you “fix” one bottleneck, the road simply gets jammed up at the next one. As the Frontier Group has chronicled, the nation is replete with examples of billion dollar boondoggle highways that have been sold on overstated traffic projections, and which have done little or nothing to reduce congestion.

As we all know, widening freeways to reduce traffic congestion has been a spectacular failure everywhere it has been tried. From the epic 23 lanes of the Katy Freeway to the billion-dollar Sepulveda Pass in Los Angeles, adding more capacity simply generates more traffic, which quickly produces the same or even longer delays. The case for what is called induced demand is now so well established that it’s now referred to as “The Fundamental Law of Road Congestion.” Each incremental expansion of freeway capacity produces a proportionate increase in traffic. And not only does more capacity induce more demand, it leads to more vehicle emissions–which is why claims that reducing vehicle idling in congestion will somehow lower carbon emissions are delusional rationalizations.

If you’re a highway engineer or a construction company, induced demand is the gift that keeps on giving: No matter how much we spend adding capacity to “reduce congestion,” we’ll always need to spend even more to cope with the added traffic that our last congestion-fighting project triggered. While that keeps engineers and highway builders happy, motorists and taxpayers should start getting wise to this scam.

The good news is that there’s some pushback from folks who think more freeways aren’t a solution to anything. But a lot of the energy seems to be directed to a “me too” package of investments in token improvements to biking and walking infrastructure. As Strong Towns’ Chuck Marohn warns, that’s a dead end for communities, the environment and a sensible transportation system; while he’s writing about Minnesota, the same case applies in Oregon:

Oh, they’ll pander to you. They’ll promise you all kinds of things….fancy new trains (to park and rides), bike trails (in the ditch, but not safe streets)….but this system isn’t representing you at all. It’s on autopilot. It’s got a long line of Rice interchanges and St. Croix bridge projects just ready to go when you give them the money. Don’t do it.

And as a final word, for those of you hoping to fund transit, pedestrian and cycling improvements out of increased state and federal dollars, I offer two observations. First, you are advocating for high-return investments in a financing system that does not currently value return-on-investment. You are going to finish way behind on every race, at least until we no longer have the funds to even run a race. Stop selling out for a drop in the bucket and start demanding high ROI spending.

Second, the cost of getting anything you want is going to be expansive funding to prop up the systems that hurt the viability of transit, biking and walking improvements. Every dollar you get is going to be bought with dozens of dollars for suburban commuters, their parking lots and drive throughs and their mindset continuing to oppose your efforts at every turn. You win more by defunding them than by eating their table scraps.

So when it comes to 21st Century transportation and Earth Day, maybe we should start with an environmental variation on the Hippocratic Oath: “First, do no harm.” Portland was smart enough to stop building freeways half a century ago, when environmentalism was in its infancy, and the prospects of climate change were not nearly so evident. Why aren’t we smart enough to do the same today?

 

 

Happy Earth Day, Oregon! Let’s Widen Some Freeways!

If you’re serious about dealing with climate change, the last thing you should do is spend billions widening freeways.

April 22 is Earth Day, and to celebrate, Oregon is moving forward with plans to drop more than a billion dollars into three Portland area freeway widening projects. It isn’t so much Earth Day as a three-weeks late “April Fools Day.”

Four decades after the city earned national recognition for tearing out a downtown freeway, it gets ready to build more. Back in the day, Portland built its environmental cred by tearing out one downtown freeway, and cancelling another–and then taking the money it saved to build the first leg of its light rail system. In place of pavement and pollution, it put up parks. Downtown Portland’s Willamette riverfront used to look like this:

Now the riverfront looks like this:

 

As part of a transportation package enacted by the 2017 Oregon Legislature, higher gas taxes and vehicle registration fees will be used partly to shore up the state’s multi-billion dollar maintenance backlog, but prominently to build three big freeway widening projects in the Portland metropolitan area. One project would spend $450 million to add lanes to Interstate 5 near downtown Portland; two others would widen freeways in the area’s principal suburbs. The estimated cost of the projects would be around a billion dollars, but when it comes to large projects, the Oregon Department of Transportation is notorious for grossly underestimating costs. Its largest recent project, widening US Highway 20 between Corvallis and Newport, was supposed to cost a little over $100 million but has ended up costing almost $400 million.

The plan flies in the face of the state’s legally adopted requirement to reduce greenhouse gases. Just last year, the state’s Greenhouse Gas Commission (of course, Oregon has one) reported that the state is way off track in achieving its statutorily mandated goal of reducing greenhouse gases by 10 percent from their 1990 levels by 2020. The commission’s finding:

Key Takeaway: Rising transportation emissions are driving increases in statewide emissions.

As the updated greenhouse gas inventory data clearly indicate, Oregon’s emissions had been declining or holding relatively steady through 2014 but recorded a non-trivial increase between 2014 and 2015. The majority of this increase (60%) was due to increased emissions from the transportation sector, specifically the use of gasoline and diesel. The reversal of the recent trend in emissions declines, both in the transportation sector and statewide, likely means that Oregon will not meet its 2020 emission reduction goal. More action is needed, particularly in the transportation sector, if the state is to meet our longer-term GHG reduction goals.

In Oregon, as in many states, transportation is now the largest source of greenhouse gas emissions, and cheaper gas is now prompting more travel. The decline in gasoline prices in mid-2014 prompted an increase in driving, and with it, an increase in crashes and carbon pollution. Oregon’s vehicle miles traveled, which had been declining steadily, ticked up in 2015, as did its fatality rate. Building more freeway capacity–which will trigger more traffic–flies in the face of the state’s stated and legislated commitment to reducing greenhouse gases.

Building more capacity doesn’t solve congestion, it just increases traffic (and emissions)

The new word of the day is bottleneck: Supposedly, adding a lane or two in a few key locations will magically remedy traffic congestion. But the evidence is always that when you “fix” one bottleneck, the road simply gets jammed up at the next one. As the Frontier Group has chronicled, the nation is replete with examples of billion dollar boondoggle highways that have been sold on overstated traffic projections, and which have done little or nothing to reduce congestion.

As we all know, widening freeways to reduce traffic congestion has been a spectacular failure everywhere it’s been tried. From the epic 23 lanes of the Katy Freeway to the billion-dollar Sepulveda Pass in Los Angeles, adding more capacity simply generates more traffic, which quickly produces the same or even longer delays. The case for what is called induced demand is now so well established that it’s now referred to as “The Fundamental Law of Road Congestion.” Each incremental expansion of freeway capacity produces a proportionate increase in traffic. And not only does more capacity induce more demand, it leads to more vehicle emissions–which is why claims that reducing vehicle idling in congestion will somehow lower carbon emissions are delusional rationalizations.

If you’re a highway engineer or a construction company, induced demand is the gift that keeps on giving: No matter how much we spend adding capacity to “reduce congestion,” we’ll always need to spend even more to cope with the added traffic that our last congestion-fighting project triggered. While that keeps engineers and highway builders happy, motorists and taxpayers should start getting wise to this scam.

A Faustian bargain for transit and active transportation

Portland’s freeway widening proposal was part of a convoluted political bargain to justify spending for a proposed light rail line. Supposedly, voters won’t approve funding for new transit expansion in Portland unless it’s somehow “bundled” with funding for freeway expansion projects. That flies in the face of the experience of other progressive metropolitan areas, including Denver, Los Angeles and San Francisco. Urban transit finance measures have a remarkable 71 percent record of ballot box success. Just last November, voters in Seattle approved a $54 billion (that’s with a “B”) Sound Transit 3 tax measure to fund a massive, region-wide expansion of bus and rail transit. To argue that you can’t get public support to fund transit without also subsidizing freeways is an argument that’s at least 50 years out of date.

The good news is that there’s some pushback from folks who think more freeways aren’t a solution to anything. But a lot of the energy seems to be directed to a “me too” package of investments in token improvements to biking and walking infrastructure. As Strong Towns’ Chuck Marohn warns, that’s a dead end for communities, the environment and a sensible transportation system; while he’s writing about Minnesota, the same case applies in Oregon:

Oh, they’ll pander to you. They’ll promise you all kinds of things….fancy new trains (to park and rides), bike trails (in the ditch, but not safe streets)….but this system isn’t representing you at all. It’s on autopilot. It’s got a long line of Rice interchanges and St. Croix bridge projects just ready to go when you give them the money. Don’t do it.

And as a final word, for those of you hoping to fund transit, pedestrian and cycling improvements out of increased state and federal dollars, I offer two observations. First, you are advocating for high-return investments in a financing system that does not currently value return-on-investment. You are going to finish way behind on every race, at least until we no longer have the funds to even run a race. Stop selling out for a drop in the bucket and start demanding high ROI spending.

Second, the cost of getting anything you want is going to be expansive funding to prop up the systems that hurt the viability of transit, biking and walking improvements. Every dollar you get is going to be bought with dozens of dollars for suburban commuters, their parking lots and drive throughs and their mindset continuing to oppose your efforts at every turn. You win more by defunding them than by eating their table scraps.

So when it comes to 21st Century transportation and Earth Day, maybe we should start with an environmental variation on the Hippocratic Oath:  “First, do no harm.” We were smart enough to stop building freeways when environmentalism was in its infancy, and the prospects of climate change were not nearly so evident. Why aren’t we smart enough to do the same today?

 

 

Happy Earth Day, Oregon! Let’s Widen Some Freeways!

Four decades after the city earned national recognition for tearing out a downtown freeway, it gets ready to build more

April 22 is Earth Day, and to celebrate, Oregon’s Legislature is on the verge of considering a transportation package that would drop more than a billion dollars into three Portland area freeway widening projects.

Back in the day, Portland built its environmental cred by tearing out one downtown freeway, and cancelling another–and then taking the money it saved to build the first leg of its light rail system. In place of pavement and pollution, it put up parks. Downtown Portland’s Willamette riverfront used to look like this:

Now the riverfront looks like this:

 

But as part of a transportation package being developed by the Oregon Legislature, the plan is to raise gas taxes and other fees, partly to shore up the state’s multi-billion dollar maintenance backlog, but prominently to build three big freeway widening projects in the Portland metropolitan area. One project would spend $400 million to add lanes to Interstate 5 near downtown Portland; two others would widen freeways in the area’s principal suburbs. The estimated cost of the projects would be around a billion dollars, but when it comes to large projects, the Oregon Department of Transportation is notorious for grossly underestimating costs. Its largest recent project, widening US Highway 20 between Corvallis and Newport, was supposed to cost a little over $100 million but has ended up costing almost $400 million.

The proposal is advancing at a doubly ironic time: Just two months ago, the state’s Greenhouse Gas Commission (of course, Oregon has one) reported that the state is way off track in achieving its statutorily mandated goal of reducing greenhouse gases by 10 percent from their 1990 levels by 2020.  The commission’s finding:

Key Takeaway: Rising transportation emissions are driving increases in statewide emissions.

As the updated greenhouse gas inventory data clearly indicate, Oregon’s emissions had been declining or holding relatively steady through 2014 but recorded a non-trivial increase between 2014 and 2015. The majority of this increase (60%) was due to increased emissions from the transportation sector, specifically the use of gasoline and diesel. The reversal of the recent trend in emissions declines, both in the transportation sector and statewide, likely means that Oregon will not meet its 2020 emission reduction goal. More action is needed, particularly in the transportation sector, if the state is to meet our longer-term GHG reduction goals.

Transportation is now the largest source of greenhouse gas emissions, and cheaper gas is now prompting more travel. The decline in gasoline prices in mid-2014 prompted an increase in driving and, with it, an increase in crashes and carbon pollution.  Oregon’s vehicle miles traveled, which had been declining steadily, ticked up in 2015, as did its fatality rate. Building more freeway capacity–which will trigger more traffic–flies in the face of the state’s stated and legislated commitment to reducing greenhouse gases.

The second irony is that the state is proposing a gas tax increase that would go to new road capacity at exactly the same time it doesn’t have enough money to pay for other public services.  Despite a robust economy, the state faces a serious revenue shortfall.  General fund revenues for the coming 2017-19 biennium are an estimated $1.6 billion short of the amount needed to fund the current level of public services. So unless new tax revenues are found, the state will likely have to cut back on schools and medical care for the poor (the two biggest categories of public expenditure). Oregon is likely to celebrate the end of the current legislative session by raising taxes to spend more money subsidizing driving at the same time it is cutting back on education (which is the one thing that’s been reliably shown to improve a state’s economy).

Building more capacity doesn’t solve congestion, it just increases traffic (and emissions)

The new word of the day is bottleneck: Supposedly, adding a lane or two in a few key locations will magically remedy traffic congestion. But the evidence is always that when you “fix” one bottleneck, the road simply gets jammed up at the next one. As the Frontier Group has chronicled, the nation is replete with examples of billion dollar boondoggle highways that have been sold on overstated traffic projections, and which have done little or nothing to reduce congestion.

As we all know, widening freeways to reduce traffic congestion has been a spectacular failure everywhere it’s been tried. From the epic 23 lanes of the Katy Freeway, to the billion dollar Sepulveda Pass in Los Angeles, adding more capacity simply generates more traffic, which quickly produces the same or even longer delays. The case for what is called induced demand is now so well established that it’s referred to as “The Fundamental Law of Road Congestion”: each incremental expansion of freeway capacity produces a proportionate increase in traffic. And not only does more capacity induce more demand, it leads to more vehicle emissions–which is why the claim that reducing vehicle idling in congestion will somehow lower carbon emissions is a delusional rationalization.

If you’re a highway engineer or a construction company, induced demand is the gift that keeps on giving: No matter how much we spend adding capacity to “reduce congestion,” we’ll always need to spend even more to cope with the added traffic that our last congestion-fighting project triggered. While that keeps engineers and highway builders happy, motorists and taxpayers should start getting wise to this scam.

A Faustian bargain for transit and active transportation

Portland’s freeway widening proposal is part of a convoluted political bargain to justify spending for a proposed light rail line. Supposedly, voters won’t approve funding for new transit expansion in Portland unless it’s somehow “bundled” with funding for freeway expansion projects. That flies in the face of the experience of other progressive metropolitan areas, including Denver, Los Angeles and San Francisco.  Urban transit finance measures have a remarkable 71 percent record of ballot box success. Just last November, voters in Seattle approved a $54 billion (that’s with a “B”) Sound Transit 3 tax measure to fund a massive, region-wide expansion of bus and rail transit. To argue that you can’t get public support to fund transit without also subsidizing freeways is an argument that’s at least 50 years out of date.

The good news is that there’s some pushback from folks who recognize that more freeways aren’t a solution to anything. But a lot of the energy seems to be directed to a “me too” package of investments in token improvements to biking and walking infrastructure. As Strong Towns’ Chuck Marohn warns, that’s a dead end for communities, the environment and a sensible transportation system; while he’s writing about Minnesota, the same case applies in Oregon:

Oh, they’ll pander to you. They’ll promise you all kinds of things….fancy new trains (to park and rides), bike trails (in the ditch, but not safe streets)….but this system isn’t representing you at all. It’s on autopilot. It’s got a long line of Rice interchanges and St. Croix bridge projects just ready to go when you give them the money. Don’t do it.

And as a final word, for those of you hoping to fund transit, pedestrian and cycling improvements out of increased state and federal dollars, I offer two observations. First, you are advocating for high-return investments in a financing system that does not currently value return-on-investment. You are going to finish way behind on every race, at least until we no longer have the funds to even run a race. Stop selling out for a drop in the bucket and start demanding high ROI spending.

Second, the cost of getting anything you want is going to be expansive funding to prop up the systems that hurt the viability of transit, biking and walking improvements. Every dollar you get is going to be bought with dozens of dollars for suburban commuters, their parking lots and drive throughs and their mindset continuing to oppose your efforts at every turn. You win more by defunding them than by eating their table scraps.

So when it comes to 21st Century transportation and Earth Day, maybe we should start with an environmental variation on the Hippocratic Oath:  “First, do no harm.” We were smart enough to stop building freeways when environmentalism was in its infancy, and the prospects of climate change were not nearly so evident. Why aren’t we smart enough to do the same today?

This post was revised to correctly reflect the date of Earth Day 2017.

 

Volunteering as a measure of social capital

Volunteering is one of the hallmarks of community; here are the cities with the highest rates of volunteerism

The decline of the civic commons, the extent to which Americans engage with one another in the public realm, especially across class lines, has been much remarked upon. Our report, Less in Common, explores the many dimensions along which the fabric of our connections to one another has become increasingly strained over several decades: we are less likely to socialize with neighbors, we travel in isolation, increasingly we recreate in private, rather than public, space, and as a result, the strength of a shared public realm has deteriorated.

In his book Bowling Alone, Robert Putnam popularized the term “social capital.” Putnam also developed a clever series of statistics for measuring social capital. He looked at survey data about interpersonal trust (can most people be trusted?) as well as behavioral data (do people regularly visit neighbors, attend public meetings, belong to civic organizations?). Putnam’s measures try to capture the extent to which social interaction is underpinned by widely shared norms of openness and reciprocity.

As economist Brad DeLong explains,

. . . at some deep level human sociability is built on gift-exchange—I give you this, you give me that, and rough balance is achieved, but in some sense we both still owe each other and still are under some kind of mutual obligation to do things to further repay each other.

This sense of mutual obligation is important both to society and to the effective functioning of markets. When we live in communities, places where most people have a strong sense of mutual obligation to look out for and take care of one another, social problems are lessened and economies run more smoothly.

It’s difficult to come up with a single, clear-cut indicator of social capital, so we and other researchers have ended up relying on a patchwork of different measures to judge the degree to which different cities exhibit high or low levels of civic interaction.

One of the most fundamental of these measures is volunteering. It’s long been a staple of American lore–since de Tocqueville–that we regularly engage in non-remunerated community activities.

Our data come from the Corporation for National and Community Service. It works with the Census Bureau to conduct a nationally representative survey exploring the degree to which Americans engage in a range of volunteer activities.

Across the nation’s largest metropolitan areas, about 27 percent of adults reported having volunteered in their local community the past year. The volunteering ethic is strongest in Salt Lake City and the Twin Cities of Minneapolis and St. Paul, where more than a third of adults volunteer.  Conversely, volunteering is much lower than the national average in cities such as Miami, New Orleans, New York and Las Vegas.

This measure stands in stark contrast to our measure of “anti-social capital”: the number of security guards per capita in each metropolitan area, which we wrote about earlier this year. Not surprisingly, cities that rank high in our measure of anti-social capital (Miami, New Orleans, and Las Vegas) are all in the “top five” for security guards per capita and in the bottom five for volunteering.  Conversely, the cities with the fewest security guards per capita (Minneapolis, Portland, Grand Rapids and Rochester) are all in the top ten for volunteering.

While any ranking always implies that there are winners and losers, we interpret the variation we see here a bit more optimistically. These data imply that what happens in a metropolitan area can affect its degree of social capital. Fixing this problem from the top down may seem daunting, but improving social capital from the bottom up is something that can be done at the community level. No matter where you live, we’re sure there are opportunities for you to volunteer to help make your city a better place.

Key to prosperity: Talent in the “traded sector” of the economy

“Traded sector” businesses that employ well-educated workers mark a prosperous region

At City Observatory, we regularly stress the importance of education and skills to regional economic success. Statistically, we can explain almost two-thirds of the variation in per capita income among large metropolitan areas just by looking at the educational attainment of the population.
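
That “two-thirds of the variation” figure is the R-squared from a simple one-variable regression of metro per capita income on college attainment. As a rough sketch of the calculation–using invented illustrative numbers, not the actual metro data–it amounts to this:

```python
import numpy as np

# Hypothetical data: college attainment share and per capita income for a
# handful of metros (illustrative values, not the actual ACS figures).
ba_share = np.array([0.25, 0.30, 0.35, 0.40, 0.48, 0.52])
per_capita_income = np.array([38_000, 41_000, 45_000, 50_000, 58_000, 62_000])

# Fit a one-variable linear regression: income ~ attainment.
slope, intercept = np.polyfit(ba_share, per_capita_income, 1)
predicted = slope * ba_share + intercept

# R-squared: the share of income variation explained by attainment alone.
ss_res = np.sum((per_capita_income - predicted) ** 2)
ss_tot = np.sum((per_capita_income - per_capita_income.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"R-squared: {r_squared:.2f}")
```

On the real cross-metro data, the analogous R-squared is what yields the “almost two-thirds” claim.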

The strong relationship between education and income is a basic fact, but there’s a lot of nuance. A more sophisticated way of understanding the role of talent is to distinguish between what economists call the “traded” and “local” sectors of the economy. Traded sector businesses are those that sell their output in competition with businesses in other states or nations. Local businesses, as the name implies, sell their goods and services primarily or exclusively in a local market. By definition, local businesses tend to be sheltered from competition from other places.

Many well-educated workers are employed in the local sector of the economy. Teachers, health care workers and government employees are all more likely to have a college degree than the typical worker. There tends to be little variation from place to place in the education and skill levels of workers in these industries. Instead, the big variations among metropolitan areas in the skill level of workers are in traded sector businesses.

A good measure of the knowledge intensity of a region’s traded sector is the share of workers in the traded sector that have a four-year college degree or higher education. Below, we’ve used data from the American Community Survey to compute the fraction of traded sector workers in each metro area with a four-year degree. Slightly fewer than 30 percent of all traded sector workers in large metro areas had a four-year degree. At the top of our rankings are some of the nation’s most recognizable knowledge centers–San Jose, San Francisco, Washington and Boston, all of which have college attainment rates of more than 40 percent in their traded sector industries.

 

A low level of educational attainment in the traded sector is a good indicator that a region’s exporting industries aren’t based on strong knowledge assets. Two of the lowest ranking cities (Riverside and Las Vegas) disproportionately specialize in industries that are not knowledge intensive: distribution and tourism, respectively.

This measure excludes college-educated workers employed in health care, education and government, which are not considered traded sectors. Data on traded sector employment by educational attainment are taken from the American Community Survey and tabulated from the Integrated Public Use Micro-Sample (IPUMS).
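
The tabulation itself is straightforward: filter person records to traded-sector workers, then compute the share with a degree by metro. A stylized sketch–with invented records standing in for the actual IPUMS microdata and its variable names–looks like this:

```python
# Toy stand-in for IPUMS-style person records: (metro, in_traded_sector, has_ba).
# The values are illustrative, not actual ACS microdata.
workers = [
    ("San Jose", True, True), ("San Jose", True, False), ("San Jose", False, True),
    ("Las Vegas", True, False), ("Las Vegas", True, False), ("Las Vegas", False, True),
]

# Tally traded-sector workers and degree holders by metro.
totals, with_ba = {}, {}
for metro, traded, ba in workers:
    if not traded:
        continue  # local-sector workers (health care, education, government) are excluded
    totals[metro] = totals.get(metro, 0) + 1
    with_ba[metro] = with_ba.get(metro, 0) + (1 if ba else 0)

# Share of traded-sector workers with a four-year degree, by metro.
ba_share = {metro: with_ba[metro] / totals[metro] for metro in totals}
print(ba_share)
```

A real tabulation would also apply the survey’s person weights; this sketch omits them for clarity.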

The pernicious myth of “naturally occurring” affordable housing

Housing doesn’t “occur naturally”

Using zoning to preserve older, smaller homes doesn’t protect affordability

There’s no such thing as “Naturally Occurring Affordable Housing”–older, smaller homes become affordable only if supply and demand are in balance, usually because it’s relatively easy to build more housing.

The parable of the ranch home shows that old, small homes don’t “naturally” become affordable:  they’re utterly unaffordable in markets like Silicon Valley that constrict housing supply.

“Preserving” older, smaller homes will have exactly the opposite of the intended effect, restricting supply, driving up prices and increasing displacement.

 

There’s a catchy but fundamentally flawed idea floating around housing affordability discussions:  “naturally occurring affordable housing.”

To many housing policy practitioners, “affordable housing” is a creation of government policy: it’s either public housing (built, paid for, and operated by local authorities), or subsidized affordable housing (typically built by non-profits with government subsidies and tax breaks). By extension, anything else that turns out to have low rents or inexpensive prices (and wasn’t built with these subsidies) is somehow “naturally occurring.” It even has a cute acronym: “NOAH.”

A new report from the American Planning Association claims that restrictive zoning can be a way to hold down housing costs by “preserving naturally occurring affordable housing.”  The report concedes that up-zoning and expanding supply might help with affordability, but then simply ignores this point, asserting that added supply doesn’t result in more homes affordable for low income households, and argues that public policy should protect older, smaller homes from being redeveloped, especially to add many more units.

What this misses is that the cumulative effect of these ostensibly well-intended efforts to “preserve” older, smaller homes is to drive up the price of those homes by making them scarcer, and to raise the overall level of prices and rents in a market.  There’s a profound myopia to an approach to planning that treats the affordability of a structure as an intrinsic characteristic of the building, rather than a product of the overall balance of supply and demand in a city or region.  Affordability is a system condition, not something that inheres in a particular building.  Ignoring the system-level effects of supply restrictions makes housing availability and affordability problems worse: the cumulative effect of limits on constructing new homes is to make all homes more expensive, and measures like those advocated in this report drive up rents for everyone.  Indeed, measures to restrict new home construction assure that older, smaller homes will not become less expensive over time.  That’s the big lie behind the term “naturally occurring”–older, smaller homes are cheaper in some cities because they are abundant relative to demand, and typically because those places make it easier to build new housing.

Whether a 1950s ranch home is “affordable” today or not has nothing to do with its age or size, but rather with the supply of and demand for housing in the market in which it is located.  The reason 1,000 square foot, 1950s ranch houses in Silicon Valley sell for over $1 million, and never became “naturally occurring affordable housing” the way they did in other cities, has everything to do with the zoning and other restrictions that “preserved” these homes and prevented enough new housing from getting built.

As a result, the idea of “preserving naturally occurring affordable housing” has a kind of “leeches and bleeding” quality as a policy prescription.  Precluding redevelopment of smaller, older homes actually makes the housing affordability problem worse by blocking new supply that would help drive down prices.  Instead of fetishizing smaller, older homes, planners should step back and look at overall supply and demand in the market.

The world according to NOAH

The idea of “naturally occurring affordable housing” has been kicking around in planning circles for a few years. There’s a 2016 report from Co-Star (the real estate advisory firm), issued in collaboration with the Urban Land Institute and the National Association of Affordable Housing Lenders, which inventories the number of such naturally occurring units in each of the nation’s large metropolitan areas (they count about 5.6 million).

Even the conservative think tank the Manhattan Institute has employed the term, arguing that Mayor Bill de Blasio’s affordable housing program is unnecessary because the city has a reservoir of “naturally occurring affordable housing” units that are currently available and require no additional government investment.

The term is a relatively recent coinage. Google shows no instances of the phrase “naturally occurring affordable housing” appearing prior to 2007. There were fewer than 10 websites using that term in the years up through 2013, and in 2014, “NOAH” began to take off. There were 66 occurrences in 2014, 39 in 2015 and 125 in 2016. While it’s a popular, and to some seductive, term, it’s fundamentally misleading.

Reality:  There’s no such thing as naturally occurring affordable housing

 

A cave: Actual “Naturally Occurring Affordable Housing” (Flickr: Adifferentbrian)

Basalt, glaciers, arable land and virgin forests are all naturally occurring. So are clouds, insects, and mountains.

While there’s nothing wrong with affordable housing that doesn’t currently rely on direct government subsidies, there’s something profoundly misleading with the term “naturally occurring.”

There’s nothing “natural” about it. Housing markets and the process of investment, decline, and filtering, are all profoundly influenced by a range of policies, from the federal government’s subsidies to housing and highways, to local land use decisions. The process of investment and neighborhood change that results in used housing is “anything but natural” as the University of California’s Karen Chapple and her colleagues put it in a recent report to the California Air Resources Board:

The story of neighborhood decline in the United States is oft-told. While early researchers naturalized processes of neighborhood transition and decline, the drivers of decline are anything but natural and stem from a confluence of factors including: federal policy and investments, changes in the economy, demographic and migration shifts, and discriminatory actions.

(Ironically, that doesn’t stop the authors from also using the term “naturally occurring affordable housing” four times in their report, juxtaposing that with Section 8 vouchers and deed-restricted affordable housing units.)

“Naturally occurring” conjures up visions of mineral deposits, or mountain ranges, or a benign climate.  But the existence–or non-existence–of affordable, privately owned housing has everything to do with a wide range of conscious public policy choices that simply don’t belong in the category of natural occurrence.  The danger with this term is that it implies that there’s really nothing that we can or should do to promote market housing–it’s naturally occurring, right? It’s either going to be there or it isn’t, so there’s really no point trying to influence it.  And if your community doesn’t have enough, well, then there’s really not much you can do about it.

What that misses, of course, is that public policies, especially local zoning requirements, building codes, parking requirements, development fees and the like have everything to do with whether the private market provides housing that is affordable. When we ban apartments from wide swaths of most communities, when we outlaw affordable micro-units in desirable neighborhoods, when we subject development to a wide range of discretionary and arbitrary approval processes, and when we impose enormous costs in the form of parking requirements, we’re making sure that the private market doesn’t produce housing that is affordable.

And as we’ve noted, the whole process of filtering, where, as housing ages it becomes more and more affordable, is contingent on allowing an ample supply of new housing, even when those new units are themselves more expensive than low and moderate income households can afford. The only reason that some communities have plenty of what gets called “naturally affordable” housing is that they made it relatively easy to build new housing and that in turn led housing to filter downward and become more affordable.

In contrast, many communities–we’re looking at you, San Francisco–made it really difficult to build any new housing, with the result that a lot of older units which would have filtered down market as they aged, became the only feasible housing alternative, and consequently their prices got bid up so that they didn’t become more affordable as they got older.

So by all means, let’s talk about the role of markets and privately owned housing in addressing affordability; but let’s not use a term that intrinsically isolates this from the policy arena.

Has Portland’s rent fever broken?

More evidence that supply and demand are at work in housing markets

In early 2016, Portland experienced some of the highest levels of rent inflation of any market in the US.  According to Zillow’s rental price estimates, rents were rising between 15 and 20 percent year over year in late 2015 and early 2016. Portland was attracting lots of new residents, and its housing supply was still in the process of rebounding from the effects of the Great Recession. In response to the big uptick in rents, the city enacted one of the nation’s most stringent inclusionary zoning requirements, which will force developers of new apartment buildings permitted after February 2017 to set aside as much as 20 percent of new units for low and moderate income households.

But in the past year, there are growing signs that the surge in rental inflation has peaked.  According to Zillow’s estimates, the average price of a two-bedroom apartment in the Portland area ($1,495) is almost exactly the same as it was a year ago.  Rental price inflation has dropped from nearly 20 percent a year ago to effectively zero in the first few months of 2017.

 

It’s not easy to accurately estimate the current level of apartment rents. Different sources have their own strengths and limitations, and some services, which simply skim the current crop of apartment listings with little adjustment for quality or time on market, produce erratic and unreliable results, as we’ve explained. So to triangulate our view of the Portland market, we turn to a couple of other sources of information.

Zillow’s estimates of current rent price inflation are largely confirmed by independent figures from ApartmentList.com.  ApartmentList.com uses a variant of the “repeat sales” technique employed by the Case-Shiller home price index. By comparing the rental price for the same apartment at different periods of time, one can generate estimates of price inflation that are unaffected by shifts in the composition of apartments available for rent at different times (a problem that plagues more simple-minded indices).  ApartmentList.com’s estimates suggest that Portland’s rents have fallen 0.9 percent in the past twelve months.
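The repeat-observation idea is simple enough to sketch in a few lines. This is a minimal, hypothetical illustration (the listing data, unit IDs, and the plain geometric mean of rent ratios are all invented for the example; ApartmentList.com’s actual methodology is more elaborate):

```python
from collections import defaultdict
from math import exp, log

# Hypothetical listings: (unit_id, period, monthly_rent).
# Only units observed in BOTH periods enter the index, so changes in
# which units happen to be listed can't distort the estimate.
listings = [
    ("a", 0, 1000), ("a", 1, 1010),
    ("b", 0, 1500), ("b", 1, 1515),
    ("c", 1, 2500),               # new luxury unit: excluded
]

def repeat_rent_index(listings, base=0, current=1):
    rents = defaultdict(dict)
    for unit, period, rent in listings:
        rents[unit][period] = rent
    # Per-unit rent ratios for units seen in both periods.
    ratios = [r[current] / r[base]
              for r in rents.values()
              if base in r and current in r]
    # Geometric mean of the ratios is the index level.
    return exp(sum(map(log, ratios)) / len(ratios))

print(round((repeat_rent_index(listings) - 1) * 100, 1))  # 1.0 (percent)
```

Note how the pricey new unit “c” never touches the index: a simple average of listed rents would have jumped sharply, while the repeat-observation index correctly reports 1 percent inflation.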

In its national rankings, ApartmentList.com now ranks Portland one of seven markets with declining rents over the past year, a group that includes Houston, San Francisco, and Miami. The following chart shows 12 month rent inflation estimates for each of the nation’s 52 largest metropolitan areas.

 

Another bellwether of the multi-family marketplace, asking prices for apartments for sale, seems to have peaked and declined slightly in the past few months.  LoopNet, which tracks real estate transactions, reports that asking prices for apartments have declined about 4 percent over that period. These data suggest that investors, who bid up the typical price of an apartment to almost $170,000 in 2016, have backed off a bit from that level. (The amount that investors are willing to pay for apartments tracks closely with expectations about future rent levels.)

The outlook:  Higher incomes, more housing supply

Going forward, there are a couple of reasons to expect that Portland’s housing affordability problems will moderate. First, the improving regional economy is starting to drive up incomes. State economist Josh Lehner has an analysis showing that renter incomes are rising, and predicts that rental affordability (as measured by the share of income that people spend on rent) will stop getting worse.

More importantly, there’s a big surge in apartment supply in Portland. One private firm that tracks the construction industry locally–the Barry Apartment Report–estimates that there are roughly 25,000 apartments in the construction pipeline in Portland.

More of this will drive down rents

The big uncertainty in the years ahead will be whether the policies the city has enacted (inclusionary zoning) and others that it is considering (rent control) will choke off future investment in local housing supply.  Once the current inventory of permitted housing is built (and much of it was permitted just prior to the new inclusionary requirements taking effect), it’s far from clear that developers will find Portland as attractive a marketplace for new apartments as it has been in the past few years.

New York City isn’t hollowing out; It’s growing

You can’t leave out births and deaths when you examine population trends

The release of the latest census population estimates has produced a number of quick takes that say that cities are declining. The latest is Derek Thompson, writing at The Atlantic and bemoaning the net domestic migration out of the New York metro area, and taking that as a sure sign that the city is hollowing out.

And there is a true factoid here: more people who were living in New York City in 2010 had moved elsewhere in the US by 2016 than people living elsewhere in the US in 2010 had moved into the city.

But “net domestic migration” is only one part of the population change puzzle.

New York: Welcoming immigrants and witnessing births.

A second component, which Thompson acknowledges, is international immigration.  New York is still the first port of entry for more people moving to the United States than any other city. (My grandmother immigrated there a century ago; there’s a good chance several of your relatives did, too). New York’s population growth has always been fueled by foreign immigrants.  And foreign immigration has almost perfectly offset domestic out-migration from the city.  Since 2010, net domestic out-migration was -425,000 and foreign immigration was +400,000, so the net loss from both forms of migration combined was -25,000, or about 0.3 percent of New York City’s population, over a six-year period.

If that were all there were to the story, then Thompson might have a point (although a very small one).  But there’s a third piece of the population puzzle that he’s omitted altogether: natural increase, the difference between births and deaths. Though you might not think so given all the “family hostile” rhetoric in Thompson’s article–”Maybe families want to live in denser areas but are being priced out, moving to the suburbs, and buying larger vehicles rather than a small car that can be parallel-parked on a crowded city block”–New York City’s population skews a lot younger than the US as a whole.  The result: the city produces a lot more babies than funerals.  In fact, the five boroughs are positively fecund by comparison to the rest of New York State.  Over the past six years, the natural increase in the population of New York City (births minus deaths) was plus 400,000–more than 16 times larger than the combined net loss from domestic out-migration and foreign in-migration.
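Put the three components side by side and the arithmetic is straightforward (the figures are the approximate totals cited above):

```python
# Components of New York City population change, 2010-2016
# (approximate figures cited in the text).
net_domestic_migration  = -425_000  # residents moving elsewhere in the US
net_foreign_immigration = +400_000  # new arrivals from abroad
natural_increase        = +400_000  # births minus deaths

net_migration = net_domestic_migration + net_foreign_immigration
total_change  = net_migration + natural_increase

print(net_migration)  # -25000: migration alone is a small net loss
print(total_change)   # 375000: add natural increase, and the city grows
```

Looking only at the first line, as Thompson does, turns a growing city into a shrinking one; the bottom line is what the census estimates actually show.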

Leaving out natural increase really misstates the effect of net domestic migration.  Some of the people who migrate out of the city in any year are children under the age of six, i.e. people who weren’t here in 2010.  So Thompson’s math excludes these newborn kids from his base year calculation, but then counts them when they leave the city. The key point here is that you can’t make sense of the aggregate impact of migration without also looking at natural increase.

And this dynamic is different for different cities:  In Pittsburgh, the reverse is true.  That city’s population skews much older than the US average, and experiences more deaths than births, with the result that its natural increase is negative. To be sure, some people are moving out of New York, but it’s not demonstrably because they dislike the place; it’s because the city is becoming increasingly crowded.

Bottom line:  New York City is growing.  When you account for all of the different components of population change–births, deaths, people moving in and people moving out–the city’s population is up in total by almost 400,000 since 2010, an increase of 4.6 percent.  Here’s a synopsis of county/borough level population data for New York City, compiled from Census estimates by the Empire Center.

Not only that, every borough in New York City is growing. Even the Bronx is growing.

As with so many stories that rely on fragments of migration or population data, the narrative that some people are moving out of cities implicitly assumes that they are choosing to leave because they don’t want to live in cities. In fact, the growth of city population, and the rising price of homes in cities is a sign that more people want to live in cities than we can currently accommodate. Our failure to increase the supply of housing in cities is increasingly becoming the constraint on urban economic growth. You’ll know cities are failing when you see house prices and land values dropping.  That will be a sign that consumers have rejected urban living. But that’s not what’s happening in New York City today.

Migration is making counties more diverse

Migration, especially by young adults, is increasing racial and ethnic diversity in US counties

As we related last week, a new report from the Urban Institute quantifies the stark economic costs of racial and income segregation in the United States. Places with higher levels of segregation have lower incomes for African-Americans, lower rates of educational attainment, and higher rates of serious crimes. Reducing segregation by race and class is an important and unfinished agenda for achieving greater social justice, and improving our economy.

But how can we reduce segregation? As we all know, it’s difficult and expensive to build new housing in established neighborhoods. There’s often opposition to new development, whether it’s infill housing in cities, or affordable housing in suburbs. But while the housing stock can change only slowly, the occupants of housing units are changing all the time–the average renter has lived in her apartment less than two years, for example. The critical question is whether the regular and ongoing movement of people in and out of different housing and different neighborhoods is reinforcing existing patterns of segregation, or whether it’s creating greater diversity.

A new report–Moving to Diversity–from Richelle Winkler, a sociologist at Michigan Technological University, and Kenneth Johnson, a demographer at the University of New Hampshire, looks at the way in which population movements are changing the face of America’s counties.  Counties turn out to be a convenient unit for analysis, because it’s possible to accurately separate out the effects of births, deaths and net migration by race and ethnicity. The report looks at population change between 1990 and 2010, and focuses on four broad racial-ethnic categories: non-Hispanic whites, non-Hispanic blacks, Hispanics, and all other racial-ethnic combinations. To compute the effects of migration, the authors calculate what each county’s demographics would look like if its racial and ethnic composition in the base year (for example, 1990) were carried forward to reflect only the births and deaths of the base-year population. The difference between that estimate and the actual observed value in the end year (for example, 2000) is the net effect of migration on county demographics.
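The counterfactual method can be sketched with a toy example. Everything here (the county counts and the natural-increase rates) is invented for illustration; the report’s actual demographic accounting is far more detailed:

```python
# Simplified sketch of the Winkler-Johnson approach: project each
# group's base-year population forward using only births and deaths;
# the gap between that projection and the observed end-year count is
# attributed to net migration.  All numbers are hypothetical.

base_pop = {"white": 80_000, "black": 10_000, "hispanic": 10_000}
# Hypothetical natural-increase rates over the decade (births - deaths).
ni_rate  = {"white": 0.02, "black": 0.05, "hispanic": 0.08}
observed = {"white": 79_000, "black": 11_500, "hispanic": 12_800}

def migration_effect(base, rates, observed):
    expected = {g: base[g] * (1 + rates[g]) for g in base}
    return {g: observed[g] - expected[g] for g in base}

for group, net in migration_effect(base_pop, ni_rate, observed).items():
    print(group, round(net))
```

In this made-up county, whites show net out-migration while blacks and Hispanics show net in-migration, so migration has made the county more diverse than births and deaths alone would have.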

The report offers several key insights into the ways in which migration is influencing the racial and ethnic composition of different counties. First, it’s the movement of younger people, especially young adults, that is contributing to the big increases in county-level diversity.  The movement of those aged 20-39 accounted for the biggest changes in both black-white and Hispanic-white diversity. Moves by older adults actually tended to decrease racial and ethnic diversity (think white retirees moving to even “whiter” counties, and so on).  But overall, the trend toward greater diversity is driven by the young, who are both more likely to move and, when they move, tend to move to more diverse locations.

 

There are important city-to-suburb and suburb-to-city components to migration. Young white people contribute to greater diversity by moving from whiter counties (disproportionately in suburbs) to urban counties that tend to have more persons of color. Conversely, Black and Hispanic migrants exhibit net migration in the other direction, from less white urban counties to whiter suburban ones. The effect of both kinds of migration is to increase diversity in both counties.  As Winkler and Johnson explain:

Blacks and Hispanics of all ages migrated to areas that were “whiter,” thereby increasing diversity. The movements of the white population have been more complex, however, with impacts that vary considerably by age. White young adults (age 20–39) moved from predominantly white counties to counties with larger black and Hispanic population shares, often in large urban centers. The net flow of white young adults into central-city counties increased the white young adult population there by approximately 20 percent in the 1990s and again in the 2000s. The outflow of these same young white adults from suburban and rural counties to big urban cores also contributed to more diversity in these origin areas by diminishing the number of whites there.

While the overall effect of migration was to increase integration by race and ethnicity, this didn’t occur everywhere. Winkler and Johnson estimate that migration significantly increased diversity in about 10 percent of counties, modestly increased diversity in about 56 percent, had little effect one way or the other in about 32 percent, and resulted in noticeably less diversity in only about 2 percent.

The distinct age structure of these migration trends suggests that future migration will also tend to increase diversity. Young people are much more likely to migrate than older ones. The persistence of the migration of white young adults to cities, coupled with the migration of persons of color to suburbs makes both areas more diverse than they would otherwise have been.

In the face of a growing body of evidence on the negative effects of segregation, it’s good to know that the individual migration decisions of people in the up and coming generations are contributing to growing diversity at the county level in the US.

Time for the annual Ben & Jerry’s seminar in transportation economics

They’ll be lined up around the block because the price is too low–just like every day on urban roads

You can learn everything you need to know about transportation economics today, just by helping yourself to a free ice cream cone. One day a year, and today is that day, Ben and Jerry give away free ice cream to everyone who comes by their stores. Whether you’re hankering for Cherry Garcia or Chunky Monkey, you can now get it for absolutely zero price.

In addition to free ice cream, they’re also giving a free lesson in why American roads are perpetually clogged, and why state highway departments are (a) always broke, and (b) always convinced they need to build more and more lanes for traffic.

As you’re standing in line waiting for your “free” ice cream cone, give a little thought to the parallels between that line and your typical rush hour traffic jam.  In both cases, you’re waiting in line for the same reason–the price is too low, and demand is overwhelming supply.  This is the valuable lesson that Ben and Jerry are providing in the fundamentals of transportation economics.

Gridlock.

You’ll note that unlike the average day at a Ben and Jerry’s, when you might have to wait in line for a minute or two to get your favorite flavor, now you’re going to end up waiting twenty minutes, or a half hour, or possibly longer.  In terms of customers served and gallons scooped, this is going to be their biggest day of the year–last time they gave out a million scoops of ice cream worldwide.

You’ll probably also notice that most of the people standing in line are people who aren’t working nine-to-five.  Not many investment bankers or plumbers, but lots of students, moms with small kids, and people who have at least part of the day off from work. (Unlike waiting in traffic, where everyone is isolated in their cars, experiencing aggravation and road rage, there’s a kind of social, party atmosphere at Ben and Jerry’s.)

Make no mistake, although you’re not laying out any cash for your ice cream, you are paying for it: with your time. Let’s say that you’d pay $2.50 for that scoop of Phish Food (they’re a bit smaller than regulation on free cone day).  If you have to wait half an hour, and you value your time at, say, $15.00 per hour, that “free” scoop really cost you something like $7.50.  It’s a safe bet that most of the people waiting in line value their time at less than $5.00 an hour if they’re willing to wait that long for a “free” cone. Also, if you really want ice cream and are pressed for time, there’s no way to jump to the head of the line, no matter how much you’d be willing to pay.
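A quick back-of-the-envelope calculation, using the numbers above, makes the point:

```python
# The cost of a "free" cone, priced in time (figures from the text).
scoop_value   = 2.50   # what you'd willingly pay for the scoop
wait_hours    = 0.5    # half an hour standing in line
value_of_time = 15.00  # your time, in dollars per hour

time_cost = wait_hours * value_of_time
print(time_cost)  # 7.5: the "free" cone costs $7.50 of your time

# Anyone who chooses to wait is implicitly valuing their time at no
# more than the scoop's worth divided by the length of the wait:
implied_max_wage = scoop_value / wait_hours
print(implied_max_wage)  # 5.0: at most $5.00 per hour
```

This is exactly the rationing logic of an unpriced freeway: the queue allocates the scarce good to those with the lowest value of time, not to those who value the good most.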

Free. It only works because its one day a year.

Substitute “freeway” for “free cone” and you’ve got a pretty good description of how transportation economics works. When it comes to our road system, every rush hour is like free cone day at Ben and Jerry’s.  The customers (drivers) are paying zero for their use of the limited capacity of the road system, and we’re rationing this valuable product based on people’s willingness to tolerate delays (with the result that lots of people who don’t attach a particularly high value to their time are slowing things down for everyone).

If Ben and Jerry’s were run by traffic engineers instead of smart business people (albeit smart business people with a strong socially minded streak), they’d look at these long lines and conclude that Ben & Jerry really need to expand their stores.  After all, the long lines of people waiting to get ice cream represent “congestion” and “delay” that can only be solved by building more and bigger ice cream stores. And thanks to what you might call the “fundamental law of ice cream congestion,” building more stores might shorten lines a little, but it would likely prompt other people to stand in line for free ice cream, or to go through the line twice. But, of course, with zero revenue, Ben & Jerry would find it hard to build more stores.

Tomorrow, Ben and Jerry will go back to charging for ice cream. And the lines at their stores will disappear. In another year or two, New York will be doing pretty much the same thing, when it starts charging a price for private vehicles entering Manhattan. Road pricing can get rid of lines of cars just as effectively as it gets rid of lines in front of ice cream shops.

No doubt Ben and Jerry generate enough good will, and probably attract a few new customers with their willingness to give up one day of revenue per year. And they’ll make more than enough money on the other 364 days of the year to cover their losses. But what works for ice cream one day a year is an epic failure when it comes to roads. As long as the price is zero, there will be more demand than you can handle, and you’ll be struggling to pay for the capacity that (you think) is needed.

 


Carmageddon does a no-show in Portland

Once again, Carmageddon doesn’t materialize: shutting down half of the I-5 Interstate Bridge over the Columbia River for a week barely caused a ripple in traffic

It’s a teachable moment if we pay attention:  traffic adapts quickly to limits in road capacity

The most favored myth of traffic reporters and highway departments is the notion of traffic diversion:  If you restrict road capacity in any one location, then it will spill over to adjacent streets and create gridlock.  It’s invariably used as an argument against any plans to slow car movement or repurpose capacity for transit, cyclists or people walking.

Time and again, however, when road capacity is reduced, either by design or by accident, the predicted gridlock fails to materialize.  Earlier this year, New York closed 14th Street to most car traffic; speeds on the parallel 13th and 15th Streets were unaffected, according to the traffic monitoring firm Inrix. Similarly, Seattle’s experience with closing the Alaskan Way Viaduct (now demolished), and imposing tolls on its replacement, the Highway 99 tunnel, failed to generate the predicted traffic diversion and gridlock.

Crying Carmaggedon, again

The latest case study comes from Portland.  Portland is connected to its suburbs in Southwest Washington by two Interstate highway bridges crossing the Columbia River. The older of the two spans that carry Interstate 5 over the river, built in 1917, needed to have one of the trunnions that carry the weight of its lift-span replaced, an operation that required the span to be closed to traffic for a week. The I-5 bridges usually carry about 120,000 to 130,000 vehicles per day, but that capacity was cut in half as both northbound and southbound traffic were diverted to the newer structure.  So instead of 6 lanes, the I-5 freeway was reduced to 3 lanes.

Not surprisingly, the Oregon and Washington Departments of Transportation predicted massive traffic tie-ups.  The two highway departments predicted four-mile-long backups, leading the Vancouver Columbian to forecast that the region would be mired in “trunnion trauma“:

If drivers do not change their travel habits, estimates indicate that backups could stretch for 4 miles on either side of the Columbia River, and the duration of congestion could nearly triple, from seven to 20 hours a day.

Traffic catastrophe “didn’t materialize”

The weeklong closure of the bridge ended Friday, September 25, and prompted a story looking back at the region’s experience by The Oregonian’s transportation beat reporter Andrew Theen.  He found that despite some occasional slowdowns, traffic between the two states was little different than in any other week.

 

. . . travel patterns largely followed the normal cycle: Evening trips northbound across either of the two bridges were a slog, but that’s typically the case with scores of residents working on the Oregon side of the metro area, even during a pandemic.

Deja vu all over again: A reprise of 1997

Long-time Portland-area residents will know that this is not the first time the bridge has been closed to replace a trunnion. To great fanfare, the bridge was closed for just this purpose in 1997.  The predictions of carpocalypse were even more dire then.  That led local officials to mount a massive PR campaign, to put additional buses on the road, and even arrange for a temporary commuter rail service between Portland and Vancouver.  And then, just as today, the expected gridlock never happened.  Local media were stunned that it was so uneventful.

Back in 1997, a clearly surprised Oregonian team of reporters summed up the experience in a September 16 story: “Gridlock? It’s a breeze for savvy commuters.”

Officials predicted monster traffic jams on the freeways and main arteries in Vancouver and Portland. Total gridlock, most planners said weeks before Tuesday’s closure. But the expected calamity was over before it started, in part because of a media blitz and mass transit options set up by transportation officials.

You’d have thought that this experience would make the two states’ highway departments more reluctant to forecast traffic disaster.  Not so.  Local highway engineers gravely warned that things would be worse this time:

This year’s closure likely won’t be so smooth, primarily due to growth in Clark County. Since the September 1997 project, the county’s population has increased by about 175,000.

“I seriously doubt we will see a replay,” said Ryan Lopossa, Vancouver’s streets and transportation manager. “There are just so many people who make that crossing.”

The trunnion experience echoes many other closures. If this gives you a bit of deja vu, dear reader, it should.  Back in January, just before Seattle’s Highway 99 tunnel opened, the city had to commence demolition of the old viaduct in order to connect on-ramps to the new tunnel.  As a result, the city suddenly lost its old waterfront freeway, and didn’t yet have access to the new tunnel. State highway officials warned that the city was in for weeks of gridlock.  But when they closed the viaduct, not only did nothing happen, but as we related at City Observatory, traffic in most of downtown Seattle got better. Rather than simply diverting to other city streets, traffic levels went down; as the Seattle Times reported, “traffic just disappeared.”

What road closures teach us about travel demand

So what’s going on here? Arguably, our mental model of traffic is just wrong. We tend to think of traffic volumes, and trip-making generally as inexorable forces of nature.  The diurnal flow of traffic on urban roadways seems just as regular and predictable as the tides.

What this misses is that there’s a deep behavioral basis to travel. Human beings shift their behavior in response to changing circumstances. If road capacity is impaired or priced, many people can choose not to travel, change when they travel, change where they travel, or even change their mode of travel. The fact that Carmageddon almost never comes is powerful evidence of the same forces that drive induced demand: people travel on roadways because the capacity is available for their trips, and when the capacity goes away or its price goes up, trip-making changes to reduce traffic.

If we visualize travel demand as a fixed, irreducible quantity, or an incompressible liquid, it’s easy to imagine that there will be Carmaggedon when a major link of the transportation system goes away.  But in the face of a changed transportation system, people change their behavior.  Traffic and congestion are more like a gas, expanding and contracting to fill available space. And while we tend to believe that most people have no choice about when and where they travel, the truth is that many people do, and that they respond quickly to changes in the transportation system.  It’s a corollary of induced demand:  when we build new capacity on urban roadways, traffic grows quickly to fill it, resulting in more travel and continuing traffic jams. What we have here is “reduced demand”–when we cut the supply of urban road space, traffic volumes fall.

If drivers quickly change their behavior in response to changes in traffic capacity, it’s a sign that engineers are crying “Wolf” when they claim that reducing the car capacity of a particular road will produce “gridlock.”  This is a signal that road diets, which have been shown to greatly improve safety and encourage walking and cycling, don’t have anything approaching the kinds of adverse effects on travel that highway engineers often predict.

Carmaggedon never comes

The phenomenon of reduced demand is so common and well-documented that it is simply unremarkable.  Whether it was Los Angeles closing a major section of freeway to replace overpasses, or Atlanta’s I-85 freeway collapse, or the I-35 bridge failure in Minneapolis, or the demolition of San Francisco’s Embarcadero Freeway, we’ve seen time and again that when freeway capacity is abruptly reduced, traffic levels fall as well. There’s a lesson here, if we’re willing to learn it:  if you want to reduce traffic congestion, reduce traffic levels. Whether you do it by restricting capacity, or (more sensibly) by imposing tolls that ask motorists to pay for even a fraction of the cost of the roads they’re using, you get a much more efficient system.

Carmaggedon does a no-show in Seattle, again

Once again, Carmaggedon doesn’t materialize; this time when Seattle started asking motorists to pay a portion of the cost of their new highway tunnel

Initial returns suggest that tolling reduced congestion by reducing the overall volume of traffic in downtown Seattle

The most favored mythology of traffic reporters and highway departments is the notion of traffic diversion:  if you restrict road capacity in any one location, traffic will spill over to adjacent streets and create gridlock.  It’s invariably used as an argument against any plan to slow car movement or repurpose capacity for transit, cyclists or people walking.

Time and again, however, when road capacity is reduced, either by design or accident, the predicted gridlock fails to materialize.  The latest instance came just in the past month, when New York closed 14th Street to most car traffic; speeds on the parallel streets, 13th and 15th, were unaffected, according to traffic monitoring firm Inrix.

Crying Carmaggedon, again

The latest case study comes from Seattle. Earlier this year, the city opened a new $3 billion tunnel under downtown Seattle, to replace the road capacity lost by the demolition of the city’s aging eyesore, the Alaskan Way viaduct.  Since it opened, the new SR 99 tunnel has been free to vehicles, but starting last Saturday, the state department of transportation started collecting tolls (electronically). What was free last Friday now costs afternoon peak-hour travelers between $2.25 (if they have a transponder) and $4.25 (if they use the pay-by-mail option).  Even so, the toll revenues will ultimately cover only about 10 percent of the cost of constructing the tunnel.

The Washington State Department of Transportation (WSDOT), which operates the new SR 99 Tunnel, warned of severe traffic congestion, predicting that more than a third of the 70,000 vehicles that use the tunnel each day would divert to city surface streets.  Local TV Station KIRO warned motorists:

WSDOT projects around 35 percent of tunnel drivers will avoid tolls in the tunnel and jam downtown streets instead.

That’s pretty scary stuff.  How is Seattle’s latest brush with Carmaggedon turning out? Because Monday was Veterans Day, Tuesday, November 12, was the first regular business day on which tolls were charged in the new tunnel.  So what happened?

Well, see if you can tell.  Here are the Google maps for traffic conditions in downtown Seattle on a typical Tuesday afternoon and for Tuesday, November 12, the first regular business day with tolling in place on the SR 99 tunnel.   On the left is Google’s map of actual traffic levels at 5:10 PM on Tuesday, November 12, showing Google’s familiar color-coded congestion rankings (green is moving smoothly, yellow is slowing, red is stop-and-go). On the right is Google’s depiction of traffic conditions on a typical Tuesday at 5:10 PM.  You can spot one big difference right away.  The left-hand map shows the tunnel as two parallel bright green lines; with tolls, traffic is moving smoothly in the tunnel.  But what about all the nearby side streets in downtown Seattle?  Surely they’ll be overwhelmed by diverted traffic.  Actually, no.  Most of the streets leading to and near the tunnel show green, and traffic conditions in this area are actually better than on a typical Tuesday afternoon.

Source: Google Maps Traffic.

Overall, if you compare these two pictures, it’s pretty clear that the traffic situation in downtown Seattle is much better than on a typical day.  Sure, Interstate 5, the freeway to the east of downtown Seattle, is congested (as it is on most weekday afternoons).  But downtown Seattle streets, particularly on the west side of downtown, are “green,” or free-flowing.  Overall, there’s a lot more green on Tuesday’s traffic charts than on a typical day. In other words:  no gridlock or Carmaggedon here.

Tolling the new SR 99 tunnel didn’t make traffic worse.  If anything, it made traffic better.  The tunnel itself was flowing smoothly–it was green rather than red, meaning those paying the toll were getting value for their money.  Not only that, but it doesn’t appear that traffic on downtown streets in Seattle was any worse than on an ordinary Tuesday.  It’s likely that by reducing travel volumes in the tunnel, tolling reduced the number of cars driving onto downtown Seattle streets.  Small reductions in travel demand that keep roads from crossing a tipping point into congestion make traffic move more smoothly.

If this gives you a bit of deja vu, dear reader, it should.  Back in January, just before the tunnel opened, the city had to commence demolition of the old viaduct, in order to connect on ramps to the new tunnel.  As a result, the city suddenly lost its old waterfront freeway, and didn’t have access to the new tunnel. State highway officials warned that the city was in for weeks of gridlock.  But when they closed the viaduct, not only did nothing happen, but as we related at City Observatory, traffic on most of downtown Seattle got better. Rather than simply diverting to other city streets, traffic levels went down; as the Seattle Times reported “traffic just disappeared.”

What road closures teach us about travel demand

So what’s going on here? Arguably, our mental model of traffic is just wrong. We tend to think of traffic volumes, and trip-making generally, as inexorable forces of nature.  The diurnal flow of traffic on urban roadways seems just as regular and predictable as the tides.

What this misses is that there’s a deep behavioral basis to travel. Human beings will shift their behavior in response to changing circumstances. If road capacity is impaired or priced, many people can choose not to travel, change when they travel, change where they travel, or even change their mode of travel. The fact that Carmageddon almost never comes is powerful evidence of induced demand: people travel on roadways because the capacity is available for their trips, and when the capacity goes away or its price goes up, trip-making changes to reduce traffic.

If we visualize travel demand as a fixed, irreducible quantity, it’s easy to imagine that there will be Carmaggedon when a major link of the transportation system goes away.  But in the face of a changed transportation system, people change their behavior.  And while we tend to believe that most people have no choice about when and where they travel, the truth is that many people do, and that they respond quickly to changes in the transportation system and to road pricing.  It’s a corollary of induced demand:  when we build new capacity on urban roadways, traffic grows quickly to fill it, resulting in more travel and continuing traffic jams. What we have here is “reduced demand”–when we cut the supply of urban road space, traffic volumes fall.

If Seattle drivers quickly change their behavior in response to a dollar or two of tolls, that’s a powerful indication that more modest steps to price roads don’t really mean the end of the world. If we recognize that, in the absence of pricing, traffic will tend to adjust to available capacity, we then end up taking a different view of how to balance transportation against other objectives. For example, this ought to be a signal that road diets, which have been shown to greatly improve safety and encourage walking and cycling, don’t have anything approaching the kinds of adverse effects on travel that highway engineers usually predict.

There’s one other fiscal codicil to this tale.  The tolls that drivers pay to use the new $3 billion tunnel cover barely 10 percent of the costs of construction.  The fact that tolls reduce traffic by a third (or more) shows that a good fraction of tunnel users don’t value the tunnel at even a tenth of the cost of providing its right of way, and that they’ll use it only if it’s free.  That’s a pretty solid indication that this “investment” has negligible value.  In addition, the fact that implementing pricing has caused traffic conditions in much of downtown Seattle to improve is an indication that simply charging a modest price for roadways actually produces net benefits for road users–it encourages those with choices to travel at other times or avoid congested areas, producing better service for everyone.  And that may be the real lesson here:  the tunnel didn’t make traffic in downtown Seattle better, the toll did.

Carmaggedon never comes

Of course, we’ll wait for the detailed data on traffic conditions that will be collected over the next few weeks before making a definitive judgment.  But this phenomenon of reduced demand is so common and well-documented that it is simply unremarkable.  Whether it was Los Angeles closing a major section of freeway to replace overpasses, or Atlanta’s I-85 freeway collapse, or the I-35 bridge failure in Minneapolis, or the demolition of San Francisco’s Embarcadero Freeway, we’ve seen time and again that when freeway capacity is abruptly reduced, traffic levels fall as well. There’s a lesson here, if we’re willing to learn it:  if you want to reduce traffic congestion, reduce traffic levels. Whether you do it by restricting capacity, or (more sensibly) by imposing tolls that ask motorists to pay for even a fraction of the cost of the roads they’re using, you get a much more efficient system.

Why Carmaggedon never comes (Seattle edition)

Why predicted gridlock almost never happens and what this teaches us about travel demand

Seattle has finally closed its aging Alaskan Way viaduct, a six-lane double-decker freeway that since the 1950s has been a concrete scar separating Seattle’s downtown from Elliott Bay.  In a few weeks, much of this capacity will be replaced by a new $3 billion highway tunnel under downtown Seattle, but until then, the city will have to simply do without a big chunk of the highway system that circulates cars around downtown.

The Alaskan Way Viaduct (Wikimedia Commons)

Transportation agencies and the media love to portray major road closures as real-world equivalents of disaster films.  The City of Seattle has even come up with its own ominous moniker, THE PERIOD OF MAXIMUM CONSTRAINT, which, it says, “sounds like a summer action blockbuster starring Will Smith.”

Losing a major freeway that carries nearly a hundred thousand vehicles a day through the heart of the city will certainly cause a major disruption to traffic.  The Seattle Times confidently told its readers in early January to prepare for a traffic cataclysm:  “the region can’t absorb the viaduct’s 90,000 daily vehicle trips and 30,000 detoured bus riders without traffic jams that likely will ripple out as far as [distant suburbs] Woodinville or Auburn.” Our friends at CityLab echoed the ominous rhetoric with a story headlined “Viadoom:  Time for the Seattle Squeeze Traffic Hell.”

That’s pretty scary stuff.  But two days into Seattle’s brush with carmaggedon, how are things looking?

Well, see if you can tell.  Here are the Google maps for traffic conditions on a typical day and after the viaduct closure.  This first pair of charts shows conditions in mid-morning on Tuesday, January 15th. On the left is Google’s map of typical traffic levels on a Tuesday morning at 10:20 am, using Google’s familiar color-coded congestion rankings (green is moving smoothly, yellow is slowing, red is stop-and-go). On the right is Google’s real-time traffic map of downtown Seattle for Tuesday, January 15th (day two of THE PERIOD OF MAXIMUM CONSTRAINT). You can spot one big difference right away.  On the right-hand map, the closed portion of the viaduct shows up as a series of dotted red lines, punctuated with red-and-white “do not enter” symbols marking closed ramps to and from the viaduct.  On an ordinary day this route carried nearly 100,000 cars.


But if you compare these two pictures, there’s not much difference.  If anything there’s a lot more “green” on Tuesday’s traffic charts than on a typical day.  But that was mid-morning.  Surely things would reach epic proportions by 5pm.  So here are the “typical” and actual charts for traffic in downtown Seattle for Tuesday at 5pm.

It’s clearly more congested than it was in the morning, but a high fraction of streets, especially those north and south of downtown, have visibly more green than on a typical weekday.

It may seem like a stretch to suggest that closing the Alaskan Way viaduct actually made traffic conditions in Seattle better, but in some respects, that’s likely to be the case.  Worried about getting caught in a traffic jam, many travelers likely postponed or re-routed their trips. If closing the viaduct reduces the number of trips to downtown Seattle, it reduces traffic on other streets as well.

What road closures teach us about travel demand

So what’s going on here? Arguably, our mental model of traffic is just wrong. We tend to think of traffic volumes, and trip-making generally, as inexorable forces of nature.  The diurnal flow of 100,000 vehicles a day on an urban freeway like the Alaskan Way viaduct seems just as regular and predictable as the tides. What this misses is that there’s a deep behavioral basis to travel. Human beings will shift their behavior in response to changing circumstances. If road capacity is impaired, many people can decide not to travel, change when they travel, change where they travel, or even change their mode of travel. The fact that Carmageddon almost never comes is powerful evidence of induced demand: people travel on roadways because the capacity is available for their trips, and when the capacity goes away, so does much of the trip-making.

If we visualize travel demand as a fixed, irreducible quantity, it’s easy to imagine that there will be carmaggedon when a major link of the transportation system goes away.  But in the face of a changed transportation system, people change their behavior.  And while we tend to believe that most people have no choice about when and where they travel, the truth is that many people do, and that they respond quickly to changes in the transportation system.  It’s a corollary of induced demand:  when we build new capacity on urban roadways, traffic grows quickly to fill it, resulting in more travel and continuing traffic jams. What we have here is “reduced demand”–when we cut the supply of urban road space, traffic volumes fall.

If Seattle can survive for a couple of weeks without a major chunk of its freeway system, that’s a powerful indication that more modest steps to alter road capacity don’t really mean the end of the world. If we recognize that traffic will tend to adjust to available capacity, we then end up taking a different view of how to balance transportation against other objectives. For example, this ought to be a signal that road diets, which have been shown to greatly improve safety and encourage walking and cycling, don’t have anything approaching the kinds of adverse effects on travel that highway engineers usually predict. So in the next few weeks, keep an eye on Seattle: if one of the nation’s most bustling cities can survive the loss of a freeway segment that carries a hundred thousand vehicles a day, it’s a strong sign that more modest changes to road systems really don’t have much impact on metropolitan prosperity.

Carmaggedon never comes

Of course, we’ll wait for the detailed data on traffic conditions that will be collected over the next few weeks before making a definitive judgment.  But this phenomenon of reduced demand is so common and well-documented that it is simply unremarkable.  Whether it was Los Angeles closing a major section of freeway to replace overpasses, or Atlanta’s I-85 freeway collapse, or the I-35 bridge failure in Minneapolis, or the demolition of San Francisco’s Embarcadero Freeway, we’ve seen time and again that when freeway capacity is abruptly reduced, traffic levels fall as well.

LOS ANGELES:  One of the most famous instances of this phenomenon occurred in Los Angeles.  In 2011 and 2012, the state highway department closed a 10-mile stretch of Interstate 405 on several weekends to rebuild overpasses. The media was awash in predictions of Carmaggedon. But surprisingly, nothing of the kind happened.  As Brian Taylor and Martin Wachs explain in an article in Access, people mostly avoided taking trips in the area, or chose alternate routes, with the effect that traffic was actually much lighter than normal. They report that “Rather than creating chaos, the first closure greatly reduced traffic congestion.” Taylor and Wachs explain that “crying wolf” about likely gridlock depressed trip-taking in the affected area, but that effect faded as travelers realized things were nowhere near as bad as predicted.

MINNEAPOLIS:  You might think that the kind of behavioral effects that keep Carmaggedon at bay only work when it’s a short closure of a few hours. But even the year-long closure of I-35W in Minneapolis, following the collapse of a highway bridge over the Mississippi in 2007, produced similar results. Travelers quickly changed their routes and travel times, and many people simply stopped taking trips that crossed the river. David Levinson reports that there were about 46,000 fewer trips per day across the river after the bridge collapsed.

ATLANTA:  In April 2017, a spectacular fire destroyed several hundred feet of Interstate 85 in Atlanta, on a segment of roadway carrying nearly a quarter million cars per day, causing newspapers to say “Atlanta to face travel chaos for MONTHS.”  Yet the first commuting day after the collapse, April 3, 2017, the Atlanta Journal Constitution reported:  “Atlanta I-85 collapse: The word on Monday’s commute?  Not so horrible.”  And Google Maps showed that late in the morning, traffic looked pretty normal:

Google traffic map for Atlanta, April 3, 2017, 10AM EDT.

Carmaggedon stalks Atlanta

Why predicted gridlock almost never happens and what this teaches us about travel demand

It had all the trappings of a great disaster film:  a spectacular blaze last week destroyed a several-hundred-foot-long section of Interstate 85 in Atlanta. In a city that consistently has some of the worst traffic congestion in the country, losing a key link in its freeway system could only mean one thing: Carmageddon.  Governor Nathan Deal has declared a state of emergency. The bridge collapse effectively “puts a cork in the bottle,” said Georgia State Patrol Commissioner Mark McDonough. This particular segment of freeway carries nearly a quarter million cars per day. So as the Daily Mail shouted, chaos is coming:

The prospect of gridlock makes for great headlines and local TV news stories, but as it turns out, predictions of terrible traffic in the wake of even major disruptions to the road system are almost never realized.

One of the most famous instances of this phenomenon occurred in Los Angeles.  In 2011 and 2012, the state highway department closed a 10-mile stretch of Interstate 405 on several weekends to rebuild overpasses. The media was awash in predictions of Carmaggedon. But surprisingly, nothing of the kind happened.  As Brian Taylor and Martin Wachs explain in an article in Access, people mostly avoided taking trips in the area, or chose alternate routes, with the effect that traffic was actually much lighter than normal. They report that “Rather than creating chaos, the first closure greatly reduced traffic congestion.” Taylor and Wachs explain that “crying wolf” about likely gridlock depressed trip-taking in the affected area, but that effect faded as travelers realized things were nowhere near as bad as predicted.

You might think that the kind of behavioral effects that keep Carmaggedon at bay only work when it’s a short closure of a few hours. But even the year-long closure of I-35W in Minneapolis, following the collapse of a highway bridge over the Mississippi in 2007, produced similar results. Travelers quickly changed their routes and travel times, and many people simply stopped taking trips that crossed the river. David Levinson reports that there were about 46,000 fewer trips per day across the river after the bridge collapsed.

You’ll forgive our excessively clinical attitude about this damage–and it’s going to cost tens of millions to fix–but what we have here is a classic “natural experiment” of the kind economists and students of public policy relish. So what happens when we take a major urban freeway out of service for a couple of months?  Are Atlanta commuters in for hours of gridlock and grisly commutes every day? Will the region’s economy grind to a halt as a result? We’ll be watching over the next several months to see.

So far, the results are consistent with what we’ve seen in Los Angeles and Minneapolis.  Monday morning came, and something funny happened: traffic wasn’t so bad.  The Atlanta Journal Constitution reported:  “Atlanta I-85 collapse: The word on Monday’s commute?  Not so horrible.”

And Google Maps showed that late in the morning, traffic looked pretty normal:

So what’s going on here? Arguably, our mental model of traffic is just wrong. We tend to think of traffic volumes, and trip-making generally, as inexorable forces of nature.  The diurnal flow of 250,000 vehicles a day on an urban freeway like I-85 is just as regular and predictable as the tides. What this misses is that there’s a deep behavioral basis to travel. Human beings will shift their behavior in response to changing circumstances. If road capacity is impaired, many people can decide not to travel, change when they travel, change where they travel, or even change their mode of travel. The fact that Carmageddon almost never comes is powerful evidence of induced demand: people travel on roadways because the capacity is available for their trips, and when the capacity goes away, so does much of the trip-making.

If Atlanta can survive for a month or two without a major chunk of its freeway, that’s a powerful indication that more modest steps to alter road capacity don’t really mean the end of the world. If we recognize that traffic will tend to adjust to available capacity, we then end up taking a different view of how to balance transportation against other objectives. For example, this ought to be a signal that road diets, which have been shown to greatly improve safety and encourage walking and cycling, don’t have anything approaching the kinds of adverse effects on travel that highway engineers usually predict. So in the next few weeks, keep an eye on Atlanta: if one of the nation’s most sprawling and traffic-ridden cities can survive the loss of a freeway segment that carries a quarter million vehicles a day, it’s a strong sign that more modest changes to road systems really don’t have much impact on metropolitan prosperity.


The High Cost of Segregation

A new report from the Urban Institute shows the stark costs of economic and racial segregation

Long-form white-paper policy research reports are our stock in trade at City Observatory. We see dozens of them every month, usually read them with great interest, and flag the best ones for the “must read” list we publish as part of the Week Observed. Usually that’s enough. Yesterday’s report from the Urban Institute–The Cost of Segregation–is different.  It’s not just a must read: it’s a must read, digest, understand, and use.

We’ve known for a long time that segregation is “a bad thing.” But the new Urban Institute report offers a stark, comprehensive and compelling calculation of the economic and social costs that segregation imposes every day on the residents of the nation’s large metropolitan areas. Higher levels of segregation are associated with lower levels of black per capita income, lower rates of educational attainment, and higher levels of crime. As a result, segregation is more than just wrong or unfair: it imposes serious economic costs. Conversely, more inclusive metropolitan areas are more prosperous.

The Urban Institute has computed how large the gains might be from simply reducing the level of segregation in some of the more segregated cities to the level typically found in large metro areas. In the case of Chicago–one of the dozen or so most segregated metro areas–lowering economic and racial segregation to the national median would have these effects:

  • raising black per capita income $3,000 per person (for a total metro gain of $4.4 billion)
  • increasing the number of college graduates by 80,000
  • reducing the number of homicides by almost one-third (from about 6.6 per 100,000 to 4.6 per 100,000 per year)
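The first two figures in that list are mutually consistent, as a quick back-of-envelope check shows (this is our arithmetic, not a calculation from the report):

```python
# Implied population behind the Chicago per-capita and total income figures.
per_capita_gain = 3_000      # dollars per person, from the report
total_metro_gain = 4.4e9     # dollars, from the report

implied_population = total_metro_gain / per_capita_gain
# roughly 1.47 million people -- on the order of the Black population
# of the Chicago region, as the per-capita framing implies
```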

While the report is ostensibly about the Chicago metropolitan area, what you’ll really find is a careful tabulation of segregation data for all of the nation’s 100 largest metropolitan areas, plotting trends over the 20-year period from 1990 through 2010. As a quick summary, they’ve mapped the rankings of metro areas based on a composite measure that combines economic and racial/ethnic segregation. On this map, reddish-brown areas have the highest levels of segregation, and dark blue areas the lowest.

Some Technical Details

The report has a wealth of data on segregation. It uses a slightly different geography than most other analyses of segregation, reporting data for commuting zones, city-centered regions that are somewhat larger than federally defined metropolitan statistical areas. (Economist Raj Chetty and his colleagues used this same geography for their Equality of Opportunity analysis.) The report also uses two new measures of segregation.  Its measure of racial and ethnic segregation is the Spatial Proximity Index, which is computed for pairs of groups (Whites and Blacks, and Whites and Latinos). The SPI equals one if the two groups are intermingled in the same neighborhoods; values higher than one indicate the degree to which members of each group are clustered with others in their group (whites with whites, and so on). Higher values indicate greater degrees of segregation between groups.

For economic segregation, the report uses the Generalized Neighborhood Sorting Index, which measures the extent to which high-income and low-income groups tend to live in the same or different parts of a metropolitan area. The GNSI runs from zero (evenly distributed) to one (completely segregated). The index has a spatial component: it considers whether, for example, poor neighborhoods are primarily adjacent to other poor neighborhoods, or are more intermingled with higher-income neighborhoods.
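To make the Spatial Proximity Index concrete, here is a toy sketch in Python. The exponential distance-decay weighting and the four-tract example are our own illustrative assumptions; the report’s exact implementation may differ in its weighting and geography.

```python
import numpy as np

def spatial_proximity_index(x, y, dist):
    """Toy Spatial Proximity Index for two groups.

    x, y : counts of each group in every tract
    dist : pairwise distances between tract centroids
    Returns about 1 when the groups are intermingled, and values
    above 1 when each group clusters with itself.
    """
    c = np.exp(-dist)            # distance-decay "proximity" weights
    t = x + y
    X, Y, T = x.sum(), y.sum(), t.sum()
    Pxx = x @ c @ x / X**2       # average proximity of group x to itself
    Pyy = y @ c @ y / Y**2       # average proximity of group y to itself
    Ptt = t @ c @ t / T**2       # average proximity of everyone to everyone
    return (X * Pxx + Y * Pyy) / (T * Ptt)

# Four tracts spaced one unit apart along a line.
dist = np.abs(np.subtract.outer(np.arange(4), np.arange(4))).astype(float)

# Evenly mixed: both groups split 50/50 in every tract -> index of 1.
mixed = spatial_proximity_index(np.array([50.0, 50, 50, 50]),
                                np.array([50.0, 50, 50, 50]), dist)

# Fully sorted: each group occupies its own end of town -> index above 1.
segregated = spatial_proximity_index(np.array([100.0, 100, 0, 0]),
                                     np.array([0.0, 0, 100, 100]), dist)
```

In this toy example, the evenly mixed city scores exactly 1, while the fully sorted city scores roughly 1.6, illustrating how the index captures clustering rather than just tract-level composition.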

The report includes detailed data for each of the nation’s 100 largest commuting zones, as well as a clearly constructed on-line calculator that illustrates where a selected metropolitan area stands in relation to all others.  Here are the calculator’s data for Chicago:

This is just a quick overview of what’s in the report. We’ll be digging into its content more in the next few days, and sharing some of our thoughts. But don’t wait for our analysis: there’s lots to learn by downloading the report and poring over the data for your metro area.

Autonomous vehicles: Peaking, parking, profits & pricing

13 propositions about autonomous vehicles and urban transportation

It looks more and more like autonomous vehicles will be a part of our urban transportation future. There’s a lot of speculation about whether their effects will be for good or ill. While there’s a certain “techno-deterministic” character to these speculations, we’re of the view that the policy environment can play a key role in shaping the adoption of AVs in ways that support, rather than undermine, the transportation system and the fabric of cities.

A rocky road for autonomous vehicles? A March 24, 2017 crash of an Uber self-driven vehicle in Tempe Arizona via REUTERS.

Our thinking is still evolving on this subject, but to start the conversation, we’ll pose 13 propositions about the nature of urban travel demand, autonomous vehicles, and what we’ll need to do to change our policies and institutions to cope with them. Given that we think that many of the persistent problems with our current transportation system stem from getting the prices wrong, we think that the way that autonomous vehicles will change the cost and price of urban transportation will be key to shaping their impacts.

  1.  Urban travel demand is highly peaked. As a rule, we have plenty of capacity in our transportation system for about twenty of the twenty-four hours of the day.  Because we all disproportionately tend to travel at the same times, in the morning and afternoon peaks, streets are taxed to their limits at peak hours, usually for an hour or an hour and a half in the morning, and for two and a half to three hours in the late afternoon. As Jarrett Walker observes, this is a geometry problem: single-occupancy vehicles are not space-efficient enough to accommodate all travelers in peak periods in most urban environments. But it would be more accurate to call this a “space-time” problem:  we don’t have enough space at certain times.  Analyses of AV adoption and deployment routinely abstract from these issues.  The peaked nature of demand has important implications:  more economic value is associated with peak-period travel than travel at other times of the day, due both to its volume and to the nature of demand. Demand for peak-period travel is more inelastic—which is why travelers routinely endure longer travel times in peak hours rather than simply making those trips at some other hour when congestion is lighter and travel times are faster:  we willingly endure an extra 5 or 10 minutes in our commute traveling at the peak, when if we shifted the trip by an hour or ninety minutes, we could shorten it by that amount of time.
  2. Parking costs shape mode choice decisions.  Where parking is “free” to end users, they are far more likely to drive. More than four-fifths of all households own automobiles. The costs of owning cars are largely fixed (depreciation, insurance), and the marginal cost of taking a trip by car is often regarded by users as just the incremental cost of fuel. The major additional cost for many trips, especially to urban centers, is the cost of paying for car storage when the vehicle isn’t being used.  The cost of parking in city centers is a major incentive to use other modes of transportation. There is a very strong correlation between parking costs and transit use.  In effect, parking costs act as a surrogate road-pricing mechanism for trips with origins or destinations in the CBD. The advent of autonomous vehicles (AVs) will greatly reduce or entirely eliminate the cost of parking as a factor in mode choice.  Many people who would not drive to the central business district in order to avoid parking costs will want to choose AVs instead.
  3. Autonomous vehicle costs will be low enough to compete against transit. The cost of AV travel may be something on the order of 30 to 50 cents per mile (and could be considerably less). Most transit trips are less than four miles in distance, and most transit fares are in excess of two dollars per ride. AVs may be cost competitive, and potentially offer much better service (point-to-point travel, little or no waiting, privacy, greater comfort). It’s fair to assume that widespread deployment of fleets of AVs will stimulate a huge demand for urban travel, both among car-owning households who don’t currently drive to these destinations because of parking costs, and among car-owning households who do commute by car (because they can avoid the cost of parking).
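The arithmetic behind this cost comparison can be sketched in a few lines. The per-mile rates, trip length and fare below are simply the rough figures cited above, used as illustrative assumptions rather than measured data:

```python
# Back-of-the-envelope comparison of a fleet-AV trip with a typical transit
# trip, using the rough figures cited in the text (illustrative only).

AV_COST_PER_MILE_LOW = 0.30    # dollars per mile, low end of the cited range
AV_COST_PER_MILE_HIGH = 0.50   # dollars per mile, high end of the cited range
TYPICAL_TRIP_MILES = 4.0       # "most transit trips are less than four miles"
TYPICAL_TRANSIT_FARE = 2.00    # "most transit fares are in excess of two dollars"

def av_trip_cost(miles, per_mile_rate):
    """Cost of a point-to-point AV trip at a flat per-mile rate."""
    return miles * per_mile_rate

low = av_trip_cost(TYPICAL_TRIP_MILES, AV_COST_PER_MILE_LOW)
high = av_trip_cost(TYPICAL_TRIP_MILES, AV_COST_PER_MILE_HIGH)
print(f"AV trip: ${low:.2f} to ${high:.2f}; transit fare: ${TYPICAL_TRANSIT_FARE:.2f}")
```

Even at the high end of the range, a four-mile AV trip costs about as much as a typical fare, before accounting for the service-quality advantages noted above.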
  4. Suburbs will be relatively poor markets for autonomous vehicles. Conversely, where parking is free and density is low, fleet AV service will be a far less attractive option for travelers and a far less lucrative market for fleet AV operators. Because suburban commuters don’t pay for parking today, switching to an AV saves them nothing on that score. Less dense areas will also, by definition, be “thinner” markets for car sharing: for companies, that means less revenue per mile or per hour and lower utilization; for customers, it means longer waits for vehicles. People who live and work in low density areas may find it more attractive to own their own vehicle.
  5. AVs will tend to concentrate in urban centers: the markets are denser there, the technical challenges of mapping the roadway are more tractable, and the cost of mapping can be spread over more trips per road mile traveled. And, importantly, operators will be able to surge price in these locations. Surge pricing is possible because the demand for travel, particularly at the peak hour, is higher: more people are traveling, and they attach a greater value to their travel time. Companies will want to concentrate their fleets in places that have lots of customers, both to optimize utilization (less waiting and dead-heading) and to maximize revenue (surge-priced trips are more profitable than regular fares).
  6. The demand for peak period travel in urban centers will tend to overwhelm available road capacity, even more than it does today. More commuters will seek to travel by AV, and AV fleet operators will concentrate their vehicles in lucrative dense locations.
  7. Surge pricing by AV operators will help equilibrate supply and demand. While AVs may cost only 30 to 50 cents per mile to operate, surge prices in dense urban environments could be many times higher. Operators will use dynamic pricing to ration vehicles to the highest-value users. Others who might like to travel by AV will choose other modes or times: travel by transit, pay the price of parking and drive their own cars, wait for a cheaper AV at an off-peak time, walk, or bike. AVs will tend to fill up existing road capacity.
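A stylized sketch shows how surge pricing rations scarce peak capacity. Everything here is an illustrative assumption: a constant-elasticity demand curve, made-up trip volumes, and a made-up base price; none of these figures come from the propositions above.

```python
# Stylized surge pricing: with inelastic peak demand, the price rises until
# the quantity demanded just fits a fixed road/fleet capacity.
# All parameters below are illustrative assumptions.

def surge_price(base_price, base_demand, capacity, elasticity):
    """Price that clears a constant-elasticity demand curve
    Q(p) = base_demand * (p / base_price) ** (-elasticity)
    against a fixed capacity. If capacity is ample, no surge is needed."""
    if base_demand <= capacity:
        return base_price
    return base_price * (base_demand / capacity) ** (1.0 / elasticity)

# Peak hour: 12,000 would-be trips at the base price, capacity for 8,000,
# and inelastic demand (elasticity 0.5: travelers don't give up easily).
peak = surge_price(base_price=2.0, base_demand=12000, capacity=8000, elasticity=0.5)
print(f"peak clearing price: ${peak:.2f}")  # → peak clearing price: $4.50

# Off-peak: demand fits within capacity, so the base price holds.
off_peak = surge_price(base_price=2.0, base_demand=5000, capacity=8000, elasticity=0.5)
```

Note how a modest excess of demand over capacity, combined with inelastic demand, pushes the clearing price well above the 30-to-50-cent operating cost; that gap is the economic rent discussed in the next proposition.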
  8. AV fleet operators will capture a significant portion of the economic rent associated with use of the limited peak period capacity of roads.  Pricing will result in a more efficient allocation of road use among users (in a technical sense, and abstracting from distributional issues).  But the profits from the limited capacity will go to the AV fleet operators, and not the public sector, which is responsible for building and maintaining the roadway, and is typically asked to incur huge expense for additions to capacity to lessen congestion.
  9. Under current road financing policies, AVs might end up paying almost nothing for the use of the public roadway. The gasoline tax is the principal source of revenue for road construction and maintenance. Electric AVs pay nothing in most states toward road costs.  A hallmark of current transportation network companies has been their “disruptive” policies of avoiding (or shifting) the fees and taxes imposed on conventional taxis. We assume this behavior will continue.
  10. In addition, AVs will disproportionately make use of the most congested, most expensive parts of the public street and road system. Unlike typical vehicles, which as widely noted are parked 90-plus percent of the time, AVs will see much higher use, and as noted here, will tend to gravitate toward the densest markets and, thanks to surge pricing, toward the most congested locations. With fuel taxes, privately owned vehicles pay the same per-mile cost for road use whether they use lightly trafficked roads at off-peak times or congested urban roads at peak times. As noted, parking costs effectively discourage peak use in dense locations. And to some extent, the off-peak and low density use of cars means that some roads cross-subsidize others. Parking fees and private ownership of cars have in effect limited the ability of cars to overwhelm city streets. Both of these constraints will be largely erased by fleets of autonomous vehicles.
  11. Some regime change in road pricing is needed.  The gasoline tax won’t work for electric vehicles. Fees tied simply to energy consumed or vehicle miles traveled ignore the very different system costs imposed by travel in different places and at different times.  A VMT fee still allows private fleet operators to capture all or most of the economic rent associated with peak travel in dense urban places, and provides no added revenue to address road or transportation system capacity constraints.
  12. What we really need is surge pricing for road use. The key constraint on urban transportation system performance is peak hour capacity. Single occupancy vehicles represent a highly inefficient way to make use of very expensive peak hour capacity. Without surge pricing for roads, AV fleet operators have strong incentives to capture the economic rents associated with peak period travel, shifting costs and externalities to the public sector and non-user travelers.
  13. Surge pricing should be established before AV fleets are widely deployed.  Once deployed, AV fleet operators will have a powerful incentive to fight surge pricing because it will reallocate economic rents from them to the public sector.

Please consider this a first draft. We invite your comments, and expect to periodically revise, expand and annotate these 13 propositions.

Breaking Bad: Why breaking up big cities would hurt America

New York Times columnist Ross Douthat got a lot of attention a few days ago for his Jonathan Swift-style column–”Break up the liberal city“–suggesting that we could solve the problems of lagging economic growth in rural and small town America by whacking big cities into pieces and spreading their assets more widely. Douthat views himself as a latter-day Teddy Roosevelt, busting up the big concentrations of urban power the way Roosevelt took on Standard Oil. Simply put, this is one of the most spectacularly wrong-headed policy prescriptions for economic development that has ever been offered. Far from spreading wealth, diminishing cities would actually destroy value and make the nation worse off.

Cities don’t extract rent, they create value

Douthat’s reasoning is based on a simplistic zero-sum view of economic assets like industries and universities: cities have somehow unfairly monopolized the nation’s wealth, and we ought to redistribute it. The implied analogy here is to anti-trust law: cities have somehow cheated to monopolize resources. What this misses is that cities actually create value through increasing returns, what economists call agglomeration economies. People in cities are more productive, more innovative, and more highly skilled because they live in cities. Absent cities, the innovation and productivity on which these industries depend for their success simply wouldn’t exist. As Ed Glaeser told the Washington Post:

“Cities enable workers to search over a wider range of firms, and to hop from one firm to another in case of a crisis. They enable service providers to reach their customers, and customers to access a dizzying range of service providers. Perhaps most importantly they enable the spread of ideas and new information. . . . cities are forges of human capital that enable us to get smart by being around other smart people.”

Economists have come to widely embrace the view advanced by Jane Jacobs that cities succeed in large part because of their diversity and density, which produces the kinds of spontaneous collisions of people that give rise to new ideas and new industry (what Jacobs called “new work”).  The nation’s largest metros produce a disproportionate share of its new patents, and economically successful new businesses because of these agglomeration economies: just 20 metros produce 63 percent of all patents. In biotechnology, for example, just three metro areas (Boston, San Diego and San Francisco) produce a majority of new biotech firms. Dispersing these researchers–who rely on critical mass and close and serendipitous interaction–would reduce the flow of new ideas that drive economic growth.

The signal characteristic of our economic recovery is that it has been led and driven by the nation’s large metros. Since the economic peak of the last expansion, large metro areas have accounted for about 87 percent of net new jobs in the US economy. This isn’t because they’ve somehow unfairly monopolized resources, but because the kinds of knowledge-based industries that we depend on to propel economic growth–software, business and professional services, and creative industries–all flourish in dense urban environments. Disperse these industries and you undercut the agglomeration economies that underpin their success.

The economic problem with cities is that we don’t have enough of them, or rather, that it’s so difficult and expensive to accommodate more people in the places with the highest levels of productivity. The definitive bit of research on this subject comes from University of California, Berkeley economist Enrico Moretti and his University of Chicago colleague Chang-Tai Hsieh, who have estimated how much less productive the US is than it might be if the growth of the most productive metro economies weren’t limited. Their estimate: 13.5 percent of annual GDP, or more than $1.6 trillion annually.

The irony here, also, is that the wealth and productivity of the nation’s cities underwrite a disproportionate share of the cost of the national government. The nation’s largest metropolitan areas have higher incomes and, given the progressivity of the federal income tax, pay a larger share of the nation’s income taxes. Rural areas–and red states generally–are net recipients of the redistribution produced by federal taxing and spending. Shifting economic activity away from metro areas would reduce productivity and federal tax revenues. And one final twist: current federal tax and spending policies (including the home mortgage interest deduction and highway spending) effectively penalize city dwellers, who are more likely to be renters and to depend on transit.

And finally, it would be worth considering the environmental consequences of dispersing the economic activity in cities. Because city residents drive less, walk, bike and take transit more, and live in smaller and more energy efficient dwellings, large cities turn out to be much more energy efficient and produce fewer greenhouse gases per capita than smaller cities and rural areas. So redistributing city assets would increase carbon emissions and accelerate global warming.

The lesson is not that we need to break up cities, but to create more of them

More and more Americans are looking to move to cities.  This is especially true of younger, well-educated workers. Because the growing demand for urban living is facing a slowly changing supply of urban housing, rents are rising, effectively pricing some workers out of the opportunity to live in these highly productive places. At City Observatory we’ve called out the nation’s “Shortage of Cities,” and argued for policies that would help create more housing opportunities in the most productive places, and promote reinvestment and revitalization of lagging cities.

 

 

The hamster wheel school of transportation policy

Going faster doesn’t mean your city gets anywhere more quickly, and it doesn’t make you happier

One of the key metrics guiding transportation policy is speed:  how quickly can you get from point A to point B. But is going faster a good guide to how we ought to build better places?

When it comes to driving, in particular, the evidence is that making cars go faster doesn’t make places better to live in. In fact, just the opposite. That becomes clear when we look at a cross-section of cities and see how the variation in average roadway speeds corresponds to measures of happiness. Cities with higher travel speeds tend to have more sprawling development patterns, and require people to drive farther for common destinations. Those who live in faster moving places are, on average, less happy with their transportation systems than those who live in slower places. In effect, optimizing a transportation system for speed is just a kind of hamster-wheel school of transportation policy: the wheel goes around farther, but we’re still not going anywhere.

To begin with, we’ve got estimates of the average speed of travel in different metropolitan areas developed by the University of California’s Victor Couture. His data shows that average travel speeds in some metropolitan areas (like Louisville) are 22 percent faster than in the typical large metro area; while in other areas they are slower. Miami’s speeds average about 12 percent less than the typical metro.

The second part of our analysis considers how happy people are with the transportation system in their metropolitan area. Here we examine survey data generated by the real estate analytics firm Porch, which commissioned a nationally representative survey of residents of the nation’s large metropolitan areas and asked them to rate their satisfaction with their local transportation system on a scale of 1 to 5, with 5 being very satisfied. We compared these metro-level satisfaction ratings to Couture’s estimates of relative speeds in each metro area. There’s a bit of a time lag between the two data sources: the survey data is from 2015 while the speed data is from 2008; but as we showed yesterday, the 2008 speed data correlates closely with an independent study of traffic congestion levels in 2016, suggesting that the relative performance of city transportation systems hasn’t changed much in that time period.

Faster Metros don’t have happier travelers

The following chart shows happiness with the regional transportation system on the vertical axis, and average speed on the horizontal axis. Higher values on the vertical (happiness) scale indicate greater satisfaction; larger values on the horizontal (speed) scale indicate faster than average travel speeds. The data show a weak negative relationship that falls short of conventional significance tests (p = .16). While there isn’t a strong relationship between speed and happiness, if anything it leans toward being a negative one: those who live in “faster” cities are not happier with their transportation systems than those who live in slower ones.
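For readers curious about the mechanics, here is a minimal sketch of the kind of metro-level correlation described above. The speed indices and satisfaction scores below are made-up toy numbers, not the actual Couture speed or Porch survey data:

```python
# Minimal Pearson correlation between a metro speed index (1.0 = average)
# and a 1-5 transportation satisfaction score. Toy data for illustration.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Six hypothetical metros: relative speed and mean satisfaction rating.
speed = [1.22, 1.10, 1.05, 1.00, 0.95, 0.88]
happy = [3.2, 3.4, 3.1, 3.5, 3.2, 3.4]

r = pearson_r(speed, happy)
print(round(r, 2))  # → -0.31, a modest negative correlation for this toy data
```

A correlation this weak, on a small sample, is exactly the situation where a relationship can fall short of conventional significance thresholds while still leaning negative.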

 

We have a strong hunch as to why traveling faster might not generate more satisfaction with the transportation system. Faster travel is often correlated with lower density and longer travel distances to common destinations, such as workplaces, schools and stores. In a sprawling, low density metropolitan area with great distances between destinations, much of the potential savings in travel time may be eaten up by having to travel longer distances. A complementary explanation is that places with faster speeds may be ones where proportionately more travel occurs on higher speed, higher capacity roads, such as freeways, parkways and major arterials, as opposed to city streets. The higher measured speed may be a product of traveling long distances at high speeds in some cities, as opposed to much shorter average trips on slower city streets in others.

Faster travel is correlated with more driving

To explore this hypothesis, we compared average vehicle miles traveled (VMT) per person per day, as reported by the US Department of Transportation, to the average estimated speeds for metropolitan areas.  Both of these sets of observations are for 2008. The following chart shows VMT per capita on the vertical axis and average speed on the horizontal axis. As we thought, there’s a strong positive relationship between speed and distance traveled. People who live in places with faster speeds drive more miles per day.

More driving is associated with less satisfaction with metro transportation

To tie this all together, we thought we’d look at one more relationship:  How does distance traveled affect happiness with an area’s transportation system? This final chart shows the happiness (on the vertical axis) and vehicle miles traveled (on the horizontal axis). Here there is a strong negative relationship: the further residents drive on a daily basis, the less happy they are with their metro area’s transportation system.

We think this chart has an important implication for thinking about cities and transportation. Instead of focusing on speed, which seems to have little if any relationship to how people view the quality of their transportation system, we ought to be looking for ways to influence land use patterns so that people don’t have to travel as far. If we could figure out ways to enable shorter trips and less travel, we’d have happier citizens. It’s time to get off the hamster wheel.



Are restaurants dying, and taking city economies with them?

Alan Ehrenhalt is alarmed. In his tony suburb of Clarendon, Virginia, several nice restaurants have closed. It seems like an ominous trend. Writing at Governing, he’s warning of “The Limits of Café Urbanism.” Café urbanism is a “lite” version of the consumer city theory propounded by Harvard’s Ed Glaeser, who noted that one of the chief economic advantages of cities is the benefits they provide to consumers in the form of diverse, interesting and accessible consumption opportunities, including culture, entertainment and restaurants.

While the growth of restaurants has coincided with the revival of Clarendon in the past decade, all this seems a bit insubstantial to Ehrenhalt. He worries that if the urban economic revival is built upon the fickle tastes of restaurant consumers–as it were, on a foundation of charred octopus and bison carpaccio–city economies could be vulnerable. What, Ehrenhalt worries, will happen if the growth of these restaurants peters out?

That may already be happening. In 2016, according to one reputable study, the number of independently owned restaurants in the United States — especially the relatively pricey ones that represent the core of café urbanism — declined by about 3 percent after years of steady growth. The remaining ones were reporting a decline in business from a comparable month in the previous year.

There are a couple of problems with this “restaurant die-off” story. First, it’s a bit over-generous to suggest that restaurants themselves are the principal economic force behind urban economic revival. The growth of restaurants is more a marker of economic activity than the driver. Restaurants are growing because cities are attracting an increasing number of well-educated and productive workers, which drives up the demand for a range of local goods, including restaurants. While restaurants contribute to the urban fabric, they are more a result of the urban rebound than a cause.

Second, the data clearly show that the restaurant business continues to expand. If anything, nationally, we’re in the midst of a continuing and historic boom in eating out. In 2014, for the first time, the total amount of money that Americans spent on food consumed away from home exceeded the amount they spent on food for consumption at home. There may come a time when Americans cut back and spend less on eating out, but that time is not now at hand: according to Census Bureau data, through January 2017, restaurant sales were up a robust 5.6 percent over a year earlier.

Ehrenhalt’s data about the decline in independent restaurants is apparently drawn from private estimates compiled by the consulting firm NPD, which last spring reported a decline of 3 percent in independent restaurants, from 341,000 units to 331,000 units in the prior year. NPD’s data actually compared 2014 and 2015 counts of restaurants. But the NPD estimates aren’t borne out by data gathered by the Census Bureau and Bureau of Labor Statistics, which show the number of restaurants steadily increasing. The counts from the BLS show the number of restaurants in the US increasing by about 2 percent in 2016, an acceleration in growth from the year earlier.

 

 

At City Observatory, we’ve seen a steady stream of articles lamenting the demise of popular restaurants in different cities, each replete with chefs’ tales of financial woe and burdensome regulation. (The reason never seems to be that the restaurant was poorly run, served bad food, had weak service, or simply couldn’t compete.) The truth is that failures are commonplace in the restaurant business. No one should be surprised that an industry that puts such a premium on novelty has a high rate of turnover. Government data show that something like 75,000 to 90,000 restaurants close each year, which means the mortality rate, even in good years, is around 15 percent. The striking fact about the closure data is that the trend has been steadily downward for most of the past decade.

So nationally, here’s what we know about the restaurant industry:

  • Americans are spending more at restaurants now than ever before, and now spend more eating out than eating at home
  • The number of restaurants is at an all-time high, having increased by a net 40,000 over the past five years.
  • Restaurant closings are common, but declining.

None of this is to say that Ehrenhalt isn’t right about the restaurant scene in his neighborhood. The fortunes of neighborhoods, like restaurants themselves, wax and wane. But even in Ehrenhalt’s upscale Virginia suburb, which is part of Arlington County, government data show no evidence of a widespread restaurant collapse. Data from the Bureau of Labor Statistics show that there’s been a sustained increase in the number of restaurants in Arlington County. Arlington County now has 580 restaurants, an increase of about 10 percent from its pre-recession peak.


It appears that we’re still moving in the direction of what some have called an “experience economy.” And there are few more basic (or enjoyable) experiences than a good meal. One of the economic advantages of cities is the variety and convenience of dining choices. While individual establishments will come and go, the demand for urban eating seems to be steadily increasing. So, far from being a portent of economic decline, we think cafe urbanism will be with us–and continue to grow–for some time.

What Travis Kalanick’s meltdown tells us about Uber

As has been well chronicled in the media, it’s been a tough month for Uber. The company’s CEO, Travis Kalanick, was vilified in the press for the company’s tolerance for sexual harassment of its female employees, and derided for his participation in President Trump’s business advisory council (from which he resigned after an estimated 200,000 people cancelled their accounts with Uber). Finally, he was recorded in a shouting match with a San Francisco Uber driver, who claimed to have lost $7,000 because of Kalanick’s changes to Uber’s reimbursement policies.

Kalanick is shown telling the driver, Fawzi Kamel, to take responsibility for his own “s***”, and storming out of the car.  Kalanick has since apologized.

But tirades and tempers aside, the conversation between driver Kamel and CEO Kalanick is actually very revealing about Uber’s financial predicament.  Kamel complains that while Uber started as a premium service and paid drivers relatively high rates, over time the company has been cutting back on the amount it pays drivers.

Kalanick bristles at this criticism (arguing that Uber still pays higher rates for its premium “black” service), but also concedes that he’s been pushed to lower rates to meet the competition provided by Lyft and other transportation network companies.  Bloomberg Businessweek has transcribed their conversation:

Then Kamel says what every driver has been dying to tell Kalanick: “You’re raising the standards, and you’re dropping the prices.”

Kalanick: “We’re not dropping the prices on black.”

Kamel: “But in general the whole price is—”

Kalanick: “We have to; we have competitors; otherwise, we’d go out of business.”

Kamel: “Competitors? Man, you had the business model in your hands. You could have the prices you want, but you choose to buy everybody a ride.”

Kalanick: “No, no no. You misunderstand me. We started high-end. We didn’t go low-end because we wanted to. We went low-end because we had to because we’d be out of business.”

This, in a nutshell, is Uber’s problem: It’s losing money, and its competition is forcing it to lose even more money, in order to stay in business. In an effort to stay afloat, Uber’s passing its pain on to drivers, inventing a raft of lower-priced services (UberX, UberPool) and offering lower reimbursements to their drivers.  Kalanick’s admission that competition is putting a cap on Uber’s prices–and profits–suggests that Uber’s $69 billion valuation may be excessive and that Uber’s critics may be right about the viability of its business model.  The most strident critics maintain that the company will likely implode from its growing losses. Jalopnik’s Ryan Felton has been unstinting in his criticism of the company. Leaked financial reports from the company, analyzed by Hubert Horan at Naked Capitalism  make a strong case that the company’s investors are subsidizing something like 59 percent of the cost of rides.

(Flickr: Kaysha)

Two Questions for Uber

It remains to be seen whether the ride-sharing model is really economically viable, especially in the face of competition. Our view at City Observatory has been that promoting competition among providers is a good thing, as a way of lowering prices and encouraging innovation: ‘Let a thousand Uber’s bloom‘ we said. And ultimately competition will help determine whether this business model actually makes any sense. To date, the companies have been propped up by the influx of money from venture capitalists, and, arguably, the willingness of driver/contractors to work for modest (and perhaps exploitative) wages. Ultimately, investors will have to ask themselves two questions:

Question 1:  What happens if you have dominant market share in a money-losing industry?

Answer:   You lose more money than your competitors.

Question 2:  What happens when demand for your product increases in a money-losing industry?

Answer:  You lose even more money, faster.

In theory, you can make the argument that paying independent contractor drivers is just a short-term strategy for Uber until it perfects self-driving cars, at which point it will be spared the expense of paying (and also arguing with) Mr. Kamel and several hundred thousand other drivers. The success of that strategy depends on Uber overcoming yet another group of competitors, including other technology companies and auto makers to build and operate fleets of self-driving cars. Of course, the latest bit of news is that Google has accused Uber of stealing intellectual property relating to autonomous vehicles.

There’s no question that ride-sharing and transportation network companies are “disruptive technologies.” But how disruptive they are depends directly on the prices they charge. The growth of Uber and Lyft is significantly due to the fact that their fares are lower than taxis and their service is better than taxis or transit. Earlier this week, a study of New York traffic trends attributed the rise in transportation network companies to the relatively low price of their service. The impact, and ultimately the success, of these companies depends on what fares their customers are willing to pay. If Uber’s fares were, say, to double, it’s likely that its growth would decelerate significantly, and its mode share might actually decline.


Getting to critical mass in Detroit

Last month, we took exception to critics of Detroit’s economic rebound who argued that it was a failure because the job and population growth that the city has enjoyed has only reached a few neighborhoods, chiefly those in and around the downtown. A key part of our position was that successful development needs to achieve critical mass in a few locations because there are positive spillover effects at the neighborhood level. One additional house in each of 50 scattered neighborhoods will not have the mutually reinforcing effect of building 50 houses in one neighborhood. Similarly, building new housing, a grocery store, and offices in a single neighborhood makes them all more successful than they would be if they were spread out among different neighborhoods. What appears to some as “unequal” development is actually the only way that revitalization is likely to take hold in a disinvested city like Detroit.  That’s why we wrote:

. . . development and city economies are highly dependent on spatial spillovers. Neighborhoods rebound by reaching a critical mass of residents, stores, job opportunities and amenities.  The synergy of these actions in specific places is mutually self-reinforcing and leads to further growth. If growth were evenly spread over the entire city, no neighborhood would benefit from these spillovers. And make no mistake, this kind of spillover or interaction is fundamental to urban economics; it is what unifies the stories of city success from Jane Jacobs to Ed Glaeser.  Without a minimum amount of density in specific places, the urban economy can’t flourish.  Detroit’s rebound will happen by recording some small successes in some places and then building outward and upward from these, not gradually raising the level of every part of the city.

While this idea of agglomeration economies is implicit in much of urban economics, and while the principle is well-understood, it’s sometimes difficult to see how it plays out in particular places. A new research paper prepared by economists Raymond Owens and Pierre-Daniel Sarte of the Federal Reserve Bank of Richmond and Esteban Rossi-Hansberg of Princeton University tries to explore exactly this issue in the city of Detroit. If you don’t want to read the entire paper, CityLab’s Tanvi Misra has a nice non-technical synopsis of the article here.

The important economic insight here is the issue of externalities: In this case, the success of any person’s investment in a new house or business depends not just on what they do, but on whether other households and businesses invest in the same area. If a critical mass of people all build or fix up houses in a particular neighborhood (and/or start businesses), they’ll benefit from the spillover effects of their neighbors. If they invest–and others don’t–they won’t get the benefit of these spillovers.

Analytically this produces some important indeterminacy in possible outcomes. Multiple different equilibria are possible depending on whether enough people, businesses, developers and investment all “leap” into a neighborhood at a particular time. So whether and how fast redevelopment occurs is likely to be a coordination problem.
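
This indeterminacy is easy to see in a toy threshold model. To be clear, the sketch below is our illustration, not the model from the paper: each potential investor builds only once enough others have committed, so the same neighborhood can settle at zero investment or full investment depending on the starting point.

```python
# Toy coordination model (illustrative only -- not from the Owens,
# Rossi-Hansberg and Sarte paper): agent i invests once the number of
# current investors reaches agent i's personal threshold.

def equilibrium(thresholds, guaranteed=0):
    """Run best-response dynamics to a fixed point.

    `guaranteed` is the number of investors committed up front (e.g. via
    a hypothetical investment-guarantee program). Returns the total
    number of investors at the stable outcome.
    """
    count = guaranteed
    while True:
        new_count = guaranteed + sum(1 for t in thresholds if t <= count)
        if new_count == count:
            return count
        count = new_count

# Twenty potential investors; investor i waits for i others to move first.
thresholds = list(range(1, 21))
print(equilibrium(thresholds, guaranteed=0))  # no one moves first: 0 investors
print(equilibrium(thresholds, guaranteed=1))  # one guaranteed pioneer tips all 21
```

The guarantee is tiny relative to its effect, which is exactly the point the paper's thought experiment turns on: with the right expectations in place, the guarantee never has to pay out.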

Without coordination among developers and residents Owens, Rossi-Hansberg and Sarte argue, some neighborhoods that arguably have the necessary fundamentals to rebound won’t take off. Immediately adjacent to downtown Detroit, for example, there are hundreds of acres of vacant land that offer greater proximity to downtown jobs and amenities than other places. Why, the authors ask, “do residential developers not move into these areas and develop residential communities where downtown workers can live?”

To answer that question, the NBER paper builds a very complex economic model that represents these spillover effects, and estimates the potential for each neighborhood to add value if it can move from its current underdevelopment equilibrium. In this map illustrating their findings, the neighborhoods with the darkest colors have the highest potential value if development takes place.

The authors measure the potential for future growth by estimating the total increase in rents associated with additional housing development and population growth in each neighborhood. Some neighborhoods are well-positioned for development to take off, and would show the biggest gains in activity, if the coordination problem could be overcome. That coordination problem is apparent in neighborhoods near downtown Detroit: even though it would make sense to invest, no one wants to be the first investor, for fear that others won’t invest.  So Owens, Rossi-Hansberg and Sarte suggest this obstacle might be overcome if we could create a kind of “investment insurance”–if you invest in this neighborhood, then we’ll guarantee a return on your home or business.

As a thought experiment, the authors estimate the size of a development guarantee that would be needed to trigger the minimum level of investment required to get a neighborhood moving toward rebuilding. In theory, offering developers a financial guarantee that their development would be successful could get them to invest in places they wouldn’t choose to invest today. That investment, in turn, would trigger a kind of positive feedback effect that would generate additional development, and the neighborhood would break out of its low-development equilibrium. If the authors’ estimates are correct, it’s unlikely that the guarantees would actually need to be paid.

While this concept appears sound in theory, much depends on getting the estimates right, and also on figuring out how to construct a system of guarantees that doesn’t create its own incentive problems. In effect, however, this paper should lend some support to those in Detroit who are attempting to make intensive, coordinated investments in a few neighborhoods.

More broadly, this paper reminds us of the salience of stigma to neighborhood development. Once a neighborhood acquires a reputation in the collective local consciousness for being a place that is risky, declining, crime-ridden or unattractive, it may be difficult or impossible to get a first mover to make the investment that could turn things around. The collective action problem is that no one individual will move ahead with investment because they fear (rationally) that others won’t, based on an area’s reputation.  A big part of overcoming this is some action that changes a neighborhood’s reputation and people’s expectations, so that they’re willing to undertake investment, which then becomes a self-fulfilling prophecy.  While economists tend to think that the only important guarantees are financial, there are other ways that city leaders could actively work to change a neighborhood’s reputation and outlook and give potential residents and investors some assurance that they won’t be alone if they are among the first to move.  New investments, for example, like the city’s light rail system, may represent a signal that risks are now lower in the areas it serves than they have been.

The implications of shrinking offices

The amount of office space allotted to each worker is shrinking. What does that mean for cities?

Last week a new report from real estate analytics firm REIS caught our eye. Called “The Shrinking Office Footprint” this white paper looks at changes in the demand for office space over the last couple of business cycles.  The full report is available free (with registration) from REIS.

An increasing share of jobs in the US economy are in the kinds of industries and occupations that are housed in leased office buildings. Knowledge-based industries like finance, software, business and management consulting services, marketing and communications, and a range of similar businesses house most of their employees in commercial offices. Of course, investors in the real estate business keenly follow data on office lease rates and vacancy trends to see where it is most profitable to buy or build new office buildings. And the leasing of commercial offices is a useful indicator of changes in economic activity.

Cube farm (Flickr: Steve)

The REIS report offers up a number of interesting findings. Overall, their data (which stretch back to 1999) illustrate the depth and severity of the Great Recession. When the economy nose-dived in 2008, businesses laid off employees, and lots of office space went begging. And while vacancy rates shot up, they actually understated the extent of the impact on real estate.  Many firms had five-, ten- or even fifteen-year leases on their office space, and were stuck with “shadow” space.

As a result, as the economy began to recover, there was lots of room (literally) for companies to expand their payrolls without expanding their real estate footprint. There’s consequently a clearly cyclical pattern to the relationship between hiring and new office space leasing.  Early in a recovery, when firms are filling up un-used or under-used shadow space, they consume relatively small amounts of additional office space per new employee.  As the recovery matures, more firms reach or outgrow their capacity, and then lease additional space.  (You see this pattern clearly in the REIS data: square feet absorbed per new employee rises through the business cycle.)

What’s more interesting, though, is how the amount of office space per employee has steadily declined in each successive business cycle.  The metric to pay attention to here is “net absorption” per added office employee.  Net absorption is the difference between the square footage of office space newly leased and the square footage that becomes vacant. In the expansion of the late 1990s, REIS reports that the average additional employee was associated with about 175 additional square feet of office space.  In the late 2000s, up until the Great Recession, the typical added employee was associated with about 125 square feet of additional space.  During this decade, each added employee has been associated with only about 50 square feet of additional office space. Nationally, we’ve added about 3.5 million office workers, and leased about 180 million additional net square feet of office space. (See the red lines on the following chart).

Declining space per employee (REIS)
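
The roughly 50-square-foot figure falls straight out of the national totals quoted above; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the national figures cited above.
added_office_workers = 3_500_000    # office workers added this decade
net_absorption_sqft = 180_000_000   # net square feet leased over the same span

sqft_per_added_worker = net_absorption_sqft / added_office_workers
print(round(sqft_per_added_worker))  # ≈ 51 square feet per added employee
```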

What this reflects is a number of things: the companies that are growing may be those that make the most efficient use of space.  While space intensive industries like manufacturing have been growing slowly or declining, space efficient industries like software have been growing. We also know that hotelling and remote work–arrangements where employees share space, rather than having dedicated offices or cubicles–enables firms to accommodate more workers in any given amount of space. While some of the relatively low rate of absorption per worker represents a hangover from the recession’s “shadow space,” Reis believes that much of the decline in space per worker is permanent: they conclude that “lower net absorption is likely a lasting trend.”

What does this mean for city economies?  While it may mean that fewer office buildings get built than would have been the case if the old space-per-worker ratios had held, it also suggests that the current building stock has more capacity to accommodate additional jobs than it did in the past. Even without building new offices, cities can expand employment. Greater space efficiency also means that companies will have to pay to rent fewer square feet per employee, meaning that the cost of office space is a relatively less important factor in driving business costs. Commercial real estate brokerage CBRE estimates that for a typical 500-employee software firm, office expenses represent just 6 percent of costs, compared to 94 percent for employee labor.

And notice that this analysis implies an average of about 150 square feet per employee (75,000 / 500).  If occupancy rates are as low as 50 to 100 square feet per employee, as the new REIS analysis suggests, office costs may be an even lower fraction of total costs. More space-efficient businesses are more likely to locate in urban centers, where accessibility and proximity to a wide range of services and activities are an advantage in attracting and retaining talent.
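
To see why shrinking footprints make rent a second-order cost, here’s a hypothetical sensitivity check. Only the 150-square-foot baseline and the roughly 6 percent cost share come from the figures above; the rent and labor numbers are invented to make the arithmetic concrete.

```python
# Hypothetical cost split: how the office share of total per-employee
# costs moves as space per employee shrinks. The rent and labor figures
# are assumptions chosen to match the ~6 percent share at 150 sq ft.
ANNUAL_RENT_PER_SQFT = 40.0           # assumed $/sq ft/year
ANNUAL_LABOR_PER_EMPLOYEE = 94_000.0  # assumed salary + benefits

def office_share(sqft_per_employee):
    """Office rent as a fraction of (rent + labor) cost per employee."""
    office = sqft_per_employee * ANNUAL_RENT_PER_SQFT
    return office / (office + ANNUAL_LABOR_PER_EMPLOYEE)

for sqft in (150, 100, 50):
    print(sqft, f"{office_share(sqft):.1%}")  # 6.0%, 4.1%, 2.1%
```

Halving the footprint per worker cuts the office share of costs by more than half of its already-small level, which is why rent recedes as a location factor relative to talent.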

Ultimately, this should also make a significant difference to how we plan and build our cities. Many land use plans assume fixed, or even growing, ratios of space per worker. That’s part of what prompts cities to plan for growth at the “extensive margin”–i.e. by setting aside more land for further commercial and industrial uses, often at the urban fringe. But the shrinking footprint per worker suggests that we’ll need a lot less land for this kind of extensive growth because we can accommodate more employment by using our existing lands and buildings more efficiently, i.e. growing at the intensive margin.


What we know about rent control

Today, partly as a public service, we’re going to dig into the academic literature on an arcane policy topic: rent control. We also have a parochial interest in the subject: the Oregon Legislature is considering legislation that would lift the state’s ban on cities imposing rent control. The legislation is being proposed by Oregon House Speaker Tina Kotek.

Indeed. But how to fix it is still a question. (Flickr: Tiger Pixel)

For decades, one of the few topics on which nearly all economists have agreed is that rent control is a bad thing: it discourages new investment in housing and in housing maintenance; it tends to reduce household mobility, encourages the conversion of apartments into condominiums (removing them from the rental housing supply), and leads to the misallocation of housing over time.

In response to the economists’ objections, one of the arguments that rent control advocates make is to draw a distinction between “bad, old first generation rent control” and “new, improved second generation rent control.” Yes, these advocates concede, back in the day there were poorly designed systems that involved rent freezes, and which had the effect of reducing housing supply.  But today, we are told, rent controllers are smarter, and have developed new types of rent control that are supposedly free of these negative effects.  As Speaker Kotek described it to The Portland Oregonian:

What you’re hearing from landlords about rent control is they have an idea of it that’s very much the model that began right after World War II where properties had hard, fast caps on rents. That’s not the kind of rent control we’re talking about. We’re talking about second-generation rent stabilization where there’s a process for managing rent increases that protects investors and tenants.

And to bolster their point, rent control advocates will sometimes quote from two economists–Tony Downs and Richard Arnott–who’ve explored the differences between first and second generation rent controls. They point out–correctly–that Downs and Arnott have identified some important differences in rent control regimes. But neither of them actually endorses rent control, especially of the kind that’s likely to be on offer under the proposed Oregon legislation.

First Generation and Second Generation Rent Control

The first article that some point to is a 1988 Urban Institute report authored by Brookings Institution economist Anthony Downs: Residential Rent Controls: An Evaluation. Downs distinguishes between “strict” and “temperate” rent control regimes. But Downs is clear that there’s a wide continuum, and that many different features of rent control affect its stringency: the share of the housing stock that is covered, whether there is vacancy decontrol, and whether the ordinance allows automatic rent increases and generous allowances for increases to cover the cost of maintenance or improvements. Stringent rent control has the worst effects; temperate rent control the least.

As Michael Lewyn notes, the reason that “temperate” rent control regimes haven’t been shown to have much of an adverse effect on supply is because they don’t control rents.  Let’s take a close look at what Anthony Downs had to say about differences in rent control regimes in different cities.  New York’s stringent rent control holds rents for controlled apartments about 57 percent below market rates. In contrast, the more temperate controls in Los Angeles reduced rents by only about 3.5 percent.

Similarly, advocates of rent control sometimes point to the work of Richard Arnott, who like Downs, distinguishes between different levels of stringency in rent control regimes. In 1995, Arnott wrote an article for the Journal of Economic Perspectives asking “Time for revisionism on rent control?” Arnott has argued that systems of rent control that include vacancy decontrol (i.e. that let landlords raise rents on vacant apartments to whatever level they like) would be unlikely to have the same kind of negative effects as first generation rent control schemes.

Many people seized on Arnott’s article as an endorsement of second-generation strategies. So much so that in 2003, Arnott went to the trouble to specifically deny any support for such strategies, writing an article entitled “Tenancy Rent Control” in The Swedish Economic Policy Review. While advocates imply that Arnott therefore supports rent control, Arnott himself made it clear he does not, stating that “most second generation rent control programs .  .  . have been on balance harmful.”

Far from endorsing rent control, both Downs and Arnott make it clear that rent control regimes that actually have the effect of lowering rents significantly below what the market would otherwise provide would have negative economic consequences.

Proposed Legislation: Promising Benign, Enabling Malignant

More to the point, the legislation on offer in Oregon–Speaker Kotek’s HB 2001 and HB 2004–contains no provisions that preclude cities from enacting the damaging kind of strict or first generation rent control. The first bill, HB 2001, begins with a moratorium on rent increases of more than five percent. And nothing in the legislation precludes Oregon cities from adopting rent control with the most demonstrably damaging features, including applying rent controls to new construction. The other bill, HB 2004, simply repeals the state ban on city- and county-imposed rent controls altogether.

Plainly there are better and worse forms of rent control: those that do not apply to many (or most) units, that allow landlords to raise rents regularly in line with inflation, and that fully decontrol apartments when they become vacant arguably have fewer negative effects than more stringent measures. But these two proposed bills allow “bad” rent control just as much as they allow “less bad” rent control. Consider just one feature of rent control:  “vacancy decontrol.” Under vacancy decontrol, an incumbent tenant has protection against rent increases above some level, but when a unit becomes vacant, landlords are free to raise the rent to whatever level they want. Downs views this as an essential element of temperate regimes; Arnott makes it the centerpiece of his definition of “second generation” rent control. Nothing in HB 2001 or 2004 requires vacancy decontrol; as a result, it’s simply not accurate to cite either Downs’ or Arnott’s research as supporting such legislation.

So here’s the takeaway: Neither Downs nor Arnott endorses second generation or less stringent rent controls. Both agree that measures that effectively limit rents have negative consequences and are, in Arnott’s words, “on balance harmful.”  But even if there were some forms of rent control that had fewer negative effects, there’s nothing in the legislation that’s so far been proposed in Oregon that would preclude cities or counties from adopting some of the worst, most disruptive forms of rent control.


Houston (Street), we have a problem.

A lesson in the elasticity of demand, prices and urban congestion. It looks like Uber, Lyft and other ride sharing services are swamping the capacity of New York City streets

Every day, we’re being told, we’re on the verge of a technological revolution that will remedy our persistent urban transportation problems. Smart cities, replete with sensors, command centers, and links to an Internet of Things, will move us ever more quickly and effortlessly to our destinations; traffic lights and even traffic itself will become things of the past. The most fully deployed harbinger of the change that such technology has wrought is clearly the app-based ride-hailing services, including Uber, Lyft and others. A few weeks back, we took to task claims from one study that a couple thousand autonomously piloted 10-passenger vans could virtually eliminate traffic congestion in Manhattan.

A new report from New York should give the techno-optimists reason to pause. Transportation consultant Bruce Schaller, who previously served as New York City’s deputy transportation commissioner, has sifted through the detailed records of New York City’s Taxi and Limousine Commission and painted a stark picture of growing traffic congestion in the city–thanks to the increasing proliferation of app-based ride hailing services, which the report refers to as transportation network companies (TNCs). Schaller has a shorter op-ed summarizing his report in the New York Daily News, but you’ll want to have a look at the entire 38-page report, entitled “UNSUSTAINABLE? The Growth of App-Based Ride Services and Traffic, Travel and the Future of New York City.” The full report is worth a read, but here are some highlights.

  • The total number of miles driven in New York City by TNCs vehicles has increased by 600 million miles in the past three years.
  • Trips taken by the combination of taxis and TNCs have outpaced the increase in trips taken by transit; for the preceding 24 years (since 1990), growth in person trips had been led by increased transit ridership.
  • The additional vehicle traffic associated with TNCs amounts to about a 7 percent increase in traffic levels in Manhattan, about the same amount of traffic that the city’s cordon-pricing proposal was supposed to reduce.
  • The bulk of the increase in traffic associated with TNCs has been in the morning and evening peak hours.
  • While TNCs initially grew mostly by taking traffic from yellow cabs, increasingly they are taking riders from transit, and in some cases stimulating additional travel.

Bottom line: the growth of TNCs is increasing the volume of vehicles on New York City streets, and adding to congestion and delays. Interestingly, this key finding is a turnaround for Schaller, who acknowledges that just a year ago (in January 2016), he was part of a team that concluded on behalf of Mayor de Blasio that the added traffic from ride hailing services wasn’t increasing the city’s congestion. The growth in TNC volumes over the past year has apparently changed his mind.

What’s changed? In short, the price of car travel has fallen in New York City, relative to the alternatives. Previously, a combination of high taxi fares and limited entry (a fixed number of medallions), coupled with prohibitively expensive parking rates in most of Manhattan, meant that taking a private car cost four or five times as much as the average transit fare. But with Uber and Lyft charging much lower rates, and flooding the market with additional vehicles, there’s been a noticeable uptick in traffic volumes. This is a fundamental lesson in economics: increasing the supply of vehicles and lowering prices is going to trigger additional demand. And in the face of limited street capacity, congestion is likely to increase. And we should keep in mind that the TNCs are just a dress rehearsal for fleets of autonomous vehicles. Subtract the cost of paying drivers, and they promise to be even cheaper and more plentiful than Uber and Lyft are today, especially in high density urban markets.
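
The underlying logic is just the price elasticity of demand. Here is a hypothetical constant-elasticity sketch; the elasticity value is our assumption for illustration, not a figure from the Schaller report:

```python
# Constant-elasticity demand: Q1 = Q0 * (P1/P0) ** elasticity.
# The -0.8 elasticity is an illustrative assumption.
def trips_after_price_change(base_trips, price_ratio, elasticity=-0.8):
    """Trips demanded after fares move to `price_ratio` times the old level."""
    return base_trips * price_ratio ** elasticity

# Fares cut to 60 percent of the old level: trips rise by roughly half.
print(round(trips_after_price_change(100_000, 0.6)))
# Fares doubled instead: trips fall by over 40 percent.
print(round(trips_after_price_change(100_000, 2.0)))
```

The same arithmetic works in both directions, which is why cheap TNC fares swell traffic now, and why a doubling of fares would be expected to shrink Uber’s mode share.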

Ultimately, the solution to this problem will come from correctly pricing the use of the city’s scarce and valuable street space, particularly at rush hour. Schaller makes this very clear:

As they steadily cut fares, TNCs are erasing these longstanding financial disincentives for traveling by motor vehicle in Manhattan. If TNC growth continues at the current pace (and there is no sign of it leveling off), the necessity of some type of road pricing will become more and more evident.

The detailed data from Uber and Lyft, however, point up the major limitations of the cordon-pricing scheme that was proposed for New York City a decade ago (under cordon pricing, vehicles entering lower Manhattan, below 96th or 110th Street, or crossing the Hudson or the East River would pay a daily toll). Because so much traffic and so many rides begin and end within those boundaries, the cordon pricing scheme does nothing to disincentivize travel in the center once a vehicle has paid the toll.

Consequently, if pricing is going to work in Manhattan, it will probably have to be some kind of zoned, time-of-day pricing, charging higher rates for travel in and through Manhattan during peak hours, with much lower fees for travel in outlying boroughs and at off-peak hours. In effect, Uber’s much-maligned “surge” pricing is a proof of concept for this model; it just has to have prices reflect the scarcity and value of the publicly owned roadway rather than just the momentary scarcity of Uber’s privately owned vehicles.  The GPS and mobile Internet technology that’s now been proven in taxis and TNC vehicles shows that such a system is technically quite feasible.
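
A zoned, time-of-day scheme is simple to specify. In the sketch below, the zones, hours, and dollar amounts are entirely invented to show the shape of such a fee schedule, not to propose actual rates:

```python
# Illustrative zoned, time-of-day road pricing. All zones, hours and
# dollar amounts below are hypothetical.
PEAK_HOURS = set(range(7, 10)) | set(range(16, 19))  # 7-10am and 4-7pm

FEES = {  # (zone, is_peak_hour) -> per-trip charge, dollars
    ("core", True): 5.00,
    ("core", False): 2.00,
    ("outer", True): 1.00,
    ("outer", False): 0.25,
}

def trip_fee(zone, hour):
    """Charge for a trip starting in `zone` at `hour` (0-23)."""
    return FEES[(zone, hour in PEAK_HOURS)]

print(trip_fee("core", 8))    # rush-hour trip through the center: 5.0
print(trip_fee("outer", 14))  # midday trip in an outlying borough: 0.25
```

Unlike a one-time cordon toll, every trip pays according to where and when it uses scarce street space, which is exactly the property the cordon scheme lacked.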

Without some form of road pricing, the high concentration of profitable fares in denser neighborhoods and at peak hours, coupled with the additional financial inducement of surge pricing bonuses, could lead ever greater volumes of TNC vehicles to clog city streets. As New York transportation expert Charles Komanoff puts it, the Schaller report settles the question of whether TNCs are making the city's gridlock worse. He effectively calls the report a must-read:

It touches on virtually every consequential transportation trend and policy question facing the five boroughs and stands as the most thoughtful and thorough analysis of New York City traffic and transportation issues since the Bloomberg years.

But New York is just on the leading edge of a range of technological and policy issues that every city is going to have to confront in the years ahead.  If you want to get ahead of the curve–and think about where the expansion of TNCs, and ultimately autonomous vehicles is taking us–here is a good place to start.

Cursing the candle

How should we view the early signs of a turnaround in Detroit?

Better to light a single candle than simply curse the darkness. The past decades have been full of dark days for Detroit, but there are finally signs of a turnaround, a first few glimmers that the city is stemming the downward spiral of economic and social decline. But for at least a few critics that’s not good enough: not content with cursing the darkness, they’re also cursing the first few candles that have been lit, for the sin of failing to resolve the city’s entire crushing legacy of decline everywhere, for everyone, and all at once.


Michigan State political scientist Laura Reese and Wayne State urban affairs expert Gary Sands have written an essay, "Detroit's recovery: The glass is half-full at best," for The Conversation, which was reprinted at CityLab as "Is Detroit really making a comeback?" The article is based on a longer academic treatment of this subject by Reese, Sands and co-authors, entitled "It's safe to come, we've got lattes," in the journal Cities. (This is one of those rare cases where the mass media version of an article is more measured and less snarky than the title of the companion academic piece, but I digress.)

Reese and Sands set about the apparently obligatory task of offering a contrarian view to stories in the popular press suggesting that Detroit has somehow turned the corner on its economic troubles and is starting to come back. We, too, are wary of glib claims that everything is fine in Detroit.  It isn’t. The city still bears the deep scars of decades of industrial decline coupled with dramatic failure of urban governance. The nascent rebound is evident only in a few places.

There’s a kind of straw man argument here.  Is Detroit “back?” As best I can tell, no one’s making that argument. The likelihood that the city will restore the industrial heyday of the U.S. auto industry, replete with a profitable oligopoly and powerful unions that negotiate high wages for modestly skilled work, just isn’t in the cards.  As Ed Glaeser has pointed out, it’s rare that cities reinvent their economies.  But when they do–as in the cases of Boston and New York–it’s because they’ve managed to do an extraordinary job of educating their local populations, and that base of talent has served as the critical resource for generating new economic activity. Detroit’s still far from that point.

And no one should think a renaissance will happen quickly, if it happens at all. History is littered with examples of once flourishing cities that failed for centuries to find a second act: Athens was long deserted, Venice had its empire and economy collapse, Bruges had its harbor silt up. In each case, these cities' early economies lived hard, died young and left a beautiful (architectural) corpse. It's really only been in the 20th century that each of these cities revived to any degree after their historical decline.

That said, there's clear evidence that Detroit has stanched the economic hemorrhage. After a decade of year-over-year job losses, Wayne County has chalked up five consecutive years of year-over-year job growth. True, the county is still down more than 150,000 jobs from its peak, but it has gained back 50,000 jobs in the past five years.

While this article presents a number of useful facts that remind us how far Detroit has to go, there are a lot of unresolved contradictions here. In successive paragraphs, the authors decry the lackluster performance of Detroit home prices (they're still way below housing bubble levels and haven't rebounded nearly as well as in other cities), but then go on to lament the unfolding gentrification of the city. You can't have it both ways. Either housing is cheap and devalued, or the city is becoming more expensive.

Why neighborhood level equality is a misleading metric for urban well-being

Reese and Sands seem to be upset that Detroit’s nascent recovery is somehow unequal; that some parts of the city are rebounding while others still decline.

“. . . within the city recovery has been highly uneven, resulting in greater inequality.”

Detroit’s problem is not inequality, it’s poverty.  As the Brookings Institution’s Alan Berube put it:

“Detroit does not have an income inequality problem—it has a poverty problem. It’s hard to imagine that the city will do better over time without more high-income individuals.”

To be sure, more higher income residents and new restaurants, condos and office buildings may bring poverty into sharper contrast, but had those same higher income jobs and households located in the suburbs (or some other city), it's far from obvious that poor Detroiters would somehow be better off.

As a result, the only way that Detroit is likely to improve its economy is to become at least somewhat less equal. The city has a relatively high degree of equality at a very low level of income. The reason this has occurred, in large part, is that those upper and middle income households—those with the means to do so—have exited the city in large numbers, leaving poor people behind.

We’ve long called out the misleading nature of inequality statistics when applied to small geographies. What’s called “inequality” at the neighborhood level is actually a sign of economic mixing, or economic integration—a neighborhood where high, middle and low income families live in close proximity and where there are housing opportunities at a range of price points.

Incomes in central cities are almost always more unequally distributed than in the metropolitan areas in which they are located, but this is because cities are more diverse and inclusive. At small geographies, this statistic says more about integration than it does about inequality.  

At a highly local level, "equality" is generally achieved in one of two ways. One is by having a community so undesirable that no one with the means to live elsewhere chooses to stay, leaving an "equal" but very poor neighborhood. Alternatively, high levels of neighborhood equality can be achieved through the application of exclusionary zoning laws that make it illegal, and effectively impossible, for low income (and in some cases even middle income) families to live in an area. In some exclusive suburbs, such as Flower Mound, Texas and Bethesda, Maryland (two of the highest scoring cities on equality), the equality is only for high income families.

Indeed, the big problem in American cities, as we’ve documented in our report “Lost in Place” is that in poor neighborhoods income is actually too equal. Neighborhoods of concentrated poverty, where more than 30 percent of the population lives below the poverty line, have tripled in the past 40 years. As the work of Raj Chetty and others has shown, neighborhoods of concentrated poverty permanently lower the lifetime earnings prospects of poor kids. Otherwise similar children who grow up in more mixed income (meaning unequal) neighborhoods have higher lifetime earnings.

As a practical matter, the only way forward for the Detroit economy is if more middle income and even upper income families choose to move to the city (or stay there as their fortunes improve). That will nominally make some of the income numbers look less “equal” but will play a critical role in creating the tax base and the local consumption spending that will —gradually— lead to further improvements in Detroit’s nascent economic rebound.

Where do we start? Achieving critical mass

The second fundamental critique in the City Lab piece is an argument that the city’s redevelopment efforts are failures because they aren’t producing improvements for everyone, everywhere in the city all at once. To date, the city’s successes have been recorded in downtown, Midtown and a few nearby neighborhoods, but because other parts of the city have continued to deteriorate and depopulate, the assumption is Detroit must be failing.

This critique ignores the fundamental fact that development and city economies are highly dependent on spatial spillovers. Neighborhoods rebound by reaching a critical mass of residents, stores, job opportunities and amenities. The synergy of these actions in specific places is mutually self-reinforcing and leads to further growth. If growth were evenly spread over the entire city, no neighborhood would benefit from these spillovers. And make no mistake, this kind of spillover or interaction is fundamental to urban economics; it is what unifies the stories of city success from Jane Jacobs to Ed Glaeser. Without a minimum amount of density in specific places, the urban economy can't flourish. Detroit's rebound will happen by recording some small successes in some places and then building outward and upward from these, not by gradually raising the level of every part of the city.

Scale: Making the perfect the enemy of the good

Anyone familiar with Detroit knows that the city’s most overwhelming problem is one of operating and paying for a city built for two million people with a population (and consequently a tax base) less than half that size. The city is still wrestling with the difficult challenge of triage—reducing its footprint and shrinking its service obligations to match its resources.

And that's the final point that's so disturbing about the CityLab critique. Reese and Sands argue that Detroit needs more jobs and resources for, among other things, educating its kids. No one doubts this. But where will that money come from? Certainly not from federal or state governments. It will have to come in large part from growing a local tax base, which is contingent on creating viable job centers and attracting and retaining more residents, including more middle and upper income residents.

Make no mistake: the scale here is daunting. The authors offer the helpful observation that if Detroit just somehow had another 100,000 jobs paying $10 per hour, it would pump more than $2 billion a year into the city's economy. (Keep in mind that from 2001 through 2010, Wayne County lost about twice that many jobs.) While their math is impeccable, their economics are mystical. This is the academic equivalent of the old Steve Martin joke about how to get a million dollars tax free: "Okay, first, get a million dollars."
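The authors' arithmetic is easy to verify, assuming full-time work of roughly 2,000 hours per year (an assumption on our part; the article doesn't specify its hours figure):

```python
# Checking the $2 billion claim: 100,000 jobs at $10 per hour,
# assuming roughly 2,000 hours of full-time work per year.
jobs = 100_000
wage = 10               # dollars per hour
hours_per_year = 2_000  # assumed full-time annual hours

annual_wages = jobs * wage * hours_per_year
print(annual_wages)  # 2000000000, i.e. about $2 billion a year
```

The math checks out; it's only the "first, get 100,000 jobs" step that's mystical.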

It would be great if we could craft a sudden solution that would immediately create hundreds of thousands of jobs and drop billions of dollars in wages and money for schools and public services into Detroit. But that's simply not going to happen. Instead, progress on a smaller scale has to start somewhere, has to involve new jobs, new residents and new investment in a few neighborhoods, and then build from there. Businesses will start up or move in, a few at a time, more in some neighborhoods than others, and then over time grow, providing more jobs and paying more taxes.

It's going to be a long, hard road ahead for Detroit. And that road will lead to a different and smaller Detroit than existed in, say, the 1950s. That road is made even harder by critics who damn the first few candles for shedding too little light.

 

The Geography of Independent Bookstores

Which cities have the strongest concentrations of independent bookstores?

Last week, we explored what we called the "mystery in the bookstore." There's a kind of good news/bad news set of narratives about bookselling in the US. After decades of decline in independent bookselling, many cities have seen a rebound by locally run stores. And while that appears to be true in many different locations, the overall trend, in terms of the number of bookstores counted in government statistics, still seems to be downward. We're not quite sure how to reconcile these two divergent trends, so it remains something of a mystery–although we have some suspicions about what's happening.

Independent bookstore owners Toni & Candace.

Part of the answer to this mystery lies in geography. The bookstore industry seems to be doing better in some places than in others. So, in typical City Observatory fashion, we set out to quantify the concentration of independent bookstores in different metropolitan areas. For data, we turned to the Indie Bookstore Finder, a web-based directory that let us search by radius in different cities. Like other web-based resources, it's likely that not every independent bookstore in the country is included, but the data are extensive and nationwide in scope. We selected a point in the center of the central business district in each of the nation's 53 largest metropolitan areas (all those with a population of a million or more) and searched for independent bookstores within a 25-mile radius. The Indie Bookstore Finder reported that there were 453 independent bookstores in these metropolitan areas.

There's a pretty wide distribution in the presence of independent bookstores among metropolitan areas, at least according to this data source. Two Pacific Northwest cities, Seattle and Portland, top the list, with more than 7 independent bookstores per million population. The typical large metro area has between 3 and 4 indie bookstores per million; three metro areas (Hartford, Jacksonville and Virginia Beach) have none, at least none that are captured in this database.
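The per-capita measure behind these rankings is straightforward to sketch. The store count and population below are illustrative placeholders, not actual figures from the Indie Bookstore Finder:

```python
# Bookstores per million residents: the normalization used to compare
# metros of very different sizes. Inputs here are hypothetical.

def stores_per_million(store_count: int, population: int) -> float:
    """Independent bookstores per one million metro residents."""
    return store_count / (population / 1_000_000)

# Hypothetical example: 18 stores in a metro of 2.4 million people.
print(round(stores_per_million(18, 2_400_000), 1))  # 7.5
```

Normalizing by population is what makes a mid-sized metro like Portland comparable to a giant like New York on this measure.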

As we noted yesterday, government statistics count a lot more establishments as bookstores. For example, according to xxx, there are an estimated 3,500 bookstore establishments in these metro areas, so our "indie bookstore" listing represents about 10 percent of establishments counted by the Census. Many of the businesses classified as bookstores in the Census data are auxiliaries to other businesses, such as a bookstore that's part of a museum, or sell a range of printed material, like greeting cards or magazines (think of an airport bookstore). Given the demographics of the firms in this database, however, they seem to represent the more substantial and "book" focused retailers.

What patents tell us about America’s most innovative cities

Patent rates are a useful indicator of innovative activity

The US is increasingly becoming a knowledge-based economy, and as a result, the markers of wealth are shifting from the kinds of tangible assets that characterized the old industrial economy (like huge factory complexes) to much more intangible assets (the creativity and innovativeness of workers and organizations). Most of our statistical measures of economic activity were crafted for our machine-age economy, so it's often a challenge to come up with measures of the new, intangible wealth that characterizes today's economy.

One useful indicator of innovative capacity is patenting. The US government awards patents for novel ideas, and patent records include not only the name of the inventor, but her location as well. As a result, it's possible to map and tabulate the density of patenting in different states and metropolitan areas. The US Patent and Trademark Office provides a periodic tabulation of patent data, and you can also drill down to particular patent categories to identify the kinds of technologies that are present in a particular location, and even track down the number of patents awarded to particular firms or inventors.

To rank metro areas by innovativeness, we’ve computed the number of patents issued per 100,000 population for each of the nation’s metropolitan areas with more than a million population. San Jose–home of Silicon Valley–is far and away the most prolific patenter among US metro areas (it has about 770 patents per 100,000, a number that flows off the chart).  Other tech centers are also leaders in patents per capita including San Francisco, San Diego, Austin, Seattle, Raleigh and Boston.
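The ranking statistic itself is a simple rate. The figures below are illustrative, not actual USPTO counts:

```python
# Patents per 100,000 population: the normalization used to rank
# metros by innovativeness. Inputs here are hypothetical.

def patents_per_100k(patent_count: int, population: int) -> float:
    """Patents issued per 100,000 metro residents."""
    return patent_count / (population / 100_000)

# A metro with 7,700 patents and 1,000,000 residents scores 770 per
# 100,000, in the neighborhood of San Jose's chart-topping figure.
print(patents_per_100k(7_700, 1_000_000))  # 770.0
```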

 

As the data show, the distribution of patenting activity is highly skewed to tech centers. The typical large metropolitan area has just 40 patents per 100,000 population. (The gray bar on the chart shows the inter-quartile range; about half of all metro areas get between 20 and 60 patents per 100,000.) The least patent-intensive metros include a group of mostly Southern cities (Virginia Beach, Birmingham, and New Orleans).

To be sure, there are important limits to patents as an indicator. Patenting is extremely common in some industries (like biotechnology and semiconductors), where ownership of intellectual property is an important source of competitive advantage, and where a bevy of patents serves as trading stock for working out cross-licensing agreements among firms with complementary technologies. In other industries, patenting is rare or unknown altogether. New styles and designs for apparel, for example, are seldom patented.

 

Visions of the City Part III: You don’t own me

What kind of future do we want to live in? While that question gets asked by planners and futurists in an abstract and technical way, some of the most powerful and interesting conversations about our future aspirations are reflected in the mass media. Lately, we’ve been struck by the visions embedded in recent television commercials.  Some of these visions are explicit, but others are a bit more subtle.

Earlier we took a close look at one car maker’s image of what a future city might look like. Ford’s vision, prepared for the Consumer Electronics Show, was a computer-generated video simulation of what it might be like in the near future, and then sometime later, to live in a city full of autonomous vehicles.  It was undoubtedly developed by the company’s engineers, and was used to polish the cred of the auto companies with the tech community and investors.

Until that future arrives, big auto companies have to survive, as they always have, by selling cars today. The world's largest automaker, Toyota, released a new television ad to try to sell its vehicles. And implicit in its commercial is another view of the future (or perhaps the present) of what cities and city living should be like. And it's actually much more interesting, compelling and human than Ford's.

The commercial is nominally a pitch for Toyota's Corolla, but if you watch the video–embedded below–it's apparent that the car is really just a bit player in a thirty-second drama that's really about what millennials do to achieve personal fulfillment. The Corolla is Toyota's entry level vehicle and is the world's largest selling car model (nearly 50 million units to a mere 21 million for the Volkswagen Beetle). Like all car makers, Toyota aims to build a lifelong relationship with consumers, starting them out with a little Corolla in their twenties, graduating to a Camry as they get older, and then a Sienna minivan when they have kids. And if they're successful, in later life they'll graduate to a Lexus. So the purpose of advertising entry level vehicles is to establish an affinity with these customers early on. That's not an easy task given the much lower rate at which young adults are getting drivers licenses, driving and buying new cars.

"You don't own me" is a miniature drama. Its protagonist is a young twenty-something chef (picture a tattooed aspiring Top Chef contestant). In the opening scene of the 30-second commercial, she's shown cooking away in an established tony restaurant, and then having her signature creation summarily tossed in the trash by a dismissive–and considerably older and male–head chef. She immediately quits, throwing in her apron, and driving off in her Corolla, singing along to Lesley Gore's 1964 hit "You don't own me."

In act two of this mini-drama there are a series of vignettes of other young adults. An African-American woman is ditching her dress shoes and tying on hiking boots at a state park, a peloton of cyclists pedals by, female roller derby players repeatedly crash into one another on an oval track. A group of young men is playing basketball. Trade magazine CampaignLife highlighted how the ad and song resonate with millennial aspirations:

The ad features young people triumphantly, you might even say defiantly, singing along to the song as they go about various activities like bonfires, group bike rides, and roller derby contests. Created by Saatchi & Saatchi LA, the ad’s best performance is among 21- to 35-year-olds, and interestingly displays stronger Desire and Relevance scores among males in that age group.

Strikingly, for a car ad, the activities highlighted in these vignettes mostly don’t involve driving. They’re set in urban spaces (parks, playgrounds, bike paths).  The car and its technology are essentially featured only once, in passing, as a lane alert signals a group of singing passengers that they’ve drifted across the centerline.

The final act of this little drama shows our young chef cooking in her new food truck, and dishing up one of her creations to another millennial standing outside. She's been transformed from oppressed and disrespected to an independent, creative entrepreneur. No one owns her.

Of course, this is a fable: most twenty-something start-up food truck owners would probably be maxing out their credit to make the payments on even a second-hand food truck; it’s likely that if they owned a car at all (rather than relying on their bike as a principal means of transportation) that they’d buy that used as well.

It's worth reflecting on how different this drama is from Ford's CG vision of cities of the future. Toyota appreciates that its potential customers are people who are more interested in and engaged by all of the things that they can do in cities when they're not in a car.

Unlike Ford’s futuristic vision, Toyota’s vision of a slightly idealized present focuses on people and how they live. Fittingly, the tag-line of the commercial is “Toyota:  Let’s go places.” The emphasis here is on “places.” And ultimately, that’s the difference between the Ford (and other techno-futurist) view of transportation and the view offered here. We attach value to the places we want to be. What we value in this “near” future is not being owned; being independent and engaging with other people.

Superficially, one might see a subliminal car-sharing message embedded in this ad: "you won't own me" is essentially what the cars of the future are saying. Instead, there'll be some combination of fleets of autonomous vehicles, along with much more widespread availability of "on-demand" rental cars like Car2Go, ReachNow and ZipCar. The more serious issue is that a key problem with cars, and our auto-dependent transportation system, is that in a sense "our cars do own us." In many places it's simply impossible to be a first class citizen without owning one. "You don't own me" has been a kind of feminist anthem on and off over the years, and maybe that same slogan, applied to privately owned cars, should be a guiding principle for urban planning.

There's a subtle but profound shift in what's being sold here: Ultimately this moves us in the direction of transportation as a service. And transportation is just a means to an end (or set of ends) rather than an end in itself. It's good because it enables us to get to and do the other things we want. It ceases to be an object of status or a consumption good in its own right.

As we think about the future, and the kind of cities we want, maybe we should spend less time fetishizing modes of transportation, and technology more generally, and think about the kind of places we want to be in and the kind of experiences that they enable. Crafting the right kind of narrative about the cities and the lives we desire is an indispensable part of creating a better future.

Envisioning the way we want to live in cities

The biggest challenge for creating great cities is imagination, not technology

There's a definite technological determinism to how we approach future cities: the assumption that some combination of sensors, 5G Internet, sophisticated computing and a very centralized command infrastructure will inexorably lead to places that are somehow greener, more prosperous and more just. Color us skeptical: the unbridled devotion to optimizing cities for technology has been (witness the automobile) an epic blunder.

Our view is that making great cities is about imagining the way we want to live in them, not chasing the latest technology. What makes a city a compelling and desirable place to live?

That's an open-ended, and debatable, question. One big global corporation has, perhaps unwittingly, given us a very compelling vision of cities–and life. This vision comes from Samsung, the Korea-based technology company. They've been running a long form (60 second) television commercial called "A Perfect Day." It follows the exploits of a half dozen kids–armed just with bikes, skateboards, and of course Samsung Galaxy smart phones–as they roam around New York City. There's a lot going on here, so let's see if we can't unpack the different, and in many ways radical, narrative it's proposing.

A Perfect Day

Samsung “A Perfect Day” from Dae Kang on Vimeo.

First of all, they are in a city. New York is front and center. This is not an anonymous or sanitized CG landscape. It's authentically and identifiably a city–a real city. And it's shown from the perspective of actual humans experiencing it on the ground.

They are traveling by bike. The first scene of this micro-drama shows a platoon of cyclists (and one lagging skateboarder) set out in the morning, traveling in a marked bike lane on a residential street (in Queens or Brooklyn). They round a corner onto a busy arterial, and then ride across the Williamsburg bridge to Manhattan.

They are unsupervised by adults. The demographics of the group are just a little too perfect: teens and tweens, black, brown and white, boys and girls. But strikingly, no adult authority figure is present. A parent calls only as dusk is falling (call answered via wrist-watch, naturally), only to be somewhat dismissively told "almost home," with that message punctuated with a chorus of "Love you, Mom!" from the ensemble.

 

They are hanging out in public spaces. They're not in a den, a great room, a tech-laden suburban bedroom, or even a cosseted back yard. They're on the streets of the big city. They're taking their own 3D photos and then sharing their virtual reality headset with a complete stranger they meet on the street. They're at a skatepark under another towering bridge. They spend the afternoon hanging out at a public pool.

They are having experiences. The kids are recording and sharing their experiences with their Samsung devices. But in every case, the technology is incidental or subservient to the experience.

So here, in a nutshell, we have something that actually resembles a compelling future vision of cities. It includes technology, a little. But it isn’t about autonomous self-driving cars, or about side-walk internet kiosks or ubiquitous electronic surveillance.

Our vision of cities ought to be about the joy and wonder of the experiences we can have in them, not obsessing about the plumbing of moving people and stuff to and fro. For too long we've optimized our cities for the vehicles moving through them, rather than the people living in them. Samsung, or at least its creative agency, Wieden and Kennedy, gets this.

We’re not the only ones who were struck by this ad. Writing at her blog, Free Range Kids, Lenore Skenazy asked “What is this amazing Samsung ad trying to tell us?” The answer is pretty clear: If a city is a place where kids can roam and play, what else does it need to do?

Why narrative matters

In his Presidential Address to the American Economics Association two weeks ago, Nobelist Robert Shiller presented his thoughts on what he called "narrative economics." Human beings are not the cold rational calculators they're made out to be in traditional economic modeling. Instead, Shiller argued, humans are hard-wired to visualize and understand the world through story-telling: We really ought to be called "Homo narrans." That's why getting the story right matters so much. If we have a story that centers on technology, vehicles and frenetic movement, we can remake our world in that image. If, instead, we have a story that embraces experience, and place and freedom, we'll get a very different world.

It's ultimately debatable whether Samsung's version of "a perfect day" is one that everyone would agree with. But it's an example of the kind of vision that might guide us, as we think about the kind of places we want to build. We should be deliberate in choosing our preferred narrative.

Note: This post has been revised to correct a broken link to the video.

Visions of the City Part II: A Perfect Day

Yesterday we took a close look at Ford’s vision for the future of cities. Our take: Ford’s preferred narrative of the places we’ll live is all about optimizing city life for vehicles. But is that the narrative that should guide us?

Another big global corporation has, perhaps unwittingly, given us a very different vision of cities–and life.  This other vision comes from Samsung, the Korea-based technology company.  They’ve been running a long form (60 second) television commercial called “A Perfect Day.” It follows the exploits of a half dozen kids–armed just with bikes, skateboards, and of course Samsung Galaxy smart phones–as they roam around New York City.  There’s a lot going on here, so let’s see if we can’t unpack all the different, and in many ways radical narrative its proposing.

A Perfect Day

First of all, they are in  a city. New York is front and center. This is not an anonymous or sanitized CG landscape. Its authentically and identifiably a city–a real city. And its shown from the perspective of actual humans experiencing it on the ground.

They are traveling by bike. The first scene of this micro-drama shows a platoon of cyclists (and one lagging skateboarder) set out in the morning, traveling in a marked bike lane on a residential street (in Queens or Brooklyn). They round a corner onto a busy arterial, and then ride across the Williamsburg bridge to Manhattan.

They are un-supervised by adults. The demographics of the group are just a little too perfect: teens and tweens, black, brown and white, boys and girls. But strikingly no adult authority figure is present. A parent calls only as dusk is falling (call answered via wrist-watch, naturally), only to be somewhat dismissively told “almost home,” with that message punctuated with a chorus of “Love you, Mom!” from the ensemble.

 

They are hanging out in public spaces.They’re not in a den, a great room, a tech-laden suburban bedroom, or even a cosseted back yard.  They’re on the streets of the big city. They’re taking their own 3D photos and then sharing their virtual reality headset with a complete stranger they meet on the street. They’re at a skatepark under another towering bridge. They spend the afternoon hanging out at a public pool.

They are having experiences. The kids are recording and sharing their experiences with their Samsung devices. But in every case, the technology is incidental or subservient to the experience.

So here, in a nutshell, we have something that actually resembles a compelling future vision of cities. It includes technology, a little. But it isn’t about autonomous self-driving cars, or about sidewalk internet kiosks or ubiquitous electronic surveillance.

Our vision of cities ought to be about the joy and wonder of the experiences we can have in them, not obsessing about the plumbing of moving people and stuff to and fro. For too long we’ve optimized our cities for the vehicles moving through them, rather than the people living in them. Samsung, or at least its creative agency, Wieden+Kennedy, gets this.

We’re not the only ones who were struck by this ad. Writing at her blog, Free Range Kids, Lenore Skenazy asked “What is this amazing Samsung ad trying to tell us?” The answer is pretty clear: If a city is a place where kids can roam and play, what else does it need to do?

Why narrative matters

In his Presidential Address to the American Economic Association two weeks ago, Nobelist Robert Shiller presented his thoughts on what he called “narrative economics.” Human beings are not the cold rational calculators they’re made out to be in traditional economic modeling. Instead, Shiller argued, humans are hard-wired to visualize and understand the world through story-telling: We really ought to be called “Homo narrans.” That’s why getting the story right matters so much. If we have a story that centers on technology, vehicles and frenetic movement, we can remake our world in that image. If, instead, we have a story that embraces experience, and place and freedom, we’ll get a very different world.

It’s ultimately debatable whether Samsung’s version of “a perfect day” is one that everyone would agree with. But it’s an example of the kind of vision that might guide us, as we think about the kind of places we want to build. We should be deliberate in choosing our preferred narrative.

The enduring effect of education on regional economies

One of the themes we stress at City Observatory is the large and growing importance of talent (the education and skills of the population) in determining regional and local economic success. As we shift more and more to a knowledge-based economy, the places that will do well, and that are resilient in the face of change, are the ones with the best-educated populations.

One of the most robust statistical relationships we’ve observed is the strong correlation between the four-year college attainment rate (measuring the fraction of the adult population that’s completed at least a four-year college degree) and a place’s per capita income (its total income divided by the total population). Places with better educated populations have higher incomes. While we generally focus on the relationship at the metro level, we thought we’d step back this week and look at state level data over the past quarter century to see how this relationship has evolved.

Let’s start in 1990, and look at the correlation between state per capita income and the percent of the state’s adults with at least a four-year degree. To facilitate comparisons with more recent data, we’ve expressed 1990 per capita incomes in 2015 dollars using the Implicit Price Deflator for Personal Consumption Expenditures.  This table shows how a state’s educational attainment in 1990 was correlated with its average income in 1990.

The data show a strong positive correlation. Each one percentage point increase in the share of adults with a four-year degree was associated with an increase of about $950 in per capita income (expressed in 2015 dollars).  In a statistical sense, variations in education explain about 6x percent of the variation in per capita income among states.

Now move forward to 2015, the latest year for which we have state level data on educational attainment and per capita income. This chart has the same setup, but now shows 2015 adult educational attainment compared to 2015 levels of per capita income. It’s similar, but now the relationship is steeper: each one percentage point increase in educational attainment in a state is now associated with a $1,070 increase in per capita income, and education alone explains 66 percent of the variation in income between states.  Keep in mind that idiosyncratic factors beyond educational attainment matter to state income: still flush from the oil boom, Wyoming, North Dakota and Alaska all had much higher per capita income in 2015 than one would expect based on education alone.

Together these two charts show that education is a strong and increasingly important factor related to state income growth. But what’s interesting is to go a step further and put these two analyses together. What we’ve done in the following chart is to compare 1990 levels of educational attainment (i.e., the fraction of a state’s adults with a four-year degree 25 years ago) with today’s level of per capita income. That relationship is shown here.

The key finding here is that 1990 educational attainment was a better predictor (i.e., had a higher coefficient of determination) of 2015 income than it was of 1990 income. That is, educational attainment in 1990 was more strongly correlated with income 25 years later than with income in the same year. Each one percentage point increase in the four-year attainment rate in 1990 was associated with $1,600 higher per capita income in 2015. The most obvious reason for this is that many of the 1990 college educated citizens in a state were still around 25 years later. More generally, though, states with high levels of education (and income) may be more likely to invest in education, and send their children to college, and also be places that attract even more college graduates. Either way, the persistence of educational attainment seems to be a key factor correlated with long-term economic well-being.
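The relationship described above is a simple least-squares fit of per capita income on attainment rates. Here’s a minimal sketch of the computation; the five data points are entirely hypothetical, chosen for illustration, not the actual 1990 or 2015 state figures:

```python
# Sketch: regress per capita income on four-year college attainment and
# report the slope (dollars per percentage point) and R-squared.
# The data below are hypothetical illustration values, not real state data.

def ols(x, y):
    """Return (slope, intercept, r_squared) for a simple least-squares fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

attain = [20, 25, 30, 35, 40]                     # % of adults with a BA (hypothetical)
income = [38000, 42000, 48000, 52000, 58000]      # per capita income, $ (hypothetical)
slope, intercept, r2 = ols(attain, income)
# slope: how many dollars of per capita income one percentage point
# of attainment is associated with; r2: share of variation explained
```

With actual Census attainment rates and per capita income data substituted in, the slope and R² would correspond to the dollars-per-point and explained-variation figures quoted in the text.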

The prospect of reaping the long-term benefits of a well-educated population should be a strong motivation to pursue a talent-centered economic development strategy.

A hat-tip to our colleague Phineas Baxendall at the Massachusetts Budget & Policy Center, who points us to similar work done by his organization, which looks at the median wage for all workers by state and compares it to each state’s college attainment rate. As with our analysis, there’s a very strong correlation.

Climate: Our Groundhog Day Doom Loop

Every year, the same story:  We profess to care about climate change, but we’re driving more and transportation greenhouse gas emissions are rising rapidly.

Oregon is stuck in an endless loop of lofty rhetoric, distant goals, and zero actual progress

Case in point:  Portland’s Regional Transportation Plan:  It claims to do something about climate, but it ignores the continuing increase in transportation GHGs and plans to spend billions subsidizing more driving.

And Portland Metro is keeping two sets of books:  one that pretends to meet climate goals, while another rationalizes increasing driving by 5 million miles a day.

Every February 2, City Observatory takes a moment to consider whether we’ve made any progress toward our stated goal of doing something about climate change.  And once again, we find, like Bill Murray in the movie Groundhog Day, we’re stuck in an unchanging doom loop.

 

Groundhog Day. (Source: Bard).

We’re coping with the climate crisis—and our failure to take it seriously—with a kind of second-order denial, not pretending that climate change isn’t real, but instead pretending that we’re actually doing something to solve it, or more precisely, pretending that the things that we are doing are not ineffectual.

Don’t Look Up:  Failing to acknowledge that we’re losing ground on transportation GHGs

Oregon leaders are caught in the trap of yet another allegorical movie: “Don’t Look Up.” On paper, the region pledged a decade ago to implement a “Climate Smart Strategy” that would put the region on track to meet a state-adopted goal of  reducing emissions by 80 percent from 1990 levels by 2050.  As is true nationally, transportation, mostly driving, is the single largest source of greenhouse gases in Metro Portland. In the Portland area, per capita greenhouse gas emissions from transportation have increased more than 14 percent since 2013, by about 1,000 pounds per person. Total greenhouse gas emissions from transportation have increased by about 1.6 million tons per year over that time.

 

A “Climate Smart” Transportation Plan that Rationalizes More Driving

The Portland region is supposed to meet its climate goal through the implementation of its Regional Transportation Plan, which spells out how we’ll spend billions of dollars and, hopefully, reduce driving and car dependence.  But there are glaring problems with the RTP:  For starters, the Metro RTP simply fails to check whether greenhouse gases from transportation are declining, as the plan calls for.  They aren’t: state, regional and federal inventories all show that despite nearly a decade of “climate smart” policies, Portland produces more transportation greenhouse gases now than before it implemented its strategy.

What’s worse, the adopted Regional Transportation Plan actually calls for policies and investments that will make greenhouse gas emissions worse.  It proposes spending billions of dollars widening area freeways, something that will lead to more driving and greenhouse gases.

A close look at the technical analysis that is the foundation for the RTP shows that Metro has two completely different sets of “books” for assessing transportation.  When it comes to demonstrating compliance with state climate laws and regulations, Metro has produced a set of projections showing we’ll hold total driving in the Portland area at its current level—in spite of increasing population—by reducing per capita driving by almost a third.  But when it comes to sizing the transportation system—and in particular, justifying investments in added highway capacity—Metro has a second set of books, which assumes per capita driving doesn’t change at all, and that as a result we end up driving about 5 million miles more per day in the Portland area than assumed in the climate analysis.

These two estimates are completely contradictory.  They mean that the Regional Transportation Plan doesn’t comply with state climate laws, and that if we actually followed through on our stated climate strategy of holding driving to its current level of about 20 million miles per day, we wouldn’t need to spend any more on expanding highway capacity.  In reality, the greenhouse gas reductions promised by regional leaders exist only as hypothetical quantities in a separate appendix of the regional transportation plan, one that is divorced from the actual spending plans, which contemplate spending billions on road widening.
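The gap between the two sets of books is simple arithmetic. A sketch: the 20-million-miles-per-day baseline comes from the text above, but the population growth factor here is a hypothetical value chosen for illustration, not Metro’s actual forecast:

```python
# Sketch of the two bookkeeping assumptions described above.
# base_vmt tracks the ~20 million miles/day cited in the text;
# the population growth factor is a hypothetical illustration.

def daily_vmt(base_vmt, pop_growth, per_capita_change):
    """Future daily VMT (million miles/day) given population growth and a
    proportional change in per-capita driving."""
    return base_vmt * pop_growth * (1.0 + per_capita_change)

base = 20.0      # million miles/day in the Portland area today (from the text)
growth = 1.25    # assumed population growth over the plan horizon (hypothetical)

climate_books = daily_vmt(base, growth, -0.20)   # per-capita cut offsets growth
capacity_books = daily_vmt(base, growth, 0.0)    # per-capita driving unchanged
gap = capacity_books - climate_books             # extra miles the capacity books assume
```

Under these assumed numbers, the climate books hold total driving at today’s 20 million miles per day while the capacity books project 25 million, a 5-million-mile-per-day gap. The plan’s actual projections differ in detail, but the contradiction has the same structure: the same region can’t both hold driving flat and grow it.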

Deja vu all over again

If it seems to you like you’ve read this before, you have:  The Oregon Global Warming Commission’s latest report reads like its predecessors.  It chronicles the growing climate crisis, yet again restating the state’s adopted goal of lower carbon emissions by 2050, and presenting data showing that we’re utterly failing to make meaningful progress. Here’s the key takeaway from the report:  The yellow lines show the state’s stated goals, the dotted orange lines show the path we’re on.  And if you look at the black (“actual”) line, you’ll see that emissions have actually increased since 2010.  In short, in no way are we on track to meet our adopted goals.

 

Climate: Our Groundhog Day Doom Loop

Every year, the same story:  We profess to care about climate change, but we’re driving more and greenhouse gas emissions are rising rapidly.

Oregon is stuck in an endless loop of lofty rhetoric, distant goals, and zero actual progress

If it seems to you like you’ve read this before, you have:  We’re marking this year’s Groundhog Day by noting that despite our stated goal of reducing greenhouse gas emissions, we’re not making any progress on climate change.  The Oregon Global Warming Commission’s latest report reads like its predecessors.  It chronicles the growing climate crisis, yet again restating the state’s adopted goal of lower carbon emissions by 2050, and presenting data showing that we’re utterly failing to make meaningful progress. Here’s the key takeaway from the report:  The yellow lines show the state’s stated goals, the dotted orange lines show the path we’re on.  And if you look at the black (“actual”) line, you’ll see that emissions have actually increased since 2010.  In short, in no way are we on track to meet our adopted goals.

Once again, we find ourselves in the same predicament as Bill Murray in the 1993 movie Groundhog Day–waking up every morning to discover that it’s still February 2, and that he’s done nothing to change any of the behaviors that have messed up his life.

Last year we noted that Oregon leaders were caught in the trap of yet another allegorical movie, “Don’t Look Up.”  We’re coping with the climate crisis—and our failure to take it seriously—with a kind of second-order denial: not pretending that climate change isn’t real, but instead pretending that we’re actually doing something to solve it, or, more precisely, pretending that the things we are doing are not ineffectual.

Since our first Groundhog Day commentary, a series of catastrophic forest fires smothered most of the state in clouds of smoke for a solid week in September 2020.  The fires destroyed hundreds of thousands of acres of forest, forced tens of thousands of Oregonians to evacuate their homes, and destroyed hundreds of homes and businesses.  To those who experienced it first hand, it was an apocalyptic scene.

Bottom line: the state is failing to meet its legislatively adopted greenhouse gas reduction goals.  The state had set an interim target of reducing emissions by a very modest 10 percent from 1990 levels by 2020.  The final 2020 data aren’t in yet, but it’s clear that the state will fail to meet that goal, and not by a little, but by a lot—26 percent to be exact—according to the commission:

While Oregon met the 2010 emissions reduction goal established in 2007, we are highly unlikely to meet the 2020 goal. Preliminary 2019 sector-based emissions data and GHG emissions projections indicate that Oregon’s 2020 emissions are likely to exceed the State’s 2020 emissions reduction goal by approximately 26 percent or 13.4 million metric tons of CO2e, erasing most of the gains we had made between 2010 and 2014.

We gave ourselves three decades to reduce greenhouse gas emissions by just 10 percent, and instead over that time, increased them by more than 10 percent.  How realistic is it to expect that we’ll meet our adopted goal of reducing emissions by 75 percent from 1990 levels (more than 77 percent from today’s levels) by 2050?
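The arithmetic behind that parenthetical is easy to check. Normalizing 1990 emissions to 1.0 and taking the premise above that today’s emissions run a bit more than 10 percent above 1990 levels:

```python
# Worked check of the goal arithmetic above. The only input taken from the
# text is that current emissions are roughly 10 percent above 1990 levels;
# the rest follows from the adopted 2050 goal.
base_1990 = 1.0                    # normalize 1990 emissions to 1.0
today = 1.10 * base_1990           # emissions now: ~10 percent above 1990
target_2050 = 0.25 * base_1990     # the 75-percent-below-1990 goal

# Reduction required from today's level to reach the 2050 target:
cut_from_today = 1.0 - target_2050 / today   # ≈ 0.77, i.e. more than 77 percent
```

In other words, every year of backsliding from the 1990 baseline makes the cut required from current levels steeper than the nominal 75 percent goal.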

The original inspiration for our Groundhog’s Day commentary was the 2017 report of the Oregon Global Warming Commission, a body set up to monitor how well the state was doing in achieving its legally adopted goal to reduce greenhouse gas emissions by 75 percent from 1990 levels by the year 2050. In addition to its goal, Oregon has a tiny citizen commission charged with riding herd on the state’s emissions inventory, and looking to see what progress, if any, the state is making in reducing greenhouse gases. The short story four years ago was:  Not very good.  Although the state reduced some power plant and industrial emissions, nearly all these gains were wiped out by increased driving. The 2017 Legislature that received that report not only did essentially nothing in response, it arguably laid the groundwork to make the problem worse, by approving a new transportation finance package providing upwards of a billion dollars to widen Portland area freeways.

It’s now the case that transportation is the single largest source of greenhouse gases, and for the past several years, Oregon’s emissions are rising. And the culprit is clear:  More driving.  As we’ve pointed out at City Observatory: the decline in gasoline prices in mid-2014 prompted an increase in driving and with it, an increase in crashes and carbon pollution.  Per capita greenhouse gas emissions from transportation have increased more than 14 percent since 2013, by about 1,000 pounds per person. Total greenhouse gas emissions from transportation have increased by about 1.6 million tons per year over that time.

As a result of the growth in driving and related emissions, and slower than expected progress in reducing emissions from other sources, there’s no way the state is on the path it needs to be on to reach its 2050 goal. Transportation emissions are actually increasing from their 2010 levels—at a time when the state’s climate strategy called for them to be declining. (In the chart below, the blue line is the glide slope to achieving the state’s adopted 2050 goal, and the pale green line is the estimate of where the state is headed).

The best estimate is that rather than the eighty percent reduction from 1990 levels that the state has set as a goal, we’ll see barely a 20 percent reduction by 2050.  And these reductions are predicated on assumptions about more rapid fleet turnover, lower sales of trucks and SUVs, and steadily tougher fuel economy standards, when in fact the fleet is getting older (and staying dirtier), trucks and SUVs have dramatically increased their market share, and the federal government has walked back its fuel economy standards.  So it’s highly unlikely that even the 20 percent reduction by 2050 will be realized.

As the commission notes in its latest report, it will take more than electrifying vehicles as fast as possible to come anywhere close to meeting the state’s goals:

It is critical that the state leverages resources to support the use of clean vehicles and fuels and reduce vehicle miles traveled per capita. Technological advancements and penetration of ZEVs alone won’t be enough to meet emissions goals. We encourage the Legislature to fully fund the necessary follow-on work identified by the agencies. We also need to take steps to help people drive less by strategically redesigning our communities and transportation systems. 

(emphasis added)

In the best of all possible worlds, this warning would prompt the Governor and legislators–ever mindful of their legally enacted commitment to reduce greenhouse gas emissions–to redouble their efforts and figure out ways to discourage carbon pollution, especially from transportation. After all, Oregon’s government passed a law mandating a reduction in greenhouse gases.  In the law adopted in 2007, the state committed to reducing its greenhouse gas emissions by 10 percent from 1990 levels by the year 2020, and the further goal of reducing them by 75 percent by 2050. Many other states and cities have similar adopted goals. But few cities are actually achieving these goals, as a recent audit by the Brookings Institution makes clear.

The reason is that there’s a deep flaw in the “pledge now, pollute less later” approach. Despite the high-minded and quantitative nature of these goals, the actual date for their achievement is set so far in the future as to be beyond the expected political lifetime of any of the public officials adopting these goals. And there’s no mechanism, aside from moral suasion, to require accomplishment of these goals. So in practice, what they may do is simply give politicians cover for conspicuously expressing concern about climate change, without actually having to do anything substantive or difficult to attain it.

The Hippocratic Oath directs physicians, first, “to do no harm.”  The same ought to apply to state and local governments who profess to care about climate change. If we know driving is the biggest source of carbon pollution, and it’s causing us to increase emissions when we need to be decreasing them, the very last thing we should be doing is expanding freeways, which encourage people to drive even more. That’s why we think that Portland area freeway widening projects like the proposed $1.2 billion expansion of I-5 should be taken off the table.  That money—and billions of dollars in other subsidies to automobile transportation—would be far better spent in building communities that are designed to enable low carbon living.

The Intergovernmental Panel on Climate Change has made it clear in its most recent report that we’re rapidly approaching a point of no return if we’re to avoid serious and permanent damage to the climate. We’re going to need something more than soothing rhetoric and distant goals to avoid dramatically altering our planet. As we wrote a year ago, 2021 would have been a good time to start taking climate change seriously.  We’ve mostly squandered another year; unless things change, next year’s Groundhog Day will be depressingly similar, and even more grim.

 

Again, it’s Groundhog’s Day, again

Every year, the same story:  We profess to care about climate change, but we’re driving more and greenhouse gas emissions are rising rapidly.

Oregon is stuck in an endless loop of lofty rhetoric, distant goals, and zero actual progress

Another year, another Groundhog’s Day, and another bleak report that we’re not making any progress on Climate Change.  Last month, the Oregon Global Warming Commission released its latest annual report chronicling the growing climate crisis, yet again declaring the state’s adopted goal to have lower carbon emissions by 2050, and presenting the data showing that we’re utterly failing to make meaningful progress. Here’s the key takeaway from the report:  The yellow lines show the state’s stated goals, the dotted orange lines show the path we’re on.  And if you look at the black (“actual”) line, you’ll see that emissions have actually increased since 2010.  In short, in no way are we on track to meet our adopted goals.

If that sounds like something you’ve heard before, it’s because, just like the movie Groundhog Day, when it comes to climate change Oregon is caught in an endless loop, repeating the same dire diagnosis, litany of prescriptions, and non-existent progress.

Once again, we find ourselves in the same predicament as Bill Murray in the 1993 movie Groundhog Day–waking up every morning to discover that it’s still February 2, and that he’s done nothing to change any of the behaviors that have messed up his life.

The difference this year was that a series of catastrophic forest fires smothered most of the state in clouds of smoke for a solid week in September 2020.  The fires destroyed hundreds of thousands of acres of forest, forced tens of thousands of Oregonians to evacuate their homes, and destroyed hundreds of homes and businesses.  To those who experienced it first hand, it was an apocalyptic scene.

Bottom line: the state is failing to meet its legislatively adopted greenhouse gas reduction goals.  The state had set an interim target of reducing emissions by a very modest 10 percent from 1990 levels by 2020.  The final 2020 data aren’t in yet, but it’s clear that the state will fail to meet that goal, and not by a little, but by a lot—26 percent to be exact—according to the commission:

While Oregon met the 2010 emissions reduction goal established in 2007, we are highly unlikely to meet the 2020 goal. Preliminary 2019 sector-based emissions data and GHG emissions projections indicate that Oregon’s 2020 emissions are likely to exceed the State’s 2020 emissions reduction goal by approximately 26 percent or 13.4 million metric tons of CO2e, erasing most of the gains we had made between 2010 and 2014.
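The commission’s two figures—26 percent and 13.4 million metric tons—let us back out the underlying numbers. A quick back-of-the-envelope sketch (the variable names are ours, not the commission’s; the only inputs are the two figures the commission cites):

```python
# Back out Oregon's implied 2020 goal and projected 2020 emissions from the
# commission's statement that projected emissions exceed the goal by
# approximately 26 percent, or 13.4 million metric tons of CO2e (MMTCO2e).
overshoot_share = 0.26   # 26 percent over the goal
overshoot_mmt = 13.4     # MMTCO2e over the goal

goal_2020 = overshoot_mmt / overshoot_share    # implied 2020 emissions goal
projected_2020 = goal_2020 + overshoot_mmt     # implied projected emissions

print(f"Implied 2020 goal: {goal_2020:.1f} MMTCO2e")            # 51.5
print(f"Implied 2020 emissions: {projected_2020:.1f} MMTCO2e")  # 64.9
```

The arithmetic is illustrative only; the commission’s report, not this sketch, is the authoritative source for the actual inventory figures.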

We gave ourselves three decades to reduce greenhouse gas emissions by just 10 percent, and instead, over that time, increased them by more than 10 percent.  How realistic is it to expect that we’ll meet our adopted goal of reducing emissions by 75 percent from 1990 levels (more than 77 percent from today’s levels) by 2050?
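The parenthetical follows directly from the arithmetic. A minimal sketch, treating 1990 emissions as the baseline and assuming the “more than 10 percent” increase cited above:

```python
# If today's emissions are ~10 percent above the 1990 baseline, then cutting
# to 25 percent of 1990 levels (the 75-percent goal) requires an even deeper
# cut when measured from today's levels.
baseline_1990 = 1.0
today = 1.10 * baseline_1990         # emissions rose ~10% instead of falling
target_2050 = 0.25 * baseline_1990   # 75 percent below 1990 levels

cut_from_today = 1 - target_2050 / today
print(f"Required cut from today's levels: {cut_from_today:.1%}")  # 77.3%
```

The deeper the hole we dig now, the steeper the required reduction becomes from wherever we actually are.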

The original inspiration for our Groundhog Day commentary was the 2017 report of the Oregon Global Warming Commission, a body set up to monitor how well the state was doing in achieving its legally adopted goal to reduce greenhouse gas emissions by 75 percent from 1990 levels by the year 2050. In addition to its goal, Oregon has a tiny citizen commission charged with riding herd on the state’s emissions inventory, and looking to see what, if any, progress the state is making in reducing greenhouse gases. The short story four years ago was:  Not very good.  Although the state reduced some power plant and industrial emissions, nearly all these gains were wiped out by increased driving. The 2017 Legislature that received that report not only did essentially nothing in response, it arguably laid the groundwork to make the problem worse, by approving a new transportation finance package providing upwards of a billion dollars to widen Portland area freeways.

Transportation is now the single largest source of greenhouse gases in Oregon, and for the past several years the state’s emissions have been rising. The culprit is clear:  More driving.  As we’ve pointed out at City Observatory, the decline in gasoline prices in mid-2014 prompted an increase in driving and, with it, an increase in crashes and carbon pollution.  Per capita greenhouse gas emissions from transportation have increased more than 14 percent since 2013, by about 1,000 pounds per person. Total greenhouse gas emissions from transportation have increased by about 1.6 million tons per year over that time.

As a result of the growth in driving and related emissions, and slower than expected progress in reducing emissions from other sources, the state is nowhere near the path it needs to be on to reach its 2050 goal. Transportation emissions are actually increasing from their 2010 levels—at a time when the state’s climate strategy called for them to be declining. (In the chart below, the blue line is the glide slope to achieving the state’s adopted 2050 goal, and the pale green line is the estimate of where the state is headed.)

The best estimate is that rather than the eighty percent reduction from 1990 levels that the state has set as a goal, we’ll see barely a 20 percent reduction by 2050.  And these reductions are predicated on assumptions about more rapid fleet turnover, lower sales of trucks and SUVs, and steadily tougher fuel economy standards, when in fact the fleet is getting older (and staying dirtier), trucks and SUVs have dramatically increased their market share, and the federal government has walked back its fuel economy standards.  So it’s highly unlikely that even the 20 percent reduction by 2050 will be realized.

As the Commission notes in its latest report, it will take more than electrifying vehicles as fast as possible to come anywhere close to meeting the state’s goals:

It is critical that the state leverages resources to support the use of clean vehicles and fuels and reduce vehicle miles traveled per capita. Technological advancements and penetration of ZEVs alone won’t be enough to meet emissions goals. We encourage the Legislature to fully fund the necessary follow-on work identified by the agencies. We also need to take steps to help people drive less by strategically redesigning our communities and transportation systems. 

(emphasis added)

In the best of all possible worlds, this warning would prompt the Governor and legislators–ever mindful of their legally enacted commitment to reduce greenhouse gas emissions–to redouble their efforts and figure out ways to discourage carbon pollution, especially from transportation. After all, Oregon’s government passed a law mandating a reduction in greenhouse gases.  In the law adopted in 2007, the state committed to reducing its greenhouse gas emissions by 10 percent from 1990 levels by the year 2020, and the further goal of reducing them by 75 percent by 2050. Many other states and cities have similar adopted goals. But few cities are actually achieving these goals, as a recent audit by the Brookings Institution makes clear.

The reason is that there’s a deep flaw in the “pledge now, pollute less later” approach. Despite the high-minded and quantitative nature of these goals, the actual date for their achievement is set so far in the future as to be beyond the expected political lifetime of any of the public officials adopting these goals. And there’s no mechanism, aside from moral suasion, to require accomplishment of these goals. So in practice, what they may do is simply give politicians cover for conspicuously expressing concern about climate change, without actually having to do anything substantive or difficult to attain it.

The Hippocratic Oath directs physicians, first, “to do no harm.”  The same ought to apply to state and local governments who profess to care about climate change. If we know driving is the biggest source of carbon pollution, and it’s causing us to increase emissions when we need to be decreasing them, the very last thing we should be doing is expanding freeways, which encourage people to drive even more. That’s why we think that Portland area freeway widening projects like the proposed $800 million expansion of I-5 should be taken off the table.  That money—and billions of dollars in other subsidies to automobile transportation—would be far better spent in building communities that are designed to enable low carbon living.

The Intergovernmental Panel on Climate Change has made it clear in its most recent report that we’re rapidly approaching a point of no return if we’re to avoid serious and permanent damage to the climate. We’re going to need something more than soothing rhetoric and distant goals to avoid dramatically altering our planet. As we wrote in our first commentary of the year, 2021 is the time when we need to start taking climate change seriously; if we don’t, future Groundhog Days will be depressingly similar, and even more grim.

 

With climate change, it’s always Groundhog Day

Every year, the same story:  We profess to care about climate change, but we’re driving more and greenhouse gas emissions are rising rapidly.

Oregon is stuck in an endless loop of lofty rhetoric, distant goals, and zero actual progress

Sunday, February 2nd is Groundhog Day, and City Observatory has an annual tradition of looking around to see if we’re making progress on climate change.  The short answer is, we’re not.  If it seems like you’ve read this post before at City Observatory, you’re not wrong. For the past three years, every Groundhog Day, we’ve stuck our heads up and looked around to see whether anything has changed when it comes to coping with the growing crisis of global warming. Once again, we find ourselves in the same predicament as Bill Murray in the 1993 movie Groundhog Day: waking up every morning to discover that it’s still February 2, and that he’s done nothing to change any of the behaviors that have messed up his life.

The original inspiration for our Groundhog Day commentary was the 2017 report of the Oregon Global Warming Commission, a body set up to monitor how well the state was doing in achieving its legally adopted goal to reduce greenhouse gas emissions by 75 percent from 1990 levels by the year 2050.  The short story three years ago was:  Not very good.  Although the state reduced some power plant and industrial emissions, nearly all these gains were wiped out by increased driving. The 2017 Legislature that received that report not only did essentially nothing in response, it arguably laid the groundwork to make the problem worse, by approving a new transportation finance package providing upwards of a billion dollars to widen Portland area freeways.

Two years later, the 2019 Oregon Global Warming Commission report told us that almost nothing had changed:  we’ve made no progress toward reducing our state’s greenhouse gas emissions. The stark reality of climate change was brought home in the late summer of 2018, when massive forest fires in California choked most of Oregon in acrid brown smoke, a fact captured on the cover of the 2019 report.

 

The reasons for the growing problem were itemized in the commission’s biennial report to the state Legislature. In addition to its goal, Oregon has a tiny citizen commission charged with riding herd on the state’s emissions inventory, and looking to see what, if any, progress the state is making in reducing greenhouse gases. The news is not good. Oregon’s emissions are rising. And the culprit is clear:  More driving.  Here’s the verdict from the commission’s report.

This finding confirms exactly what we’ve pointed out at City Observatory: the decline in gasoline prices in mid-2014 prompted an increase in driving and, with it, an increase in crashes and carbon pollution.  As we’ve reported before, per capita greenhouse gas emissions from transportation have increased more than 14 percent since 2013, by about 1,000 pounds per person. Total greenhouse gas emissions from transportation have increased by about 1.6 million tons per year over that time.

As a result of the growth in driving and related emissions, and slower than expected progress in reducing emissions from other sources, the state is nowhere near the path it needs to be on to reach its 2050 goal. Transportation emissions, which are now the single largest source of greenhouse gases in Oregon, are actually increasing from their 2010 levels–at a time when the state’s climate strategy called for them to be declining. (The blue line is the glide slope to achieving the state’s adopted 2050 goal, and the pale green line is the estimate of where the state is headed.)  The best estimate is that rather than the eighty percent reduction from 1990 levels that the state has set as a goal, we’ll see barely a 20 percent reduction by 2050.  And these reductions are predicated on assumptions about more rapid fleet turnover, lower sales of trucks and SUVs, and steadily tougher fuel economy standards, when in fact the fleet is getting older (and staying dirtier), trucks and SUVs have dramatically increased their market share, and the federal government has walked back its fuel economy standards.  So it’s highly unlikely that even the 20 percent reduction by 2050 will be realized.

In the best of all possible worlds, this warning would prompt the Governor and legislators–ever mindful of their legally enacted commitment to reduce greenhouse gas emissions–to redouble their efforts and figure out ways to discourage carbon pollution, especially from transportation. After all, Oregon’s government passed a law mandating a reduction in greenhouse gases.

Oregon was, in fact, one of the first states to set its own local goals for reducing greenhouse gases. In a law adopted in 2007, the state committed to reducing its greenhouse gas emissions by 10 percent from 1990 levels by the year 2020, and the further goal of reducing them by 75 percent by 2050. Many other states and cities have similar adopted goals. Around the nation, much hope is being placed on the continued rhetorical commitment of many mayors and governors to achieving these reductions and thereby making progress toward holding up America’s obligations under the Paris accords. Ardent proponents of “The New Localism” tell us that even if the federal government ignores climate change, state and local leaders will get us on the right track.

Sadly, there’s a deep flaw in this approach. Despite the high-minded and quantitative nature of these goals, the actual date for their achievement is set far in the future, typically beyond the expected political lifetime of any of the public officials adopting these goals. And there’s little if any mechanism, aside from moral suasion, to require accomplishment of these goals. So in practice, what they may do is simply give politicians cover for expressing concern about climate change, without actually having to do anything substantive or difficult to attain it.

The Hippocratic Oath directs physicians, first, “to do no harm.”  The same ought to apply to state and local governments who profess to care about climate change. If we know driving is the biggest source of carbon pollution, and it’s causing us to increase emissions when we need to be decreasing them, the very last thing we should be doing is expanding freeways, which encourage people to drive even more. That’s why we think that Portland area freeway widening projects like the proposed $800 million expansion of I-5 should be taken off the table.  That money–and billions of dollars in other subsidies to automobile transportation–would be far better spent in building communities that are designed to enable low carbon living.

The Intergovernmental Panel on Climate Change has made it clear in its most recent report that we’re rapidly approaching a point of no return if we’re to avoid serious and permanent damage to the climate. We’re going to need something more than soothing rhetoric and distant goals to avoid dramatically altering our planet. We’d like to believe that things will be very different next Groundhog Day, but alas, we’ve seen this movie before.

 

It’s Groundhog Day yet again, Oregon: How’s your climate change strategy working?

Another year later, and we’re still stuck with the same hypocrisy on climate change

If it seems like you’ve read this post before at City Observatory, you’re not wrong. For the past couple of years, every Groundhog Day, we’ve stuck our heads up and looked around to see whether anything has changed when it comes to coping with the growing crisis of global warming. Once again, we find ourselves in the same predicament as Bill Murray in the 1993 movie Groundhog Day: waking up every morning to discover that it’s still February 2, and that he’s done nothing to change any of the behaviors that have messed up his life.

The original inspiration for our Groundhog Day commentary was the 2017 report of the Oregon Global Warming Commission, a body set up to monitor how well the state was doing in achieving its legally adopted goal to reduce greenhouse gas emissions by 75 percent from 1990 levels by the year 2050.  The short story two years ago was:  Not very good.  Although the state reduced some power plant and industrial emissions, nearly all these gains were wiped out by increased driving. The 2017 Legislature that received that report not only did essentially nothing in response, it arguably laid the groundwork to make the problem worse, by approving a new transportation finance package providing upwards of a billion dollars to widen Portland area freeways.

So it’s little surprise, really, that the new 2019 Oregon Global Warming Commission report tells us that in the past two years we’ve made no progress toward reducing our state’s greenhouse gas emissions. The stark reality of climate change was brought home in the late summer of 2018, when massive forest fires in California choked most of Oregon in acrid brown smoke, a fact captured on the cover of the 2019 report.

 

The reasons for the growing problem are itemized in the commission’s biennial report to the state Legislature. In addition to its goal, Oregon has a tiny citizen commission charged with riding herd on the state’s emissions inventory, and looking to see what, if any, progress the state is making in reducing greenhouse gases. The news is not good. Oregon’s emissions are rising. And the culprit is clear:  More driving.  Here’s the verdict from the commission’s report.

This finding confirms exactly what we’ve pointed out at City Observatory: the decline in gasoline prices in mid-2014 prompted an increase in driving and, with it, an increase in crashes and carbon pollution.  Oregon’s vehicle miles traveled, which had been declining steadily, ticked up in 2015 and 2016, as did its fatality rate.

As a result of the growth in driving and related emissions, and slower than expected progress in reducing emissions from other sources, the state is nowhere close to the path it needs to be on to reach its 2050 goal. Transportation emissions, which are now the single largest source of greenhouse gases in Oregon, are actually increasing from their 2010 levels–at a time when the state’s climate strategy called for them to be declining. (The blue line is the glide slope to achieving the state’s adopted 2050 goal, and the pale green line is the estimate of where the state is headed.)  The best estimate is that rather than the eighty percent reduction from 1990 levels that the state has set as a goal, we’ll see barely a 20 percent reduction by 2050.

In the best of all possible worlds, this warning would prompt the Governor and legislators–ever mindful of their legally enacted commitment to reduce greenhouse gas emissions–to redouble their efforts and figure out ways to discourage carbon pollution, especially from transportation. After all, Oregon’s government passed a law mandating a reduction in greenhouse gases.

Oregon was, in fact, one of the first states to set its own local goals for reducing greenhouse gases. In a law adopted in 2007, the state committed to reducing its greenhouse gas emissions by 10 percent from 1990 levels by the year 2020, and the further goal of reducing them by 75 percent by 2050. Many other states and cities have similar adopted goals. Around the nation, much hope is being placed on the continued rhetorical commitment of many mayors and governors to achieving these reductions and thereby making progress toward holding up America’s obligations under the Paris accords. Ardent proponents of “The New Localism” tell us that even if the federal government ignores climate change, state and local leaders will get us on the right track.

Sadly, there’s a deep flaw in this approach. Despite the high-minded and quantitative nature of these goals, the actual date for their achievement is set far in the future, typically beyond the expected political lifetime of any of the public officials adopting these goals. And there’s little if any mechanism, aside from moral suasion, to require accomplishment of these goals. So in practice, what they may do is simply give politicians cover for expressing concern about climate change, without actually having to do anything substantive or difficult to attain it.

The Hippocratic Oath directs physicians, first, “to do no harm.”  The same ought to apply to state and local governments who profess to care about climate change. If we know driving is the biggest source of carbon pollution, and it’s causing us to increase emissions when we need to be decreasing them, the very last thing we should be doing is expanding freeways, which encourage people to drive even more. That’s why we think that Portland area freeway widening projects like the proposed half-billion dollar expansion of I-5 should be taken off the table.  That money–and billions of dollars in other subsidies to automobile transportation–would be far better spent in building communities that are designed to enable low carbon living.

The Intergovernmental Panel on Climate Change has made it clear in its most recent report that we’re rapidly approaching a point of no return if we’re to avoid serious and permanent damage to the climate. We’re going to need something more than soothing rhetoric and distant goals to avoid dramatically altering our planet. We’d like to believe that things will be very different next Groundhog Day, but alas, we’ve seen this movie before.

 

It’s Groundhog Day again, Oregon: How’s your climate change strategy working?

A year later, and we’re still stuck with the same hypocrisy on climate change

The 1993 movie Groundhog Day has become a cultural touchstone for the endless do-loop of futility. Bill Murray finds himself waking up every morning to discover that it’s still February 2, and that he’s done nothing to change any of the behaviors that have messed up his life. Unfortunately, Groundhog Day is a metaphor for our approach to climate change: February 2nd rolls around again, and we find we’re in exactly the same mess we were in a year ago.  Case in point: Oregon.

Last year, we noted the release of Oregon’s Global Warming Commission report, which showed that after making some fitful progress after 2000, nearly all the gains in reducing greenhouse gas emissions had been wiped out by increased driving. At the time, we asked why, if the state was serious about its legislatively adopted goals to dramatically reduce these emissions, it was largely ignoring its own report and planning to spend several billion dollars widening highways.

It’s a year later, and nothing has changed; if anything the situation is worse. Transportation related emissions are still increasing, and the state Legislature has approved a plan that could lead to spending more than a billion dollars widening three Portland area freeways.

Like many states and cities, Oregon has set its own local goals for reducing greenhouse gases. In a law adopted in 2007, the state committed to reducing its greenhouse gas emissions by 10 percent from 1990 levels by the year 2020, and the further goal of reducing them by 75 percent by 2050. In light of the Trump Administration’s denial of the scientific evidence on global warming, and its withdrawal from the Paris accords, many environmental activists are pinning their hopes for progress on this kind of local effort.

Sadly, there’s a deep flaw in this approach. Despite the high-minded and quantitative nature of these goals, the actual date for their achievement is set far in the future, typically beyond the expected political lifetime of any of the public officials adopting these goals. And there’s little if any mechanism, aside from moral suasion, to require accomplishment of these goals. So in practice, what they may do is simply give politicians cover for expressing concern about climate change, without actually having to do anything substantive or difficult to attain it.

Purple mountain tragedy. Oregon’s iconic Mt. Hood sees its snowpack disappearing.

That fear was brought home by the release of the Oregon Global Warming Commission’s biennial report to the state Legislature. In addition to its goal, Oregon has a tiny citizen commission charged with riding herd on the state’s emissions inventory, and looking to see what, if any, progress the state is making in reducing greenhouse gases. The news is not good. Oregon’s emissions are rising. The commission warns:

Key Takeaway: Rising transportation emissions are driving increases in statewide emissions.

As the updated greenhouse gas inventory data clearly indicate, Oregon’s emissions had been declining or holding relatively steady through 2014 but recorded a non-trivial increase between 2014 and 2015. The majority of this increase (60%) was due to increased emissions from the transportation sector, specifically the use of gasoline and diesel. The reversal of the recent trend in emissions declines, both in the transportation sector and statewide, likely means that Oregon will not meet its 2020 emission reduction goal. More action is needed, particularly in the transportation sector, if the state is to meet our longer-term GHG reduction goals.

This finding confirms exactly what we’ve pointed out at City Observatory: the decline in gasoline prices in mid-2014 prompted an increase in driving and, with it, an increase in crashes and carbon pollution.  Oregon’s vehicle miles traveled, which had been declining steadily, ticked up in 2015 and 2016, as did its fatality rate.

As a result of the growth in driving and related emissions, and slower than expected progress in reducing emissions from other sources, the state is nowhere close to the path it needs to be on to reach its 2050 goal. (The blue line shows progress to date, the yellow line is the glide slope to achieving the state’s adopted 2050 goal, and the red line is the commission’s estimate of where the state is headed.)

Oregon Global Warming Commission, 2017

In the best of all possible worlds, this warning would prompt the Governor and legislators–ever mindful of their legally enacted commitment to reduce greenhouse gas emissions–to redouble their efforts and figure out ways to discourage carbon pollution, especially from transportation.

In the real world, legislators did pretty much the opposite, approving a transportation bill that promises as much as $1.5 billion to widen three Portland area freeways. They also postponed consideration of an actual climate change bill until the 2018 session.

Last year, we reported that climate data showed that 2016 was the warmest year on record. That record was nearly equalled by 2017, which was the warmest year ever recorded without an El Niño event, and was among the three warmest years ever. Global warming is an increasingly obvious reality. We’re going to need something more than soothing rhetoric and distant goals to avoid dramatically altering our planet. We’ll check in again next Groundhog Day to see if anything’s different.

 

Happy Groundhog Day, Oregon

Climate change gets lip service, highways get billions.

Like many states and cities, Oregon has been a leader in setting its own local goals for reducing greenhouse gases. In a law adopted in 2007, the state set the goal of reducing its greenhouse gas emissions by 10 percent from 1990 levels by the year 2020, and the further goal of reducing them by 75 percent by 2050. In light of the Trump Administration’s denial of the scientific evidence on global warming, many environmental activists are pinning their hopes for progress on this kind of local effort.

Sadly, there’s a deep flaw in this approach. Despite the high-minded and quantitative nature of these goals, the actual date for their achievement is set far in the future, typically beyond the expected political lifetime of any of the public officials adopting these goals. And there’s little if any mechanism, aside from moral suasion, to require accomplishment of these goals. So in practice, what they may do is simply give politicians cover for expressing concern about climate change, without actually having to do anything substantive or difficult to attain it.

Purple mountain tragedy. Oregon’s iconic Mt. Hood sees its snowpack disappearing.

That fear was brought home by the release of the Oregon Global Warming Commission’s biennial report to the state Legislature. In addition to its goal, Oregon has a tiny citizen commission charged with riding herd on the state’s emissions inventory, and looking to see what, if any, progress the state is making in reducing greenhouse gases. The news is not good. Oregon’s emissions are rising. The commission warns:

Key Takeaway: Rising transportation emissions are driving increases in statewide emissions.

As the updated greenhouse gas inventory data clearly indicate, Oregon’s emissions had been declining or holding relatively steady through 2014 but recorded a non-trivial increase between 2014 and 2015. The majority of this increase (60%) was due to increased emissions from the transportation sector, specifically the use of gasoline and diesel. The reversal of the recent trend in emissions declines, both in the transportation sector and statewide, likely means that Oregon will not meet its 2020 emission reduction goal. More action is needed, particularly in the transportation sector, if the state is to meet our longer-term GHG reduction goals.

This finding confirms exactly what we’ve pointed out at City Observatory over the past year: the decline in gasoline prices in mid-2014 prompted an increase in driving and, with it, an increase in crashes and carbon pollution.  Oregon’s vehicle miles traveled, which had been declining steadily, ticked up in 2015, as did its fatality rate.

As a result of the growth in driving and related emissions, and slower than expected progress in reducing emissions from other sources, the state is nowhere close to the path it needs to be on to reach its 2050 goal. (The blue line shows progress to date, the yellow line is the glide slope to achieving the state’s adopted 2050 goal, and the red line is the commission’s estimate of where the state is headed.)

Oregon Global Warming Commission, 2017

In the best of all possible worlds, this warning would prompt the Governor and legislators–ever mindful of their legally enacted commitment to reduce greenhouse gas emissions–to redouble their efforts and figure out ways to discourage carbon pollution, especially from transportation.

In the real world, legislators are actually poised to do pretty much the opposite. One of the leading priorities for the 2017 Oregon legislature is the enactment of a major new transportation package. Among its chief priorities: adding additional capacity to the state’s freeways. A new legislative report identifies a “need” for $1.3 billion annually in additional transportation funds for state and local highway projects.  As has been demonstrated time and again, additional capacity produces additional driving and increased emissions. And ironically, this proposal comes forward as the state faces a $1.7 billion shortfall in its general fund budget for the coming biennium, which is likely to lead to significant cuts to funding for schools and other public services.

Climate data now shows that 2016 was the warmest year on record. Global warming is an increasingly obvious reality. We’re going to need something more than soothing rhetoric and distant goals to avoid dramatically altering our planet. We’ll check in again next Groundhog Day to see if anything’s different.

 

Happy Birthday America; Thanks Immigrants!

We celebrate the fourth of July by remembering that a nation composed overwhelmingly of immigrants owes them a special debt.

Lighting the way to a stronger US economy since 1886.

America is a nation of immigrants, and its economy is propelled and activated by its openness to immigration and the new ideas and entrepreneurial energy that immigrants provide. It’s commonplace to remind ourselves that many of the nation’s greatest thinkers and entrepreneurs, Andrew Carnegie, Albert Einstein, Andy Grove and hundreds of others, were immigrants, if not refugees. All six of America’s 2016 Nobel Laureates were immigrants. The latest Annual Survey of Entrepreneurs, conducted by the Census Bureau, shows that immigrant entrepreneurs who have started companies are both more likely to be concentrated in the important computer and information technology industries, and stronger in innovation than similar firms started by native-born Americans.

The fact that America stood as a beacon of freedom, and a haven from hate and oppression, has continually renewed and added to the nation’s talent and ideas. Immigration has also played a critical role in helping revitalize many previously depressed urban areas and neighborhoods. As Joel Mokyr explained in his terrific book “A Culture of Growth,” the key factor triggering the Enlightenment and the Industrial Revolution was the ease with which heterodox and creative thinkers could find sanctuary in other countries and spread their thinking across borders. The US was founded on the kind of openness and tolerance that underpinned this process, and flourished accordingly.

What’s true for our nation is especially true for the cities that power the nation’s economy. The health and prosperity of city economies hinges directly on their ability to be open to newcomers and new ideas. Immigrants are a source of economic energy for urban economies. Robert Shiller, winner of the Nobel Prize in Economics, summarized urbanist Jane Jacobs this way:

Cities grow organically, she said, capturing a certain dynamic, a virtuous circle, a specialized culture of expertise, with one industry leading to another, and with a reputation that attracts motivated and capable immigrants.

The critical role of immigration is abundantly clear when we look at the health and productivity of the nation’s urban economies. The metro areas with the highest fractions of foreign-born well-educated workers are among the nation’s most productive.

Metros with the most foreign born talent

Our benchmark for measuring foreign-born talent is to look at the proportion of a region’s college-educated population born outside the United States. We tap data from the Census Bureau’s American Community Survey, which tells us what share of those aged 25 and older who have at least a four-year college degree were born outside the United States. (This tabulation doesn’t distinguish between those who came to the US as children and were educated here, and those who may have immigrated to the US later in life as adults, but shows the gross effect of all immigration). In the typical large metropolitan area in the United States, about one in seven college educated adults was born outside the nation. And in some of our largest and most economically important metropolitan areas, the share is much higher: a majority of those with four-year or higher degrees in Silicon Valley are from elsewhere, as are a third of the best educated in New York, Los Angeles, and Miami.

 

Foreign born talent and productivity

We’ve plotted the relationship between the share of a metropolitan area’s college-educated population born outside the United States and its productivity, as measured by gross metropolitan product per capita.  Gross metropolitan product is the regional analog of gross domestic product, the total value of goods and services produced, and is calculated by the Bureau of Economic Analysis.  The sizes of the circles shown in this chart are proportional to the population of each of these metropolitan areas.

These data show a clear positive relationship between the presence of foreign-born talent and productivity.  Several of the nation’s most productive metropolitan areas–San Jose, San Francisco, New York and Seattle–all have above average levels of foreign-born persons among their best educated.

Of course, these data represent only a correlation, and there are good reasons to believe that the arrows of causality run in both directions: more well-educated immigrants make an area more productive and more productive areas tend to attract (and retain) more talented immigrants. But it’s striking that some of the nation’s most vibrant economies, places that are at the forefront of generating the new ideas and technology that sustain US global economic leadership, are places that are open and welcoming to the best and brightest from around the world.

There are many reasons to oppose the Trump Administration’s repeated attempts to close America’s borders to immigrants. The most important reasons are moral, ethical and legal. But on top of them, there’s a strongly pragmatic, economic rationale as well: the health and dynamism of the US economy, and of the metropolitan areas that power the knowledge-driven sectors of that economy, depend critically on openness to smart people from around the world.

 

 

What makes America great, as always: Immigrants

Happy Independence Day, America!

All Americans are immigrants (Even the Native American tribes trace their origins to Asians who migrated over the Siberian-Alaskan land bridge during the last ice age). And this nation of immigrants has always grown stronger by embracing newcomers who want to share in, and help build the American dream. So here, on Independence Day, is a short reminder of why immigration matters to our economic success, viewed as we usually do, through the lens of our nation’s cities.

Lighting the way to a stronger US economy since 1886.

 

America is a nation of immigrants, and its economy is propelled and activated by its openness to immigration and the new ideas and entrepreneurial energy that immigrants provide. It’s commonplace to remind ourselves that many of the nation’s greatest thinkers and entrepreneurs, Andrew Carnegie, Albert Einstein, Andy Grove and hundreds of others, were immigrants, if not refugees. All six of America’s 2016 Nobel Laureates were immigrants. The fact that America stood as a beacon of freedom, and a haven from hate and oppression, has continually renewed and added to the nation’s talent and ideas. Immigration has also played a critical role in helping revitalize many previously depressed urban areas and neighborhoods. As Joel Mokyr explains in his terrific new book “A Culture of Growth,” the key factor triggering the Enlightenment and the Industrial Revolution was the ease with which heterodox and creative thinkers could find sanctuary in other countries and spread their thinking across borders. The US was founded on the kind of openness and tolerance that underpinned this process, and flourished accordingly.

The critical role of immigration is abundantly clear when we look at the health and productivity of the nation’s urban economies. The metro areas with the highest fractions of foreign-born well-educated workers are among the nation’s most productive.

Metros with the most foreign born talent

Our benchmark for measuring foreign-born talent is to look at the proportion of a region’s college-educated population born outside the United States. We tap data from the Census Bureau’s American Community Survey, which tells us what share of those aged 25 and older who have at least a four-year college degree were born outside the United States. (This tabulation doesn’t distinguish between those who came to the US as children and were educated here, and those who may have immigrated to the US later in life as adults, but shows the gross effect of all immigration). In the typical large metropolitan area in the United States, about one in seven college educated adults was born outside the nation. And in some of our largest and most economically important metropolitan areas, the share is much higher: a majority of those with four-year or higher degrees in Silicon Valley are from elsewhere, as are a third of the best educated in New York, Los Angeles, and Miami.

 

Foreign born talent and productivity

We’ve plotted the relationship between the share of a metropolitan area’s college-educated population born outside the United States and its productivity, as measured by gross metropolitan product per capita.  Gross metropolitan product is the regional analog of gross domestic product, the total value of goods and services produced, and is calculated by the Bureau of Economic Analysis.  The sizes of the circles shown in this chart are proportional to the population of each of these metropolitan areas.

These data show a clear positive relationship between the presence of foreign-born talent and productivity.  Several of the nation’s most productive metropolitan areas–San Jose, San Francisco, New York and Seattle–all have above average levels of foreign-born persons among their best educated.

Of course, these data represent only a correlation, and there are good reasons to believe that the arrows of causality run in both directions: more well-educated immigrants make an area more productive and more productive areas tend to attract (and retain) more talented immigrants. But it’s striking that some of the nation’s most vibrant economies, places that are at the forefront of generating the new ideas and technology that sustain US global economic leadership, are places that are open and welcoming to the best and brightest from around the world.

There are many reasons to oppose President Trump’s ban on immigration from seven majority-Muslim countries. The most important reasons are moral, ethical and legal. But on top of them, there’s a strongly pragmatic, economic rationale as well: the health and dynamism of the US economy, and of the metropolitan areas that power the knowledge-driven sectors of that economy, depend critically on openness to smart people from around the world.

 

 

Openness to immigration drives economic success

Last Friday, President Trump signed an Executive Order effectively blocking entry to the US for nationals of seven countries—Iraq, Iran, Libya, Somalia, Sudan, Syria, and Yemen. We’ll leave aside the fearful, xenophobic and anti-American aspects of this policy: others have addressed them far more eloquently than we can at City Observatory.  And while there’s no question that the moral, ethical and constitutional problems with this order are more than sufficient to invalidate it, to these we’ll add an economic angle, which though secondary, is hardly minor.

Lighting the way to a stronger US economy since 1886.

 

America is a nation of immigrants, and its economy is propelled and activated by its openness to immigration and the new ideas and entrepreneurial energy that immigrants provide. It’s commonplace to remind ourselves that many of the nation’s greatest thinkers and entrepreneurs, Andrew Carnegie, Albert Einstein, Andy Grove and hundreds of others, were immigrants, if not refugees. All six of America’s 2016 Nobel Laureates were immigrants. The fact that America stood as a beacon of freedom, and a haven from hate and oppression, has continually renewed and added to the nation’s talent and ideas. Immigration has also played a critical role in helping revitalize many previously depressed urban areas and neighborhoods. As Joel Mokyr explains in his terrific new book “A Culture of Growth,” the key factor triggering the Enlightenment and the Industrial Revolution was the ease with which heterodox and creative thinkers could find sanctuary in other countries and spread their thinking across borders. The US was founded on the kind of openness and tolerance that underpinned this process, and flourished accordingly.

The critical role of immigration is abundantly clear when we look at the health and productivity of the nation’s urban economies. The metro areas with the highest fractions of foreign-born well-educated workers are among the nation’s most productive.

Metros with the most foreign born talent

Our benchmark for measuring foreign-born talent is to look at the proportion of a region’s college-educated population born outside the United States. We tap data from the Census Bureau’s American Community Survey, which tells us what share of those aged 25 and older who have at least a four-year college degree were born outside the United States. (This tabulation doesn’t distinguish between those who came to the US as children and were educated here, and those who may have immigrated to the US later in life as adults, but shows the gross effect of all immigration). In the typical large metropolitan area in the United States, about one in seven college educated adults was born outside the nation. And in some of our largest and most economically important metropolitan areas, the share is much higher: a majority of those with four-year or higher degrees in Silicon Valley are from elsewhere, as are a third of the best educated in New York, Los Angeles, and Miami.

 

Foreign born talent and productivity

We’ve plotted the relationship between the share of a metropolitan area’s college-educated population born outside the United States and its productivity, as measured by gross metropolitan product per capita.  Gross metropolitan product is the regional analog of gross domestic product, the total value of goods and services produced, and is calculated by the Bureau of Economic Analysis.  The sizes of the circles shown in this chart are proportional to the population of each of these metropolitan areas.

These data show a clear positive relationship between the presence of foreign-born talent and productivity.  Several of the nation’s most productive metropolitan areas–San Jose, San Francisco, New York and Seattle–all have above average levels of foreign-born persons among their best educated.

Of course, these data represent only a correlation, and there are good reasons to believe that the arrows of causality run in both directions: more well-educated immigrants make an area more productive and more productive areas tend to attract (and retain) more talented immigrants. But it’s striking that some of the nation’s most vibrant economies, places that are at the forefront of generating the new ideas and technology that sustain US global economic leadership, are places that are open and welcoming to the best and brightest from around the world.

There are many reasons to oppose President Trump’s ban on immigration from these seven countries. The most important reasons are moral, ethical and legal. But on top of them, there’s a strongly pragmatic, economic rationale as well: the health and dynamism of the US economy, and of the metropolitan areas that power the knowledge-driven sectors of that economy, depend critically on openness to smart people from around the world.

 

 

The constancy of change in neighborhood populations

Neighborhoods are always changing; half of all renters move every two years.

There’s a subtle perceptual bias that underlies many of the stories about gentrification and neighborhood change. The canonical journalistic account of gentrification focuses on the observable fact that different people now live in a neighborhood than used to live there at some previous time. We seem to assume that most neighborhoods are stable and unchanging, and that absent some dramatic change, like gentrification, the people who lived in that neighborhood are the same ones who lived there a decade ago, and without such change, would be likely to live there a decade hence.  But constant population change or turnover is a regular feature of most neighborhoods, a fact confirmed by a recent study. To summarize the key takeaways:

  • The population of urban neighborhoods is always changing because moving is so common, especially for renters.
  • There’s little evidence that gentrification causes overall rates of moving to increase, either for homeowners or renters.
  • Homeowners don’t seem to be affected at all, and there’s no evidence that higher property taxes (or property tax breaks) influence moving decisions.
  • While involuntary moves for renters increase slightly in gentrified neighborhoods, there’s no significant change in total moves.

In an article published in Urban Affairs Review, “Gentrification, Property Tax Limitation and Displacement,” Isaac William Martin and Kevin Beck present their analysis of longitudinal data from the Panel Study of Income Dynamics that tracks family moves over more than a decade.  An un-gated version of the paper is available here. One of the challenges of studying gentrification and neighborhood change is that most data simply provides snapshots of a neighborhood’s population at a given point in time, and provides little information about the comings and goings of different households. The PSID sample is unusual, in that it tracks households and individuals over a period of decades–this study uses data on the movement of household heads from 1987 through 2009. Martin and Beck were able to access confidential data that reports neighborhood location and enables them to identify the movement of households to different neighborhoods.

Richard Florida reviewed the Martin and Beck paper at City Lab and highlighted two of the study’s key findings:  that homeowners don’t seem to be displaced by gentrification and a subsidiary finding that property taxes (and tax breaks for homeowners) don’t seem to affect displacement.  These are both significant findings, but we want to step back and look at the broader picture this study paints of how neighborhoods change, because this study provides a useful context for understanding the complex dynamics of migration that are often left out of discussions of gentrification.

Change is a constant–Most renters have moved on after two years

One of the most striking findings from this study is how frequently renters move. These data show that in any given two-year period a majority (54 percent) of renter households had moved to a different neighborhood. The average tenure (length of time they’ve lived in their current residence) is 1.7 years. (Table 1).  Moving rates are lower (16 percent over two years) for homeowners, and average tenures are considerably longer (4.9 years, on average).  But the important thing to keep in mind is just how much volatility and turnover there is in neighborhood populations. Statistically, if about half of all renters move out of a neighborhood every two years, the probability that any current renter will live in that neighborhood ten years hence is about 3 percent (0.5 raised to the fifth power).
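The compounding here is simple enough to sketch as a back-of-the-envelope calculation (purely illustrative, using the rounded rates above):

```python
# If about half of all renters leave a neighborhood in any two-year period,
# the chance a given renter is still there after ten years is the chance of
# staying through five consecutive two-year periods.
two_year_stay = 0.5   # share of renters still present after two years
periods = 5           # ten years = five two-year periods

ten_year_stay = two_year_stay ** periods
print(f"Probability of still living there after ten years: {ten_year_stay:.1%}")
# → about 3.1 percent
```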

Many public discussions of gentrification assume that, in the absence of gentrification, neighborhoods would somehow remain just the same, and that few or no residents would move away. This study reminds us that this isn’t true. In addition, we know that poor neighborhoods that don’t see reductions in poverty rates steadily lose population. Lost in Place, our own study of poor neighborhoods, shows that over four decades, the three-quarters of poor neighborhoods that didn’t rebound lost 40 percent of their population.

Most moves are voluntary

Unlike many other studies, the Martin and Beck paper is able to use survey data to try to discern the motivations for household moves. Broadly speaking they divide moves into “voluntary” and “involuntary” moves.  The PSID asks movers why they moved, and those that respond to this open-ended question with answers coded as “moved in response to outside events including being evicted, health reasons, divorce, joining the armed services, or other involuntary reasons” are treated as involuntary moves. As they note, the distinction isn’t always as sharp as one would like, and it may be that some respondents rationalize some involuntary moves as voluntary ones, but the self-reported data are clear:  among renters, voluntary moves dramatically outnumber involuntary ones.  About 54 percent of all renters moved in the last two years; about 13 percent of all renters reported an involuntary move.  That means that about 75 percent of all renter moves were voluntary and about 25 percent of renter moves were involuntary.  As Margery Turner and her colleagues at the Urban Institute have shown, moving to another neighborhood is often the way poor families get better access to jobs, better quality schools, safer neighborhoods and better housing.
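The voluntary/involuntary split follows directly from those two survey rates; a quick illustrative calculation:

```python
# Shares of all renter households over a two-year period (from the paper).
moved_any = 0.54          # moved for any reason
moved_involuntary = 0.13  # reported an involuntary move

# Involuntary moves as a share of all moves.
involuntary_share = moved_involuntary / moved_any
print(f"Involuntary share of renter moves: {involuntary_share:.0%}")
# → roughly a quarter; the remaining three-quarters were voluntary
```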

Gentrification has no impact on overall renter moves, but is associated with a small increase in involuntary moves

One of the most important studies of gentrification is Lance Freeman’s 2005 paper “Displacement or Succession?: Residential Mobility in Gentrifying Neighborhoods,” which found that gentrification had essentially no effect on the rate at which households moved out of gentrifying neighborhoods.  Martin and Beck replicate this finding for all moves by renter households; they write:

Consistent with Freeman’s findings, Model 2 indicates that we cannot be confident that the average effect of gentrification on the probability of moving out is different from zero.

Graphically, Martin and Beck’s findings can be depicted as follows.  About 54 percent of all renters move within two years. According to Martin and Beck’s modeling, the probability that a person in a gentrifying neighborhood moves in two years is about 1.7 percentage points greater than for the typical person (after controlling for individual household characteristics). That suggests that for a typical resident, the probability of moving in a gentrifying neighborhood is about 55.7 percent, but that estimate is not statistically significant.

When they look just at “involuntary” moves, however, they find that there is a statistically significant effect of gentrification on the probability of moving.  Specifically, they find that rental households in gentrifying neighborhoods are about 2.6 percentage points more likely to report an “involuntary move” in the past two years than those who don’t live in gentrifying neighborhoods.  It’s important to put that in context.  According to the paper, about 54 percent of all renters moved in the last two years, and about 13 percent of them experienced an “involuntary move.”  The estimate in the paper is that the effect of living in a gentrifying neighborhood is about a 2.6 percentage point increase in the likelihood of an “involuntary” move.  That means if the average renter has a 13 percent chance of an involuntary move, a renter in a gentrifying neighborhood has a 15.6 percent chance of such a move.  These results are shown below:

Here, the estimated effect of gentrification on involuntary renter moves (+2.6 percentage points) lies outside the 95 percent confidence interval around zero, which suggests that there is a statistically significant difference between the share of the population experiencing involuntary moves in gentrifying neighborhoods as compared to all neighborhoods.


What would that look like in a typical neighborhood?  If you have a neighborhood with 2,000 households (about 5,000 people, with about 2.5 persons per household), and about half are renters and half are homeowners, you would expect that of the 1,000 renting households, about 130 would experience an involuntary move over a two-year period.  If that tract gentrified, you would expect an additional 26 households to experience an “involuntary move.” But you would also expect about 540 of those renting households to have moved out of the neighborhood in that time, for all reasons, voluntary and involuntary.  These data put the scale of the gentrification effect in perspective. Whether or not they gentrify, there’s going to be enormous change in the renter population of any given urban neighborhood.
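The arithmetic behind this hypothetical neighborhood is worth laying out explicitly (illustrative only, using the rates cited above: 54 percent of renters move over two years, 13 percent involuntarily, plus 2.6 points under gentrification):

```python
# A hypothetical 2,000-household neighborhood, split evenly between
# renters and homeowners.
renters = 1000

base_involuntary_rate = 0.13   # baseline two-year involuntary move rate
gentrification_bump = 0.026    # +2.6 percentage points in gentrifying tracts
total_move_rate = 0.54         # all renter moves over two years

baseline_involuntary = renters * base_involuntary_rate  # moves expected anyway
extra_involuntary = renters * gentrification_bump       # added by gentrification
all_renter_moves = renters * total_move_rate            # churn from all causes

print(f"Baseline involuntary moves: {baseline_involuntary:.0f}")   # 130
print(f"Additional under gentrification: {extra_involuntary:.0f}") # 26
print(f"Total renter moves, all reasons: {all_renter_moves:.0f}")  # 540
```

The point of the comparison: the 26 additional involuntary moves are a small fraction of the roughly 540 renter moves that would happen regardless.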

Gentrification has no impact on homeowner moves

Martin and Beck find no evidence that homeowners in gentrifying neighborhoods are more likely to move, either in the aggregate, or involuntarily.  They test a number of different models of the connection between gentrification and moving: none produce statistically significant correlations between gentrification and moving; in some cases (though statistically insignificant) the correlation is negative: gentrification is associated with fewer homeowners moving from a neighborhood.  Their conclusion: for homeowners, their study “produces no evidence of displacement from gentrifying neighborhoods.”

Property taxes (and tax breaks) seem to have no connection with homeowner movement from gentrifying neighborhoods

One popular argument is that gentrification pushes up property values and results in higher property taxes for homeowners, and that especially for households with a fixed income, the burden of higher property taxes is likely to force them to move. Martin and Beck look closely at this question, and examine how changes in property assessments and property taxes correlate with the probability of moving. They find that there’s no statistically significant link between property taxes and moving in gentrifying neighborhoods.  Several states and localities have enacted property tax or assessment limitations, in part with the objective of lessening the financial exposure of fixed income households to the burden of higher property taxes. Martin and Beck look at the relationship between such limits and the probability of moving, and find that such limits don’t seem to have any effect on whether homeowners move out of gentrifying neighborhoods or not.

While homeowners in gentrifying neighborhoods have to shoulder the burden of paying higher property taxes, it’s typically only because their homes have appreciated more in value. In most cities, property taxes are levied at a rate equal to about 1 to 2 percent of a property’s market value, so the wealth effect of property appreciation dwarfs the negative income effect of having to pay higher property taxes.

Urban renters are a highly mobile group. Most renting households are likely to have changed neighborhoods in the past two years. We observe the same overall level of movement out of neighborhoods whether they gentrify or not.  This study suggests that in gentrifying neighborhoods, somewhat more of those moves are involuntary rather than voluntary.

 

 


While homeowners in gentrifying neighborhoods have to shoulder the burden of paying higher property taxes, its typically only because their homes have appreciated more in value. In most cities, property taxes are levied at a rate equal to about 1 to 2 percent of a property’s market value, so the wealth effect of property appreciation dwarfs the negative income effect of having to pay higher property taxes.

Urban renters are a highly mobile group. Most renting households are likely to have changed neighborhoods in the past two years. We observe the same overall level of movement out of neighborhoods whether they gentrify or not.  This study suggests that somewhat more of those moves would be involuntary rather than voluntary.

 

 

Constant change and gentrification

A new study of gentrification sheds light on how neighborhoods change.  Here are the takeaways:

  • The population of urban neighborhoods is always changing because moving is so common, especially for renters.
  • There’s little evidence that gentrification causes overall rates of moving to increase, either for homeowners or renters.
  • Homeowners don’t seem to be affected at all, and there’s no evidence that higher property taxes (or property tax breaks) influence moving decisions.
  • While involuntary moves for renters increase slightly in gentrified neighborhoods, there’s no significant change in total moves.

We’ve been closely reading a new study on gentrification and neighborhood change. In an article published in Urban Affairs Review, “Gentrification, Property Tax Limitation and Displacement,” Isaac William Martin and Kevin Beck present their analysis of longitudinal data from the Panel Study of Income Dynamics that tracks family moves over more than a decade.  An un-gated version of the paper is available here. One of the challenges of studying gentrification and neighborhood change is that most data simply provide snapshots of a neighborhood’s population at a given point in time, offering little information about the comings and goings of different households. The PSID sample is unusual in that it tracks households and individuals over a period of decades–this study uses data on the movement of household heads from 1987 through 2009. Martin and Beck were able to access confidential data that report neighborhood location and enable them to identify the movement of households to different neighborhoods.

Richard Florida reviewed the Martin and Beck paper at City Lab and highlighted two of the study’s key findings:  that homeowners don’t seem to be displaced by gentrification and a subsidiary finding that property taxes (and tax breaks for homeowners) don’t seem to affect displacement.  These are both significant findings, but we want to step back and look at the broader picture this study paints of how neighborhoods change, because this study provides a useful context for understanding the complex dynamics of migration that are often left out of discussions of gentrification.

Change is a constant–Most renters have moved on after two years

One of the most striking findings from this study is how frequently renters move. These data show that in any given two-year period a majority (54 percent) of renter households had moved to a different neighborhood. The average tenure (length of time they’ve lived in their current residence) is 1.7 years (Table 1).  Moving rates are lower (16 percent over two years) for homeowners, and average tenures are considerably longer (4.9 years, on average).  But the important thing to keep in mind is just how much volatility and turnover there is in neighborhood populations. Statistically, if about half of all renters move out of a neighborhood every two years, the probability that any current renter will live in that neighborhood ten years hence is about 3 percent (0.5 raised to the fifth power).
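That compounding can be checked with a short calculation; this is just a sketch of the article’s round figure (a 50 percent two-year move-out rate), not anything taken from the paper itself.

```python
# If half of a neighborhood's renters move out every two years, the share of
# today's renters still present after N years is the two-year stay rate
# raised to the number of two-year periods.
def prob_still_resident(two_year_stay_rate: float, years: float) -> float:
    return two_year_stay_rate ** (years / 2)

# Ten years is five two-year periods: 0.5 ** 5 = 0.03125, about 3 percent.
remaining = prob_still_resident(0.5, 10)
```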

Many public discussions of gentrification assume that, in the absence of gentrification, neighborhoods would somehow remain just the same, and that few or no residents would move away. This study reminds us that this isn’t true. In addition, we know that poor neighborhoods that don’t see reductions in poverty rates steadily lose population. Our own study of poor neighborhoods shows that over four decades, the three-quarters of poor neighborhoods that didn’t rebound lost 40 percent of their population.

Most moves are voluntary

Unlike many other studies, the Martin and Beck paper is able to use survey data to try to discern the motivations for household moves. Broadly speaking, they divide moves into “voluntary” and “involuntary” moves.  The PSID asks movers why they moved, and those that respond to this open-ended question with answers coded as “moved in response to outside events including being evicted, health reasons, divorce, joining the armed services, or other involuntary reasons” are treated as involuntary moves. As they note, the distinction isn’t always as sharp as one would like, and it may be that some respondents rationalize some involuntary moves as voluntary ones, but the self-reported data are clear:  among renters, voluntary moves dramatically outnumber involuntary ones.  About 54 percent of all renters moved in the last two years; about 13 percent of all renters reported an involuntary move.  That means that about 75 percent of all renter moves were voluntary and about 25 percent were involuntary.  As Margery Turner and her colleagues at the Urban Institute have shown, moving to another neighborhood is often the way poor families get better access to jobs, better quality schools, safer neighborhoods and better housing.
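The voluntary/involuntary split follows directly from the two figures quoted above; a quick check using only the article’s numbers:

```python
# 54 percent of renters moved within two years; 13 percent of renters
# reported an involuntary move. The involuntary share of all moves is
# the ratio of the two.
movers = 0.54
involuntary_movers = 0.13

involuntary_share = involuntary_movers / movers  # about 0.24
voluntary_share = 1 - involuntary_share          # about 0.76
```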

Gentrification has no impact on overall renter moves, but is associated with a small increase in involuntary moves

One of the most important studies of gentrification is Lance Freeman’s 2005 paper “Displacement or Succession?: Residential Mobility in Gentrifying Neighborhoods” which found that gentrification had essentially no effect on the rate at which households moved out of gentrifying neighborhoods.  Martin and Beck replicate this finding for all moves by renter households, they write:

Consistent with Freeman’s findings, Model 2 indicates that we cannot be confident that the average effect of gentrification on the probability of moving out is different from zero.

Graphically, Martin and Beck’s findings can be depicted as follows.  About 54 percent of all renters move within two years. According to Martin and Beck’s modeling, the probability that a person in a gentrifying neighborhood moves in two years is about 1.7 percentage points greater than for the typical person (after controlling for individual household characteristics). That suggests that for a typical resident, the probability of moving in a gentrifying neighborhood is about 55.7 percent, but that estimate is not statistically significant.

When they look just at “involuntary” moves, however, they find that there is a statistically significant effect of gentrification on the probability of moving.  Specifically, they find that renter households in gentrifying neighborhoods are about 2.6 percentage points more likely to report an “involuntary move” in the past two years than those who don’t live in gentrifying neighborhoods.  It’s important to put that in context.  According to the paper, about 54 percent of all renters moved in the last two years, and about 13 percent of all renters experienced an “involuntary move.”  The estimate in the paper is that the effect of living in a gentrifying neighborhood is about a 2.6 percentage point increase in the likelihood of an “involuntary” move.  That means if the average renter has a 13 percent chance of an involuntary move, a renter in a gentrifying neighborhood has a 15.6 percent chance of such a move.  These results are shown below:

Here, the estimate that a renter makes an involuntary move from a gentrifying neighborhood (+2.6 percentage points) is greater than the 95 percent confidence interval, which suggests that there is a statistically significant difference between the share of the population experiencing involuntary moves in gentrifying neighborhoods as compared to all neighborhoods.


What would that look like in a typical neighborhood?  If you have a neighborhood with 2,000 households (about 5,000 people, at about 2.5 persons per household), and about half are renters and half are homeowners, you would expect that of the 1,000 renting households, about 130 would experience an involuntary move over a two-year period.  If that tract gentrified, you would expect an additional 26 households to experience an “involuntary move.” But you would also expect about 540 renter households to have moved out of the neighborhood in that time, for all reasons, voluntary and involuntary.  These data put the scale of the gentrification effect in perspective. Whether or not they gentrify, there’s going to be enormous change in the renter population of any given urban neighborhood.
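The worked example above reduces to a few lines of arithmetic (the 2,000-household neighborhood is hypothetical; the rates are the study’s):

```python
renter_households = 1_000       # half of a hypothetical 2,000-household tract

involuntary_rate = 0.13         # two-year involuntary move rate for renters
gentrification_effect = 0.026   # added involuntary-move probability when gentrifying
all_moves_rate = 0.54           # two-year move rate for renters, all reasons

baseline_involuntary = renter_households * involuntary_rate    # about 130
extra_involuntary = renter_households * gentrification_effect  # about 26
total_renter_moves = renter_households * all_moves_rate        # about 540
```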

Gentrification has no impact on homeowner moves

Martin and Beck find no evidence that homeowners in gentrifying neighborhoods are more likely to move, either in the aggregate or involuntarily.  They test a number of different models of the connection between gentrification and moving: none produce statistically significant correlations between gentrification and moving; in some cases the (statistically insignificant) correlation is negative: gentrification is associated with fewer homeowners moving from a neighborhood.  Their conclusion: for homeowners, their study “produces no evidence of displacement from gentrifying neighborhoods.”

Property taxes (and tax breaks) seem to have no connection with homeowner movement from gentrifying neighborhoods

One popular argument is that gentrification pushes up property values and results in higher property taxes for homeowners, and that especially for households with a fixed income, the burden of higher property taxes is likely to force them to move. Martin and Beck look closely at this question, and examine how changes in property assessments and property taxes correlate with the probability of moving. They find that there’s no statistically significant link between property taxes and moving in gentrifying neighborhoods.  Several states and localities have enacted property tax or assessment limitations, in part with the objective of lessening the financial exposure of fixed income households to the burden of higher property taxes. Martin and Beck look at the relationship between such limits and the probability of moving, and find that such limits don’t seem to have any effect on whether homeowners move out of gentrifying neighborhoods or not.

While homeowners in gentrifying neighborhoods have to shoulder the burden of paying higher property taxes, it’s typically only because their homes have appreciated more in value. In most cities, property taxes are levied at a rate equal to about 1 to 2 percent of a property’s market value, so the wealth effect of property appreciation dwarfs the negative income effect of having to pay higher property taxes.
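A hypothetical example of the wealth effect versus the tax burden (the $50,000 appreciation figure and the 1.5 percent rate are illustrative assumptions, not numbers from the study):

```python
appreciation = 50_000   # assumed gain in home value in a gentrifying tract
tax_rate = 0.015        # mid-range of the 1-2 percent of market value cited above

extra_annual_tax = appreciation * tax_rate  # about $750 more per year in taxes
# The one-time $50,000 wealth gain dwarfs the recurring $750 tax increase.
```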

Urban renters are a highly mobile group. Most renting households are likely to have changed neighborhoods in the past two years. We observe the same overall level of movement out of neighborhoods whether they gentrify or not.  This study suggests that somewhat more of those moves would be involuntary rather than voluntary.

 

This post has been revised to correct typographical errors, and to replace an earlier data table with charts illustrating the same information.

Louisville’s experiment in transportation economics

As we pointed out yesterday, there’s some initial visual evidence–from peak hour traffic cameras–suggesting that Louisville’s decision to toll its downtown freeway bridges but leave a parallel four-lane bridge un-tolled has produced a significant diversion of traffic away from the freeway. Perhaps without knowing it, Louisville has embarked on an interesting and useful economic experiment.

One of the big questions in transportation economics is what value people attach to travel time savings: How much is it worth to me to shave five or ten minutes off my daily commute?  There are a lot of theoretical arguments about the value, but there’s nothing quite like an actual experiment which gives people real world choices and observes the results. And that’s just what Louisville has done. If you’re traveling across the Ohio River between Jeffersonville, Indiana and Louisville, Kentucky, you have a couple of choices: you can pay between $1 and $4 and drive across the shiny new multi-lane I-65 bridges on the freeway, or you can use the old US 31 route, and take the 1930s-era Second Street Bridge for free.

At least as of last Tuesday, it looked like a lot of people were choosing the “free” way, instead of the “freeway.” The following photographs were taken by Louisville’s traffic cameras shortly after 5 o’clock local time on January 17th. The left-hand photo shows the new freeway bridges; the right-hand photo shows the Second Street Bridge.

Toll: $1-$4 (left); free (right).

“I just take Second Street,” said Tijuan Howard, who lives in Louisville. “That’s just common sense. And whatever way you need to get to Jeffersonville, you can just take the back streets. People are going to figure that out. It’s not hard.”

 

The choices that actual travelers like Mr. Howard make will tell us a lot about how much monetary value people attach to travel time savings.  In traffic forecasting parlance this decision comes down to the “value of time”: how much travelers value their time on an hourly basis. For a $3 toll to justify a two-minute time savings, one’s time has to be worth about $90 an hour; to justify a $4 toll for two minutes of time savings, one’s time has to be worth $120 an hour. If you face the standard $3 toll, and your time is worth $15 an hour—a standard estimate in travel time studies—you wouldn’t find it economically worthwhile to use the toll crossing unless it saved you about 12 minutes of travel time.
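The implied value-of-time figures in this paragraph come from dividing the toll by the fraction of an hour saved; a minimal sketch:

```python
# Hourly value of time implied by paying a given toll for a given time savings.
def implied_hourly_value(toll_dollars: float, minutes_saved: float) -> float:
    return toll_dollars / (minutes_saved / 60)

# A $3 toll for 2 minutes saved implies a $90/hour value of time;
# $4 for 2 minutes implies $120/hour.
```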

This calculus suggests that it is very likely that the tolls on the I-65 bridge will prompt many motorists to pull off the freeway and use the free crossing. Of course, if a large number of travelers exit the freeway, and take the parallel Second Street route, that will produce congestion, and increase travel times for those avoiding the toll. The limiting factor on this toll-related diversion is likely to be the capacity of the Second Street Bridge (which has two lanes in each direction) and the capacity of the surface streets connecting the Second Street Bridge to I-65 on each side of the river. We would expect delays to increase on this route at peak hours.

We can estimate how long the delays on the Second Street Bridge route are likely to be, given the value of commuter time. If commuters value their time at $15 per hour, then those facing a $4 toll would be willing to put up with about 16 minutes of delay (4/15 × 60 = 16) before they’d be willing to pay a toll to use the bridge; those who had to pay a $3 toll would be willing to tolerate about 12 minutes of delay. Many motorists will attach value to the convenience and certainty of the tolled route, but it would be surprising if the diversion to the Second Street Bridge didn’t result in delays of ten minutes or more compared to using the tolled I-65 crossing at rush hours.
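The delay estimates here invert the same calculation: given an hourly value of time, how many minutes of delay is a toll worth avoiding? A sketch using the figures in the text:

```python
# Minutes of delay a driver would tolerate before paying the toll becomes worthwhile.
def breakeven_delay_minutes(toll_dollars: float, value_of_time_per_hour: float) -> float:
    return toll_dollars / value_of_time_per_hour * 60

# At $15/hour: a $4 toll is worth paying only to avoid 16+ minutes of delay,
# and a $3 toll only to avoid 12+ minutes.
```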

At off-peak hours, the diversion rate is likely to be even higher. Because the toll doesn’t vary by time of day, I-65 users will have to pay the same toll regardless of when they cross. While it might be a comparatively good deal to pay the toll during rush hours when the Second Street Bridge is crowded, at off peak hours, the additional time penalty will be closer to the traffic-free two minute estimate provided by Google Maps.

There’s one more wrinkle here as well: The $2-$4 tolls are for cars; medium and heavy trucks pay much higher tolls–up to $10 to $12 for large trucks.  While for some kinds of deliveries the time savings may be worth the cost of the toll, in most cases neither shippers nor truck drivers are willing to pay extra to save just a few minutes. And the same math applies to trucks as to cars. We can assume that the value of time of a truck is in the $35/hour range (reflecting the compensation of the driver and the operating cost of the truck itself). If the toll is $12, a commercial driver will find it more profitable to get off the freeway and use the Second Street Bridge rather than pay a toll, even if the trip takes as much as 15 or 20 minutes longer than the freeway. According to the project’s Supplemental Environmental Impact Statement, the Second Street Bridge carried about 22,000 vehicles per day, and was operating at about 58 percent of its capacity in 2010.
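The same break-even arithmetic, applied to the truck figures in this paragraph ($12 toll, time valued at $35/hour):

```python
toll = 12.0            # large-truck toll, per the text
value_of_time = 35.0   # $/hour: driver compensation plus truck operating cost

breakeven_minutes = toll / value_of_time * 60  # about 20.6 minutes
# A commercial driver avoids the toll unless it saves roughly 20 minutes,
# consistent with the "15 or 20 minutes longer" comparison above.
```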

In addition, many if not most commercial drivers have no financial or operational advantage from using tolled roads. Independent truckers and many shipping companies are paid a fixed amount per load (based on distance) and as long as they meet delivery deadlines, don’t get paid any extra for time saved in transit. Effectively, an independent trucker may have to pay the toll out of his profit on the trip. Unless he’s under considerable pressure to make a delivery deadline, he may prefer to spend an extra few minutes taking the free route, rather than pay the toll out of his own pocket. A study prepared for the Transportation Research Board concluded:

“ . . . truck drivers stated an extremely low willingness to pay even a token toll for different time savings scenarios . . . a large cross section of the trucking business cannot monetize toll road benefits.”

Also, unlike car traffic, which is highly concentrated during morning and evening rush hours (and in the case of the I-65 bridge, very much a morning-southbound, evening-northbound peak), truck traffic is much more evenly spread throughout the day. Proportionately more trucks will be crossing the river when the Second Street Bridge is less congested, and therefore will be a more attractive route.

There are more than a few ironies to this situation: First, after spending a billion and a half dollars on the new Lincoln Bridge, and doubling the number of freeway lanes crossing the river on I-65, the new bridge will likely carry fewer vehicles for the foreseeable future (through at least 2030) than it did in 2005. Second, a project avowedly designed to reduce congestion will actually lead to regular congestion of the Second Street Bridge, which is expected to see a 20 percent increase in traffic according to the project’s own estimates (though we believe this figure is probably a substantial under-estimate). What toll diversion, and the permanently depressed level of traffic predicted for the I-65 bridges, signals is that a significant fraction of bridge users don’t value the time savings provided by the project enough to pay for them—even though tolls cover less than half of the cost of the bridge improvement project. In short, this is clear economic evidence that the project isn’t economically warranted.

By tolling some of the bridges across the Ohio, and leaving others toll-free, Indiana and Kentucky are conducting a real-world behavioral economics experiment. Over the next few months we’ll be watching to see how travelers respond to the financial incentives they’ve been provided, and how their travel behavior shifts in response. The results will be interesting.

Postcard from Louisville: Tolls Trump Traffic

Tolls cut traffic levels on I-65 in half; So did we really need 6 more lanes?

Last month, we wrote about Louisville’s newly opened toll bridges across the Ohio River.  As you may recall, Kentucky and Indiana completed a major expansion of highway capacity across the Ohio, doubling the I-65 freeway crossing from six lanes to twelve near downtown and adding a beltway bridge to complete the freeway loop around the region. To help pay for the multi-billion dollar project, the two states imposed tolls ranging from $1-4 for cars using I-65 and the new East End/Lewis & Clark Bridge.

But travelers through Louisville still have a toll-free river crossing alternative: the old four-lane Clark Memorial (Second Street) Bridge, just a few hundred yards downstream from the larger, newer toll bridges, continues to be free.

In our earlier commentary, we speculated that the addition of tolls to the freeway bridge, coupled with the presence of a nearby free alternative, would dramatically reduce car traffic.  In the absence of any actual data, we took our first cues on what was happening from rush hour traffic cams showing cars on the I-65 and Second Street Bridges. The tolled freeway bridges were nearly empty, and the Second Street Bridge looked pretty busy. That seemed to confirm our hypothesis.  Nonetheless, several readers took us to task for not waiting until we had actual traffic counts. Well, now we do.

And they confirm what the traffic cameras were telling us: traffic is very light on the newly tolled freeway crossing.  The data from Riverlink (the joint Indiana-Kentucky toll operator) show that average weekday traffic on the two bridges was about 66,000 vehicles per day in January.  To put that number in context, it is about half the level of traffic (122,000 vehicles per day) that used I-65 in 2012–before the new Lincoln Bridge opened. To put that a slightly different way: the two states spent over a billion dollars to double the capacity of the I-65 crossing, and now it is used by about half as many vehicles as used it before.

 

Source: Riverlink, February 1, 2017

This is important because the principal objective of the Ohio River Bridges Project, according to the “purpose and need” spelled out in its environmental impact statement, was to reduce traffic congestion. Unless the river crossing was expanded, the EIS claimed, traffic would have increased to 155,000 vehicles per day, more than 23 percent over capacity, by 2030. Now it’s apparent that with tolling, traffic is actually unlikely to rebound to pre-construction levels.

There’s a lot more to dig into here. One of the big unanswered questions is whether tolling I-65 has worsened traffic congestion elsewhere in Louisville, particularly on the un-tolled Second Street Bridge and nearby streets.  Because Riverlink doesn’t toll this bridge, it doesn’t have data on traffic levels, so we’ll have to wait until traffic counts are published by the two states.

But for the record:  one more set of photos of traffic over the I-65 and Second Street Bridges.  These were taken Monday afternoon February 13 at about 5:20 PM, from the local “TRIMARC” traffic monitoring website.

First, traffic on the tolled Kennedy and Lincoln Bridges (Camera 0333A).

Next, here’s an image of traffic coming off the Second Street Bridge as it enters downtown Louisville.  This picture was captured at 5:25 pm EST (despite the time-stamp on the camera).

 

The data and the traffic cam photos suggest that Louisville has demonstrated a powerful, fast-acting solution for reducing traffic congestion: charge a toll. It’s too bad they found out only after spending in excess of a billion dollars building new road capacity that apparently wasn’t needed or valued by those who travel across the Ohio River each day. Maybe other cities can learn from Louisville’s expensive experiment.

 

The latest from the Louisville traffic experiment

Even with the free alternative closed, traffic is very light on the new I-65 bridges

Time for one of our periodic check-ins on our real world transportation pricing experiment in Louisville, Kentucky.  As you recall, we’ve been watching Louisville closely because, at the end of last year, the city started what amounts to a laboratory experiment in transportation behavior.  Kentucky and Indiana built a new bridge to double the capacity of the I-65 freeway as it crosses the Ohio River near downtown Louisville. At the same time, they put tolls on the I-65 crossing, but not on the nearby Second Street Bridge, an older, four-lane highway bridge that connects Louisville to the Indiana suburbs north of the river.

As we reported in February, the initial month’s worth of data on bridge traffic shows that adding tolls (which run from $1 to $4 for cars) has caused traffic levels to fall by almost half, from about 122,000 vehicles per day to about 66,000.  We showed photographs from area traffic-cams that show rush hour traffic on the tolled bridges almost empty, while traffic was fairly heavy on the free Second Street Bridge.

The latest phase of our experiment came this past weekend, courtesy of “Thunder Over Louisville,” a kind of combined concert, airshow and fireworks display that is held annually. To handle the big crowds that come downtown, and to afford great vantage points, the city closes the Second Street Bridge. It did so on Thursday.  So we looked to see how this affected traffic levels.  (The festival itself didn’t start until Friday night, so Thursday was still a reasonably typical business day.)

As you can see, traffic on the tolled I-65 bridges was still very light.  (This photo was taken about 5 minutes after 5pm on April 20 and comes from the local “TRIMARC” traffic monitoring website.)  The peak-direction traffic, northbound out of downtown Louisville, is moving away from the camera’s vantage point on the right-hand side of the photo.

As noted, the Second Street Bridge was closed, with barricades. This is the bridgehead in downtown Louisville.

What these images suggest is that, even with the nearby free alternative closed, there’s way, way more capacity on I-65 than there is peak hour demand for travel. You can compare these photos to ones we captured two months ago when the Second Street Bridge was open to traffic at rush hour. While ostensibly the crossing was widened from 6 lanes to 12 to eliminate congestion, the real congestion-fighting investment was the decision to ask users to pay just a portion of the cost of widening the road. With tolls in place, drivers have voted with their feet (or perhaps, wheels) that they didn’t really need additional capacity.

There’s another point as well. It isn’t just that traffic has shifted to the “free” alternative. It’s that with tolling in place, many other trips apparently simply evaporated. That tendency of traffic to disappear when there’s a toll is an indication that people have much more flexibility about when, where, and how much they travel than is usually contemplated in policy discussions or travel demand models. The mental model that says traffic levels are some inexorable natural force like the tides, which must be accommodated or else, is just wrong.

Of course, photos of one moment in time (even at the height of what should be rush hour) are hardly the best evidence of how well a bridge is being used.  For that, we need actual traffic counts–the kind of data that would be generated, for example, if you had cameras, and license plate readers and transponder readers on a bridge, which is exactly what we have on I-65. But while Riverlink, the toll-bridge operator, quickly posted January’s weekly traffic counts on February 1, its website has no new traffic data for the intervening two-and-a-half months. In fact, the Riverlink website, which had featured more or less weekly press releases since November, contains no new press releases since that February 1 traffic data release.

There are many reasons why they might not yet have pulled together the data, and we’ll be watching eagerly to see when it becomes available.  But we couldn’t help but notice a story that appeared in local press just last week. It turns out that the Kentucky Transportation Cabinet has just signed a $300,000 contract with a Florida consulting firm to help it “determine whether the toll revenue generated by the RiverLink bridges is enough to make debt payments on the project’s bonds.” The financing of the widened I-65 crossing (and another beltway freeway crossing several miles to the East) hinges on tolls generating enough revenue to repay the bonds that Kentucky issued to pay for the project. If toll revenues don’t grow fast enough in the years ahead, the state will have to find some other source of funds to make these payments, which could make this particular experiment in transportation behavior a particularly expensive one. Stay tuned!

 

 

 

 

Who pays the price of inclusionary zoning?

Requiring inclusionary housing seems free, but could mean less money for schools and local services

Last month, the Portland City Council voted 5-0 to adopt a sweeping new inclusionary housing requirement for new apartment buildings. The unanimous decision came with the usual round of self-congratulatory comments about how they were doing something to address the city’s housing affordability problem. No one mentioned that they were also in effect voting to cut funding for schools or other local government services. But in Portland’s case, that’s exactly what the inclusionary program is likely to do, according to the city’s own budget office.

On the surface, one of the compelling policy attractions of inclusionary zoning (“IZ”) is that it doesn’t seem to cost any money:  You require developers to build one or two affordable housing units for every ten new apartments that they build.  Maybe your city offers up a density bonus, or expedites permit handling, but unlike conventional public housing, the city doesn’t have to lay out any of its own cash to get more new affordable housing. That’s why Evan Roberts of StreetsMN described it as politically understandable, though terrible policy.

Portland’s new ordinance is one of the nation’s most demanding inclusionary housing requirements.  Basically, the city will require that all new apartment buildings of 20 or more units set aside 20 percent of their units for renters earning no more than 80 percent of the region’s median household income (about $56,000).  Alternatively, developers could set aside 10 percent of their units for households earning less than 60 percent of the region’s median household income. (As we’ve noted at City Observatory, most cities have far lower inclusionary requirements, offer exceptions, or apply the requirement only to newly up-zoned properties or to projects receiving city subsidies.)

The city’s plan includes the usual list of non-cash aid to developers–lighter parking requirements, faster permit processing and density bonuses–although there’s considerable dispute as to whether these effectively allow developers to build more than they would have otherwise.

All of the analyses of the city’s inclusionary zoning plan concluded that unless the city offset the cost to developers, fewer units would get built. As a result, in addition to regulatory concessions, the city is also assuming that new apartment buildings would be subsidized via a property tax exemption. And, as it turns out, this is where inclusionary housing starts to get costly for the public sector.

Portland’s plan offers up two levels of property tax exemption. For most new apartment buildings, the property tax exemption would apply only to the affordable housing units. For developments with a FAR of 5.0 or more (meaning, for example, that a developer is building a 50,000 square foot or larger building on a 10,000 square foot lot), which would typically be an apartment tower, developers would get a property tax exemption for all of the apartments in the building.

The amount of revenue foregone due to the tax exemptions is difficult to estimate.  It depends on whether developers choose to set aside 10 percent of their units for families at 60 percent of median income or 20 percent of their units for families at 80 percent of median income. It also depends on how many units are actually built.  Even with property tax breaks and other incentives, many developers argue that it will no longer be financially attractive to build new apartments in Portland.  The City Budget Office has developed estimates of lost property tax revenue from the inclusionary housing program based on the assumption that developers will mostly go the 10/60 route (which minimizes their construction costs and gets them the largest property tax benefit per affordable unit). They also assume that the IZ program doesn’t impair housing construction–that the city builds as many units as its Comprehensive Plan calls for between now and 2035.  Under these assumptions, the IZ program will cost the city $15.8 million in tax and fee revenue per year.

The total cost of the property tax breaks and city fee waivers per unit of affordable housing ranges from about $21,000 to almost $220,000 per unit.  But the cost to the City of Portland is far less than this amount, because most of the foregone property tax revenue is lost to other local property taxing entities, including K-12 schools, community colleges, Multnomah County and a handful of other local governments.  In essence, the City Council voted to have these other taxing entities pay about three-fourths of the fiscal costs associated with inclusionary housing, with the result that these other governments will have less revenue to pay for schools and other local services.  Here’s the takeaway quote from the City Budget Office:

CBO Analysis: The proposed policy would result in an estimated per-affordable unit cost that ranges from $20,787/unit to $218,663/unit, depending on project location and incentive package selected. The cost to the City General Fund is less – ranging from $4,674/unit to $57,529/unit – due to the property tax exemption costs being spread across schools, the County and other local public agencies.

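The CBO’s own per-unit figures make the cost-shifting easy to verify. Here’s a quick back-of-the-envelope sketch in Python (the dollar amounts are the CBO’s; the percentage arithmetic is ours):

```python
# Back-of-the-envelope check of the CBO's cost-sharing figures:
# what share of the per-affordable-unit cost falls on taxing
# jurisdictions other than the City's General Fund?
scenarios = {
    "low end":  {"total": 20_787, "city_general_fund": 4_674},
    "high end": {"total": 218_663, "city_general_fund": 57_529},
}

for name, s in scenarios.items():
    other_share = 1 - s["city_general_fund"] / s["total"]
    print(f"{name}: {other_share:.0%} of the cost falls on schools, "
          f"the county, and other local agencies")
# low end: 78% ...
# high end: 74% ...
```

In both scenarios, roughly three-quarters of the foregone revenue falls outside the City’s own general fund–which is the basis for the “three-fourths” figure above.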

Whether or not developers will get this volume of tax exemptions is still in doubt. The city doesn’t have exclusive jurisdiction over property tax exemptions, and by mutual agreement with Multnomah County, the city has agreed to abide by a $3 million annual cap on revenue lost to property tax exemptions.  The City and County will face a major dilemma in the months ahead as the program goes into effect. If the $3 million cap isn’t lifted, there won’t be the subsidies needed to make the inclusionary zoning program attractive enough for developers, and the city’s housing supply will suffer. If the $3 million cap is lifted, the city–plus the county, schools, and other local governments–will have a significant revenue shortfall to make up.

The time-tested adage of economists is “There’s no such thing as a free lunch.”  And when it comes to inclusionary zoning we might well add:  “There’s no such thing as free affordable housing.”

Has Louisville figured out how to eliminate traffic congestion?

Louisville is in the transportation world spotlight just now.  It has formally opened two big new freeway bridges across the Ohio River, and also rebuilt its famous (or infamous) “spaghetti junction” interchange in downtown Louisville. A story at Vox excoriated the decision to rebuild the interchange rather than tear out the riverfront freeway as “a testament to how cars degrade cities.” In our first story about the Louisville area’s new $3 billion bridge project, we described how a bizarre toll structure will actually encourage wasteful driving, and probably lead to periodic congestion.

But there’s another problem with the Louisville bridges: Only half the highway crossings are tolled. The main, north-south I-65 crossing, now encompassing twelve traffic lanes (split between the new Abraham Lincoln Bridge and a renovated Kennedy Bridge), will be tolled, as will a new “East End Crossing,” about eight miles to the north. But two other highway bridges–the Sherman Minton Bridge, carrying I-64 from the west, and the venerable 1920s-era Clark Memorial Bridge (usually referred to as the Second Street Bridge)–continue to be free.

Louisville’s Clark Memorial aka Second Street Bridge (Wikipedia)

The other tolled route is the new East End Crossing, far from existing routes. But the I-65 crossing–carried on the parallel Kennedy and Lincoln bridges–is just a few hundred feet upriver from the Second Street Bridge. While regular commuters will pay just a dollar a trip to use I-65 (provided they cross it at least forty times a month), occasional users will pay three or four dollars each way if they don’t sign up for a transponder and instead rely on toll operator RiverLink to capture an image of their license plate.

According to Google Maps, taking the Second Street Bridge rather than the I-65 bridge adds about two minutes to a car trip from Jeffersonville, Indiana, across the river to Louisville. So the question many motorists will face is: is it worth $3 or $4 to save two minutes on their trip?
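One way to frame that choice is as an implied value of travel time. A rough sketch (the $3–$4 video tolls and the two-minute detour are the figures cited above; the calculation itself is our illustration):

```python
# What hourly value of time makes a $3-$4 toll worth a two-minute saving?
toll_low, toll_high = 3.0, 4.0   # per-crossing video tolls, dollars
minutes_saved = 2.0              # Google Maps detour estimate

def implied_hourly_value(toll, minutes):
    """Dollars per hour a driver implicitly values time at, if the toll pencils out."""
    return toll * 60 / minutes

print(implied_hourly_value(toll_low, minutes_saved))   # 90.0
print(implied_hourly_value(toll_high, minutes_saved))  # 120.0
```

In other words, a transponder-less driver would need to value their time at $90 to $120 an hour for the tolled crossing to make sense–which helps explain the traffic patterns described below.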

The big question here is:  How many commuters will take the Second Street Bridge to avoid paying tolls on the new Lincoln and Kennedy I-65 bridges? Of course, it will be months–or longer–before we get the detailed and definitive traffic counts that will enable us to answer that question, but given our curiosity about the results of this economic experiment, we decided to take a quick peek at the traffic cams for the two facilities. This is of course an inexact and unscientific comparison, but the results are interesting.

In Louisville, Kentucky’s Transportation Cabinet operates a website called “TRIMARC”–“Traffic Response and Incident Management Assisting the River City”–which has a network of traffic cameras monitoring the region’s principal arterials and freeways. On January 17, shortly after 5pm, we captured images of the traffic on the Second Street Bridge and the Kennedy and Lincoln Bridges from this website. Tuesday, January 17th was an ordinary business day, following Monday’s Martin Luther King Day holiday, and the weather was clear and dry, with temperatures in the mid-50s. Tolls had been in effect on the bridges for almost three weeks–since December 30.

First, here’s an image of traffic just north of the Kennedy and Lincoln I-65 bridges at milepost 0.1 in Indiana. This view is from Camera 3869, looking north toward Indiana, with the southbound travel lanes (entering Louisville) on the viewer’s left and the northbound travel lanes (leaving Louisville) on the viewer’s right. The metal lattice structure stretching across the freeway in the foreground is the gantry holding the bridge’s toll collection cameras and sensors. This picture was taken at 5:07 pm EST.

I-65 Bridges at 5:07PM January 17 (TRIMARC Camera 3869).

Each of these bridges is striped for five lanes of traffic at this point. As you can see, traffic is extremely light. (For you highway buffs, this is Level of Service “A”.)

Next, here’s an image of traffic coming off the Second Street Bridge as it enters downtown Louisville. The vehicles on the left are southbound into the city; the vehicles on the right are northbound, leaving the city. This picture was captured at 5:05 pm EST.

Second Street Bridge at 5:05PM, January 17, (TRIMARC Camera 065)

The Second Street Bridge has four travel lanes–two in each direction.  As the two Southbound lanes enter downtown Louisville (on the left of this image) they branch into two right turn lanes and two through lanes.  Although this camera is set at a much lower height, and therefore shows a shorter segment of roadway than the I-65 camera above, there are clearly more vehicles shown crossing the Second Street Bridge than are crossing the I-65 bridge at nearly the same time. (The truck in the center of the picture is saving $12 compared to the cost of driving over the nearby I-65 bridge).

This very anecdotal, if visual, information suggests a couple of hypotheses.  First, it does appear that a fairly large segment of traffic crossing the Ohio River in Louisville on this particular afternoon chose the older, slower and non-tolled route over the newer, faster and more expensive tolled freeway bridges. Second, it seems like there is plenty of capacity crossing the Ohio River at this particular point to accommodate all these vehicles. Not only is the freeway nearly deserted, but traffic appears to be well below capacity on the Second Street Bridge as well.

Of course, this is a small sample and a highly unscientific set of observations. But taken at face value, these pictures call into question the decision to spend several billion dollars to increase highway capacity over the Ohio River. If so few cars are actually crossing the river at peak hour on a typical business day, and if such a relatively large proportion of them apparently prefer to use the older, slower route rather than pay to use the fancy new crossing, did the states of Kentucky and Indiana have any reason to spend so much to build a giant new bridge?  And will the two states, who are counting on toll revenues to pay back a major share of the cost of the project, be able to cover their debts?

There may be some very good news here. The image of I-65 is compelling evidence of how to alleviate, if not completely eliminate, peak-hour traffic congestion:  Charge a toll. When faced with a positive price for driving (the toll is $1 each way for regular commuters), apparently very few people want to drive on the freeway. As we’ve long maintained at City Observatory, traffic congestion is a direct consequence of charging too low a price to road users. Kentucky and Indiana have apparently demonstrated that a very modest level of tolling can work wonders for alleviating traffic congestion. It’s just too bad that they also had to spend several billion dollars for highway capacity that motorists apparently don’t want in order to find that out.


Housing supply is catching up to demand

As Noah Smith observed, economists invariably encounter monumental resistance to the proposition that increasing housing supply will do anything meaningful to address the problem of rising rents–especially because new units are so costly. One of the frustrations that we (and increasingly cost-burdened renters) share is the “temporal mismatch” between supply and demand.  Demand can change quickly, while supply responds only slowly, thanks to the long time it takes to detect a market need, plan for, permit, finance and then finally build new apartments. Rents go up quickly, and economists can only counsel patience while this process is unfolding.

But it’s increasingly apparent that housing supply is now responding. Several cities are recording impressive increases in the number of new apartments permitted and under construction. Take Seattle, which saw rents increase nearly 10 percent in 2015. Like the rest of the country, the city saw a fall off in new construction after 2007 with the advent of the Great Recession. But since then, developers and investors have been pouring money into the local housing market. Nearly 10,000 new apartments are expected to be completed in just the next year. As the Seattle Times highlighted, more apartments will open this decade in Seattle than in the previous half century.

Source: Seattle Times

Even though these apartments haven’t been completed yet, the growing supply, and the prospect of more units coming on line is already having an effect on prices. According to the Seattle Times, one local market analyst is saying apartment rents have reached a turning point; rents in some of that city’s hottest neighborhoods have declined between 3 and 4 percent in the past year.

Just about three hours south, down Interstate 5, Portland’s housing market has seen a similar cycle, with events lagging about six months to a year or more behind Seattle.  Just a year ago, Portland had the highest rent increases in the nation–12.4 percent year-over-year in October 2015–according to real estate firm Axiometrics. But since then builders have been moving aggressively to add more apartments. The city has permitted between 3,000 and 4,000 apartments in each of the last two years.  According to local market analysts Barry and Associates, 7,000 units are currently under construction and an additional 15,000 are in the planning stages. In Portland, as in other cities, supply is responding, in a big way, to past rent increases.

To be sure, the supply response hasn’t actually produced declines in rent levels–yet. But as more and more new units come on line, and their owners seek tenants, the balance of power in the rental marketplace will shift from sellers to buyers.

Rising rents have generated a tangible response.

The recent sharp growth in supply comes, ironically, just as the City of Portland has enacted one of the nation’s most demanding inclusionary zoning requirements. Developers building 20 or more apartments will be required to set aside 20 percent of their units for households earning no more than 80 percent of the region’s median household income. (Indeed, a big part of the reason for the recent surge of applications to build new apartments has been developers seeking to obtain building permission prior to the February 1, 2017 effective date of the new regulations). Almost exactly the same thing happened in San Francisco, as developers scrambled to beat the deadline for that city’s more stringent inclusionary zoning requirements.

There’s a growing body of evidence that as more housing gets built, rent inflation moderates, and rents even decline. Nationally, Axiometrics sees a slowdown in rental growth, with actual declines in some markets. In Houston, New York and the San Francisco Bay Area, growing inventories have produced “negative rent growth.” Last month the Chicago Tribune reported that while rents were up about 2.3 percent citywide, they were down about 0.3 percent inside “The Loop.”  The New York Times even floated the notion that 2017 will be “The Year of the Renter” as growing supply takes the edge off rent increases in New York.

For the moment, then, the good news is that in these coastal cities, where rents have been rising at alarming rates, the forces of supply and demand are operating. As more apartments are completed–even at the high end of the market–the added supply is dampening the rate of rent increases. The key question going forward is whether the demanding inclusionary requirements enacted in places like Portland and San Francisco–and still pending in Seattle–will prompt developers to pull back from the current frenzied pace of building.

Pollyanna’s ride-sharing breakthrough

A new study says ride-sharing apps could cut traffic 85 percent. We’re skeptical.

We’ve developed a calloused disregard for the uncritical techno-optimism that surrounds most media stories about self-driving cars and how fleets of shared-ride vehicles will neatly solve all of our urban transportation problems.

But a new story last week re-kindled our annoyance, because it so neatly captures three distinct fallacies that suggest that fleets of shared autonomous vehicles, if actually deployed in the real world, would produce a dramatically different outcome than the one imagined.  The story in question last week described a new study which reportedly proved that ride-hailing apps could reduce traffic congestion by 85 percent. The headline at Mashable was typical:

And don’t blame the writers at Mashable. Their interpretation closely mirrors a florid press release produced by MIT, which summarized the study as follows: “One way to improve traffic is through ride-sharing – and a new MIT study suggests that using carpooling options from companies like Uber and Lyft could reduce the number of vehicles on the road 75 percent without significantly impacting travel time.”

These can be replaced. Mathematics proves it.

The press stories were based on a new paper entitled “On-demand high-capacity ride-sharing via dynamic trip-vehicle assignment,” written by Javier Alonso-Mora, Samitha Samaranayake, Alex Wallar, Emilio Frazzoli and Danila Rus, a team of engineers and computer scientists from MIT and Cornell. Their key finding:

Our results show that 2,000 vehicles (15% of the taxi fleet) of capacity 10 or 3,000 of capacity 4 can serve 98% of the demand within a mean waiting time of 2.8 min and mean trip delay of 3.5 min.

Despite the press release announcing the study–entitled “Study: Carpooling apps could reduce traffic 75 percent”–the study isn’t referring to all traffic; it’s just referring to New York City’s existing taxicab fleet. What the study is really saying is that 2,000 ten-passenger vehicles (or 3,000 four-passenger ones) with a sophisticated real-time ride-hailing and vehicle-assignment system could provide just as many trips as 14,000 yellow taxis, many of which are idle or simply cruising Manhattan looking for one- or two-person fares headed to a single destination. Some of the gain comes from higher occupancy (up to 10 passengers, rather than mostly a single passenger), and the rest comes from better dispatching (fewer miles driven with an empty vehicle). So far, so good. But going from this observation to the conclusion that this will solve our urban congestion problems quickly runs afoul of three major fallacies.
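To see where the 75-to-85-percent headlines come from, it helps to make the arithmetic explicit–it involves only the taxi fleet (the fleet sizes are from the paper’s abstract, quoted above):

```python
# How much smaller is the study's optimized fleet than New York's
# roughly 14,000-taxi fleet?  These ratios are the source of the
# "reduce traffic 75 (or 85) percent" headlines.
taxi_fleet = 14_000

for vehicles, capacity in [(2_000, 10), (3_000, 4)]:
    reduction = 1 - vehicles / taxi_fleet
    print(f"{vehicles:,} vehicles of capacity {capacity}: "
          f"{reduction:.0%} fewer vehicles than the taxi fleet")
# 2,000 vehicles of capacity 10: 86% fewer vehicles than the taxi fleet
# 3,000 vehicles of capacity 4: 79% fewer vehicles than the taxi fleet
```

Note that both figures describe a shrunken taxi fleet–not an 85 percent reduction in all Manhattan traffic.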

The fixed demand fallacy

Autonomous vehicle designers may be using LIDAR, imaging, vehicle-to-vehicle communication and prodigious computing power, but down deep they’re still engineers, and they’ve apparently given no thought whatsoever to induced demand.  Just as highway engineers have assumed that there’s a fixed demand for travel and that highways need to be sized accordingly, ignoring the effect of new capacity in stimulating added travel, the MIT study assumed that the current level of taxi use exactly captures future travel demand. It’s worth noting that the demand for taxis is limited, in large part, because New York City has long regulated the number of licensed cabs via its medallion system.

There’s no reason to believe the demand for 10- and 4-passenger vehicles would be restricted to just those who currently patronize cabs. Taxis handle about 360,000 rides in Manhattan daily; about 2.8 million people travel to or from Manhattan by public transit.  If there were suddenly a viable on-street ride-sharing option–especially if it were cheaper–the system could face much more demand, which could swamp the congestion-reducing benefits.

The big urban transportation challenge is not simply optimizing a pre-determined set of trips; it’s coping with the complex feedback loops that produce a fundamental law of road congestion. This study glosses over that inconvenient truth.

The big data fallacy

A big part of what propels the illusion of fixed demand is our second fallacy:  big data.  Thanks to GPS systems in taxis, cheap telecommunications, and abundant computing power, we now live in a world where we can easily access copious data on the origin and destination of every one of the several million annual taxi trips in New York City.  While the data is massive, it isn’t infallible or immutable: it simply reflects the decisions that travelers made with a particular technology (taxis), a particular set of prices, and the land uses, congestion levels and alternatives that were in place at the time. It may be richly detailed, but it’s dumb: it tells us nothing about how people would behave in a different set of circumstances, with different technology and different prices.  And as big as the dataset used here is, it leaves out the overwhelming majority of travelers and trips in New York–those who travel by train, bus, bike and foot. As we’ve suggested at City Observatory, the presence of highly selective forms of “big data” is a classic “drunk under the streetlamp” problem that focuses our attention on a few selected forms of travel, to the detriment of others. Optimizing the travel system for a relatively small segment of the population–the one for which we have rich data–doesn’t prove that this will work in the real world.

The mathematical model fallacy

The third fallacy is the mathematical model fallacy. A mathematical model can be a useful tool for sussing out the scale of problems. But in this case, it involves abstracting from and greatly simplifying the nature of the system at work. Yet press accounts, like those at Mashable, are awestruck by the authors’ use of a mathematical model:

This study, released Monday, used a mathematical model to figure out exactly how vehicles could best meet demand through ride-sharing.

and

“There’s a mathematical model for the autonomous future.”

The authors have constructed a very sophisticated route-setting and ride-matching algorithm (taking that big data on origins and destinations as a starting point) and figured out how many 10-person and how many 4-person vehicles it might take to handle all the 14 million trips. This requires formidable math skills, to be sure. And let’s take nothing away from the authors’ technical prowess: they’ve figured out how to solve a very complicated and huge math problem almost in real time. But simply using math to model how this might work doesn’t prove that people would actually use such a system.

Consider another possibility. Suppose we banned private cars and taxis in Manhattan and increased the number and frequency of city buses five-fold.  A bus would go by every bus stop every two or three minutes. And bus travel times would be faster, because there’d be no private cars on the road. We could probably carry all those taxi trips with even fewer vehicles.  One could even construct a mathematical model to evaluate the efficiency of this system. If we extrapolate from the MIT paper, it seems likely that if 2,000 ten-person vehicles or 3,000 four-person vehicles can replace 85 percent of the taxi fleet, then it’s probably not a stretch to suggest that maybe 1,000 forty-person buses could do the same thing.
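The bus extrapolation is, at bottom, just seat arithmetic. A sketch (the two small-vehicle fleets are the study’s; the 1,000-bus fleet is our illustrative extension):

```python
# Total passenger seats in each fleet configuration.
fleets = {
    "2,000 ten-person vehicles":  2_000 * 10,
    "3,000 four-person vehicles": 3_000 * 4,
    "1,000 forty-person buses":   1_000 * 40,
}
for name, seats in fleets.items():
    print(f"{name}: {seats:,} total seats")
# 2,000 ten-person vehicles: 20,000 total seats
# 3,000 four-person vehicles: 12,000 total seats
# 1,000 forty-person buses: 40,000 total seats
```

On raw seat capacity, the hypothetical bus fleet matches or exceeds either of the study’s shared-vehicle fleets with far fewer vehicles on the road–which is the point of the thought experiment.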

Would the dogs actually eat this dog food?

It’s worth asking a simple real-world question: Would all or many of the people currently traveling in taxis agree to travel in ten-person or even four-person shared vehicles? If all that people were paying taxi fares for was travel between points A and B, maybe so. But there are many reasons to think they wouldn’t. For starters, travelers in pooled vehicles get a slower trip: on average a pooled trip is going to take three and a half minutes longer, according to the MIT study. And people taking taxis are often paying for much more than just getting from A to B. In addition to getting to their destination as quickly as possible, they may also want the amenities of a dedicated vehicle, such as privacy–they don’t want to share their ride with other people (even if it costs less). As David King notes at CityLab, taxis are a premium service in the world of urban transportation. Especially, what riders may be paying for is greater certainty that they are getting reliable, high-priority service. The authors report that the average wait time for a shared vehicle would be 2.8 minutes and the average added travel time would be 3.5 minutes, but some riders would face longer waits and greater delays. Avoiding that uncertainty and variability is a big part of why people pay for a solo taxi. And finally, it may be about status–being driven as the only occupant in a car. Oftentimes it’s faster to travel between most points in Manhattan by subway, yet many people take a taxi or Uber because of just these other considerations: comfort, convenience, priority, privacy and status.

Here’s an analogy: If Per Se or Momofuku Ko (two of New York’s swankier restaurants) served all of their food cafeteria style, they could dispense with waiters and serve 200 percent more diners than they do today. A good MBA student could probably produce a pro forma that could calculate, to the penny, how much more profit the owners would make. But it’s likely that high-end restaurant patrons–like taxi customers–are paying for something more than just the basics, and that in the real world, this wouldn’t attract many customers.

It’s unfortunately too easy to oversimplify the nature of the urban transportation problem. We tend to be beguiled by new technology and blinded by big data in ways that lead us to overlook some fundamental questions about, for example, geometry. As we look to implement new technologies, like autonomous vehicles and shared-ride services, we need to remember some of the hard-earned lessons about things like induced demand.

Beer and cities: A toast to 2017

Celebrating the new year, city-style, with a local brew

Champagne may be the traditional beverage for ringing in the new year, but we suspect that a locally brewed ale may be the drink of choice for many urbanists today. Much has changed about American beer in the past two decades. Most of the post-prohibition era was characterized by the industrialization of beer-making and the consolidation of the brewing industry; the number of breweries in the US fell from roughly 500 at the close of World War II to only about 100 in 1980. But since then, consumers have turned away from the big national brands and increasingly patronize local, even neighborhood micro-brewers, who offer a wide array of ales, stouts, porters and other distinctive brews, often featuring local ingredients.

’tis the Saison – Happy New Year.

While the overall market for beer grew only about 0.6 percent in 2015, the consumption of craft beer grew by more than 10 percent, according to Nielsen. In the past five years, the market share of craft brewers has doubled, from less than 6 percent of beer sold to more than 12 percent, according to the Brewer’s Association. Now avid beer drinkers in cities around the country have dozens–and in some cases hundreds–of local brews to choose from. And at City Observatory, that set us to asking:  which city has the largest concentration of micro-breweries?

Which cities have the greatest density of microbreweries?

To answer this question, we turned to BreweryMap.com, which maintains a comprehensive database of the locations of US microbreweries. The site allows users to search for breweries by location, and we used its on-line search tool to look for all microbreweries within 5 miles of the city center of each of the 53 principal cities of metropolitan areas with more than a million population. Using a standard five-mile radius eliminates the variation in computed density that would arise from the very different geographic areas included within city boundaries. Here’s the BreweryMap of Portland.

Portland has 89 microbreweries within this 5 mile circle, the highest density of microbreweries of any large city in the United States.  The following table shows the density (number of microbreweries within 5 miles of the city center) for each of the 53 largest metro areas in the US.  Denver and Seattle rank second and third, respectively, followed by Chicago and New York.  Virginia Beach and Hartford have the lowest density of microbreweries in their urban cores.
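For the record, the density calculation is straightforward: a five-mile radius encloses about 78.5 square miles, so Portland’s count works out to a bit more than one microbrewery per square mile (the 89-brewery count is from the BreweryMap search above; the conversion is ours):

```python
import math

# Breweries per square mile inside the standardized 5-mile search radius.
radius_miles = 5
area_sq_miles = math.pi * radius_miles ** 2   # ~78.5 square miles

portland_breweries = 89
print(round(portland_breweries / area_sq_miles, 2))  # prints 1.13
```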

Consumer survey data confirm the same geographic pattern of preferences for micro-brewed beer.  Scarborough Research, an arm of Nielsen, reports that Denver, Portland and Seattle had the highest reported rates of micro-brew consumption of any US metropolitan areas. About 5 percent of American adults have had a micro-brewed beer in the past month; in Denver and Portland, the figure is about two and a half times higher: 13 percent.

In an era in which so much of what we consume is commoditized and globalized, it’s nice to see a distinctive local product flourishing in so many places around the country. Here’s to a Happy New Year!

Our ten most popular posts of 2016

As 2016 draws to a close, we look back at our most popular commentaries of the year.  Here they are, in reverse order:

#10. Introducing the sprawl tax

#9. Urban myth busting: New rental housing and median income households

#8. What filtering can and can’t do.

#7.  What I learned playing Sim City

#6. In some cities, the housing construction boom is starting to pay off

#5. Reducing congestion: Katy didn’t

#4. US DOT to shut down nation’s roads, citing safety concerns

#3. The Storefront Index

#2.  Housing can’t be a good investment and affordable.

#1.  The illegal city of Somerville

We’ll see you all again in the New Year!

Denver backs away from inclusionary zoning

At the top of most housing activist wish-lists is the idea that cities should adopt inclusionary housing requirements:  when developers build new housing, they ought to be required to set-aside some portion of the units–say 10 or 20 percent–for low or moderate income families.  Dozens of cities around the country have adopted some variant of the inclusionary idea.

Portland’s City Council is weighing adoption of an inclusionary housing requirement that would be among the nation’s most stringent:  it would require all multi-family developments of 20 or more units to set aside 20 percent of newly constructed apartments for families earning no more than 80 percent of the region’s median household income.  Unlike inclusionary zoning programs in many other cities, like New York and Chicago, which apply only when a developer is seeking an up-zoning or has some form of public subsidy, Portland’s ordinance would apply to virtually all development, including projects that seek only to build at density levels already authorized by the zoning code.

One of the principal arguments advanced by proponents of the ordinance is the policy wonk version of the “all the other kids are doing it,” refrain well known to parents everywhere. For example, in testimony to the Portland City Council on December 13, Professor George Galster assured the city council that inclusionary zoning was a well-established practice, in use widely around the country for more than forty years, concluding:

. . . they are in operation in hundreds of cities and counties across the United States, including fast-growing Portland-sized places like Denver and Minneapolis.

(Portland City Council Video, December 13, 2016, at 56:30)

In a narrow statistical sense, that statement is mostly true:  lots of places have adopted something they call “inclusionary zoning” or “inclusionary housing.”  But that appellation is applied to a wide range of programs, most of them tiny or toothless. As we’ve reported at City Observatory, there’s less to most inclusionary zoning programs than meets the eye: while impressive sounding on paper (and perhaps in the press), they tend to produce very few units of new housing, typically due to their limited scope and discretionary application.

And in the case of Denver and Minneapolis, the two instances specifically cited by Dr. Galster, there’s even less than meets the eye.  Minneapolis does not in fact have an inclusionary housing requirement, although it does have a voluntary density bonus for developments that include affordable housing (which no developer has apparently ever used).  And, as of September, Denver has repealed its inclusionary housing requirement.  Section 27-105(a) of the city’s development code had required some new developments of 20 or more units to set aside 10 percent of newly added units for households earning less than 80 percent of the area’s median income.  That requirement is repealed effective January 1.  (For what it’s worth, as we reported at City Observatory earlier, the Denver program had produced a paltry 77 units since it was established in 2002.)  Here’s the pertinent City of Denver ordinance:

In its place, Denver has adopted a new Permanent Housing Trust Fund, which will provide an estimated $15 million per year for the next decade to help acquire and rehabilitate low and moderate income housing.  The fund will get revenue from a city-wide property tax as well as “linkage fees” on a wide variety of new development projects, including residential and commercial development. This approach was designed explicitly to spread the burden of subsidizing housing as widely as possible and to avoid creating disincentives to new residential development.  And for those who think Portland is somehow lagging Denver in promoting housing affordability, Portland’s recently approved housing bond of $258 million is actually larger than Denver’s new fund.

As a legal and policy matter, a wide variety of ordinances and programs clothe themselves in the appealing term “inclusionary housing.”  But here especially, the devil is in the details. Even Mayor Bill de Blasio’s vaunted “Mandatory Inclusionary Housing” requirements apply only if developers seek up-zoning.

Here’s why this matters: Advocates are arguing that the experience of all these other places shows that inclusionary requirements have no negative effects on new privately financed housing construction. But if the programs in New York, Chicago, Denver and Minneapolis are so much smaller, are voluntary, have been repealed or simply don’t exist, then they provide no evidence that the program being proposed in Portland will not greatly reduce new housing construction–and thereby exacerbate the city’s housing shortage, and actually worsen rent inflation.

When advocates sweep these substantive policy differences under the rug, and don’t acknowledge the limited scope of real-world inclusionary programs–as well as significant back-sliding from inclusionary zoning, as in Denver–they’re mis-informing policy makers.  As we pointed out earlier this month, the scope of the Portland program is much broader than virtually every other extant inclusionary zoning program, and it is highly likely to have a devastating effect on new housing construction. Ultimately, details matter, and sweeping claims that elide the great variation in policies carrying the appellation “inclusionary” are misleading; no better than an eight-year-old claiming that “all the other kids do”–when in fact they don’t.

Irony Squared: Inclusionary Zoning Edition

Minneapolis is considering inclusionary zoning (IZ), but has qualms based on Portland’s experience. Ironically, a non-existent Minneapolis IZ program was a key part of the argument for adopting Portland’s IZ law in December 2016.

Parts of this commentary are going to seem like a major-league distortion in the space-time continuum, so let’s start with a simple fact:  the City of Minneapolis doesn’t have an inclusionary housing requirement–it never has.

Regular readers of City Observatory will know  inclusionary housing requirements are a favored policy of many housing advocates:  when developers build new housing, they ought to be required to set-aside some portion of the units–say 10 or 20 percent–for low or moderate income families. Dozens of cities around the country have adopted some variant of the inclusionary idea.

There’s a terrific story in MinnPost, the Twin Cities on-line public policy journalism enterprise, describing the upcoming deliberations of the Minneapolis City Council, which is considering adopting some version of IZ. According to MinnPost, the decision is very much informed by recent reports on the struggles surrounding Portland’s new IZ program.

In December 2016, Portland’s City Council enacted a citywide inclusionary housing requirement that is among the nation’s most stringent:  beginning in February 2017, it required all multi-family developments of 20 or more units to set aside 20 percent of newly constructed apartments for families earning no more than 80 percent of the region’s median household income.  A new city staff report and several press reports (here and here) suggest that the inclusionary zoning program has brought new apartment proposals in Portland to a near standstill. Wisely, the city leaders of Minneapolis want to understand what’s happening in Portland so they can avoid a similar problem if they move forward.

And here’s the irony: back in 2016, advocates for Portland’s IZ program argued specifically that IZ was up and running in Minneapolis, proving that it had few, if any adverse effects. You might call it  the policy wonk version of the “all the other kids are doing it,” refrain well known to parents everywhere. For example, in testimony to the Portland City Council on December 13, Professor George Galster assured the city council that inclusionary zoning was a well-established practice, in use widely around the country for more than forty years, concluding:

. . . they are in operation in hundreds of cities and counties across the United States, including fast-growing Portland-sized places like Denver and Minneapolis.

(Portland City Council Video, December 13, 2016, at 56:30)

In a narrow statistical sense, that statement is mostly true, but it is simply wrong about both of the cities specifically mentioned: neither Denver nor Minneapolis had an IZ program in December 2016. Lots of places have adopted something they call “inclusionary zoning” or “inclusionary housing.”  But that appellation is applied to a wide range of programs, most of them tiny or toothless. As we’ve reported at City Observatory, there’s less to most inclusionary zoning programs than meets the eye: while impressive sounding on paper (and perhaps in the press), they tend to produce very few units of new housing, typically due to their limited scope and discretionary application.

And in the case of Denver and Minneapolis, the two instances specifically cited by Dr. Galster, there’s even less than meets the eye.  Minneapolis did not in fact have an inclusionary housing requirement, although it did have a voluntary density bonus for developments that include affordable housing (which no developer had apparently ever used).  And, as of September 2016, Denver had repealed its inclusionary housing requirement.

As a legal and policy matter, a wide variety of ordinances and programs clothe themselves in the appealing term “inclusionary housing.”  But here especially, the devil is in the details. Even Mayor Bill de Blasio’s vaunted “Mandatory Inclusionary Housing” requirements apply only if developers seek up-zoning.

Here’s why this matters: Advocates are arguing that the experience of all these other places shows that inclusionary requirements have no negative effects on new privately financed housing construction. But if the programs in New York, Chicago, Denver and Minneapolis are so much smaller, are voluntary, have been repealed or simply don’t exist, then they provide no evidence that the program Portland enacted would not greatly reduce new housing construction–and thereby exacerbate the city’s housing shortage, which is what it appears to be doing.

When advocates sweep these substantive policy differences under the rug, and don’t acknowledge the limited scope of real-world inclusionary programs–as well as significant back-sliding from inclusionary zoning, as in Denver–they’re mis-informing policy makers.  As we pointed out earlier, the scope of the Portland program is much broader than virtually every other extant inclusionary zoning program, and it is highly likely to have a devastating effect on new housing construction. Ultimately, details matter, and sweeping claims that elide the great variation in policies carrying the appellation “inclusionary” are misleading; no better than an eight-year-old claiming that “all the other kids do”–when in fact they don’t.

Portland didn’t (and of course couldn’t) learn anything from Minneapolis’ non-existent inclusionary zoning program. (It’s not apparent that city leaders even bothered to verify the accuracy of Galster’s claim). Today, though, Minneapolis is fortunate to be in the position to learn from Portland’s mistakes. Thanks to MinnPost for following this story so closely.

More evidence for peer effects: Help with homework edition

There’s a large and growing body of research showing the importance of peer effects on the lifetime economic success of kids.  For example, while the education level of your parents is a strong determinant of your level of education, it turns out that the education level of your neighbors is nearly half as strong a predictor.  Much of this effect has to do with the resources and performance of local schools:  people who live in neighborhoods with lots of well-educated people have schools with more resources and stronger parental support.  And there’s also a fair argument that a better-educated peer group provides access to social networks and role models that shape aspirations and opportunities.

Parental investment. (“Homework helping hand.” Flickr: Pete)

A new University of Chicago working paper from Josh Kinsler and Ronni Pavan underscores another, more subtle way that peer effects operate in schools.  It’s titled:  “Parental Beliefs and Investment in Children:  The Distortionary Impact of Schools.”  We know that one critical factor in explaining student achievement is what education scholars call “parental investment.”  By this they mean the amount of time (rather than money) that parents dedicate to helping advance their child’s learning by, for example, helping with homework, or participating in school activities, or arranging tutoring or extra-curricular learning opportunities.

The study uses data from a national longitudinal survey covering kindergarten, first- and third-grade students, and looks at the connection between parental beliefs about student performance generally, and in math and reading, and the amount of time parents spend helping children do homework and similar activities.

Kinsler and Pavan find that there’s a strong correlation between parental beliefs about their child’s relative performance and their investment in these kinds of time-intensive learning activities.  Parents who think their children are at or above average tend to invest less time in doing things like helping with homework.  And here’s the critical part of their finding:  parents tend to base their assessment of their child’s performance on comparisons with other students in his or her class or school, rather than with other schools, or the state or nation as a whole.  This is mostly unsurprising: parents are going to get most of their information about academic performance by comparing their child to his or her classmates.

But the effect of this “local bias” in comparisons is that parents of students attending low-performing schools will tend to have an inflated assessment of how well their child is doing–relative to all other students.  This over-optimism will lead them to under-invest in helping with homework, and in doing other things to enrich their child’s educational opportunities.  It’s well understood that low-income and single-parent households already start off with more limited time and resources to help support their children’s education.  What this suggests is that given all the competing demands for their time and attention, they may be lulled into a false sense that their children are doing “well enough” in school.  As Kinsler and Pavan conclude:

Parents of low skill children who attend schools where average skill is also low will perform fewer remedial type investments than parents of similarly able children who attend schools where average skill is higher. Because of the tendency for students and families to sort into schools and neighborhoods, low skill children are more likely to attend schools where average skill is also low. As a result, the distortion in parental beliefs generated by local skill comparisons leads to underinvestment for low skill children.

As a result, one of the subtle and pernicious ways that economic segregation and the concentration of poverty influence children’s lifetime incomes is by giving parents (and probably children) too limited a basis for gauging their performance, leading them to under-invest in educational skills.

Urban Transportation’s Camel Problem

There’s a lot of glib talk about how technology–ranging from ride-hailing services like Uber and Lyft, to instrumented Smart Cities and, ultimately, autonomous vehicles–will fundamentally reshape urban transportation. We’re told, for example, that autonomous vehicles will eliminate traffic fatalities, obviate the need for parking lots, and solve transit’s “last mile” problem. But there are good reasons to be skeptical.

As Jarrett Walker has famously pronounced, these would-be alternatives have a geometry problem. Solutions that rely upon trying to put more travelers in lots of smaller, often single-occupancy vehicles will inevitably run out of space in urban environments. In Walker’s view, the space efficiency of mass transit–city buses and rail lines–makes them the only feasible way of moving large numbers of people into, out of, and around big cities.

So a bus with 40 people on it today is blown apart into, what, little driverless vans with an average of two each, a 20-fold increase in the number of vehicles?   It doesn’t matter if they’re electric or driverless.  Where will they all fit in the urban street?  And when they take over, what room will be left for wider sidewalks, bike lanes, pocket parks, or indeed anything but a vast river of vehicles?

No amount of technology can overcome the limits imposed by simple geometry.

There’s a lot of merit to this view.  And too little thought has been given to how technological solutions might scale in real urban environments. Even in New York City, with very sophisticated instrumentation of the taxi fleet and copious reports of activity from Uber and Lyft, there’s no comprehensive assessment of how the growth of these services has affected travel times and congestion, according to Charles Komanoff.

While the geometry problem is real, and under-appreciated, we think these new technological solutions will have to simultaneously face another problem, which we call the “camel problem.”  The demand for urban transportation is not simple and linear.  Walker’s geometry point is that demand for transportation has an important spatial component.  To that we would add that it also has a temporal (time-based) component as well, one that’s well illustrated by our friend, below:

 

Standing in the way of urban transportation reform (Flickr: Adair Broughton)

Like the famous Bactrian camel, urban travel demand has two humps.  There’s a peak travel hour in the morning and a second one in the evening in virtually every large city in the US (and most places around the world).  It seems to be a regular systemic feature of human activity:  we sleep and eat in one set of places, and work, study, shop, and socialize in a different set of places, and disproportionately tend to make trips between these sets of places at the same hours of the day. There’s an abundance of data on this point.  Transportation scholars (Tony Downs’ Still Stuck in Traffic is the definitive guide) and traffic engineering textbooks have documented it for decades.  We observed it by pointing a Placemeter camera outside the window of City Observatory’s offices.

And the latest bit of evidence for the “camel” view of transportation comes from New York City’s bike share program.  Our friends at the New York City Economic Development Corporation have an excellent report summarizing some of the trip data from the CitiBike program, showing, among other things, the average age of riders (skewing toward young adults) and the most frequent routes traveled (more scenic routes along the West Side, and places not well-served by subways, among others).  But the most interesting chart shows when people are riding CitiBikes, by hour of the day.  It’s a camel, too:

The CITIbike Camel.

Just as with other modes of transportation (whether it’s the subway, city streets and bridges, or the bus system), travel exhibits two distinct peaks, one corresponding to the morning travel period, and a second in the late afternoon.  About twice as many bikes are in use in the morning and afternoon peak hours as in the middle of the day.

The “camel” of urban transportation demand has important implications for designing and operating any new system for getting around cities.  For example, a fleet of self-driving cars sized to meet peak hour demand would sit idle much of the day:  except for an hour or two in the morning and two to three hours in the late afternoon, most vehicles would have no passengers to carry.
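The scale of that idle capacity is easy to sketch. The hourly trip counts below are purely illustrative (not drawn from any of the datasets mentioned here); they simply reproduce the two-humped shape, and show what sizing a fleet to the tallest peak does to average utilization.

```python
# Back-of-envelope sketch of the "camel problem."  The hourly trip counts
# below are illustrative, not real data; they just have the two-peaked
# shape described in the text.

demand = [1, 1, 1, 1, 2, 5, 10, 18, 20, 10, 8, 9,     # midnight-11am
          10, 9, 8, 10, 16, 20, 14, 8, 6, 4, 3, 2]    # noon-11pm

fleet = max(demand)                     # size the fleet to the peak hour
avg_utilization = sum(d / fleet for d in demand) / len(demand)

print(f"fleet sized for the peak: {fleet} vehicles")
print(f"average utilization over the day: {avg_utilization:.0%}")
```

With this stylized profile, average utilization comes out around 40 percent: most of the fleet sits parked most of the time.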

While we think that there is merit to both the Jarrett Walker “geometry problem” and our own “camel problem,” it’s actually the case that the camel problem trumps geometry.  The urban transportation system doesn’t have a geometry problem at 2AM, or indeed most of the day.  The geometry problem becomes a problem chiefly in peak hours.   Walker is almost certainly correct that geometry will guarantee that solutions like fleets of self-driving cars will never have the capacity to handle traffic loads–during peak hours. But the off-peak hours are a different situation.  It seems almost certain that operators of fleets of self-driving cars will use surge-pricing to manage demand (and reap profits) associated with peak hour travel.  The competitive challenge for transit is likely to be that fleets of self-driving cars will have abundant capacity during off-peak hours, and they will likely be tempted to offer discounted fares for vehicles that might otherwise be idle (and would probably also cross-subsidize the cost of these trips from profits earned at the peak).  As we reported earlier, the best current estimates suggest that self-driving vehicles may cost an average of 30 to 40 cents per mile to operate.  It seems likely that the price charged may be higher at the peak, but then discounted from that amount for off-peak hours.  That’s a price point that many transit operators would be hard pressed to match.
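The off-peak pricing logic can be made concrete with a toy comparison. Only the 30 to 40 cents per mile operating-cost range comes from the estimates cited above; the trip length, flat transit fare, and surge multiplier are assumptions invented for this sketch.

```python
# Toy off-peak price comparison.  Only the 30-40 cents/mile operating-cost
# range comes from the text; the other numbers are illustrative assumptions.

cost_per_mile = 0.35      # midpoint of the 30-40 cents/mile estimates
trip_miles = 5.0          # assumed trip length
transit_fare = 2.75       # assumed flat transit fare
surge = 3.0               # assumed peak-hour surge multiplier

off_peak_price = cost_per_mile * trip_miles   # near-cost pricing off-peak
peak_price = off_peak_price * surge

print(f"off-peak robocab fare: ${off_peak_price:.2f}")
print(f"peak robocab fare:     ${peak_price:.2f}")
print(f"flat transit fare:     ${transit_fare:.2f}")
```

Under these assumptions, the off-peak robocab undercuts a flat transit fare while the surge-priced peak fare does not, which is exactly the competitive squeeze on transit described above.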

It’s tempting to visualize alternatives to current transportation systems as a one-traveler or one-vehicle-at-a-time problem.  But the urban transportation problem is not so much about individual vehicles and trips as about the way trips accumulate in both space and time. The problem is a complex one, and will defy simple solutions.  Geometry–and camels–will be with us for the foreseeable future.

 

 

Does rent control work? Evidence from Berlin

As housing affordability becomes an increasingly challenging and widespread problem in many US cities, there are growing calls for the imposition of rent control.  While there’s broad agreement among economists that rent control is ineffective and even counterproductive, it still seems like a tempting and direct solution to the problem.  In Oregon, State House speaker Tina Kotek has made repealing the state’s ban on municipal rent control laws one of her major legislative objectives for 2017.  What happens when a big city imposes rent control?  A new study of rent control in Berlin offers some insight.

Nikolaiviertel, Berlin (Flickr: oh_berlin).

Housing affordability has been a big issue in Berlin for the past several years. The city’s economy has grown due to an influx of young adults and the creative class, and the resulting demand has pushed up rents in many neighborhoods. And unlike the US, most Germans are renters, rather than homeowners–in Berlin, 85 percent of households rent their dwellings. As a result, there’s been strong political support for rent control.  In June 2015, the city enacted its “Mietpreisbremse”– literally a brake on rents–setting a  cap on rent increases.  A new paper “Distributional price effects of rent controls in Berlin: When expectation meets reality” from Lorenz Thomschke of the University of Munster looks at what has happened in the first year since the law was enacted.

Berlin’s rent control ordinance is a complex one:  it’s not a freeze on rents per se, but rather a limit on rent increases on existing rental units.  The city has developed a complex formula based on an apartment’s age, size, number of floors and amenities, that prescribes a rent level and allows only modest increases over time. New construction is exempt from the rent control limits, and apartments that are substantially renovated are also free to charge higher rents as well.

The key finding of the Thomschke paper is that the initial enactment of the law has reduced rents in rent-controlled flats compared to those not included in the scheme, but the effects have been smaller than the law intended, and the benefits of rent control are not evenly distributed among different types of apartments.  He concludes that the biggest impact of the rent control law has been to lower the prices charged for the largest and most expensive apartments. So, paradoxically, while the law was aimed at easing affordability problems for low and moderate income households, the chief beneficiaries of the law to date have been upper income households.  Thomschke concludes:

The original goal of the reform – more affordable rental housing for low- and middle-income households – has therefore not been achieved after one year of the MPB.

While Berlin’s rent control scheme is novel and complex, and the findings of this study may not be directly applicable to policy proposals in the US, they are a stark reminder that while rent control is posed as a way of promoting affordability for low income households, in practice it may actually provide greater benefits for higher income renters. High income renters may be more savvy in dealing with landlords and exercising their rights, and less subject to the economic dislocations that force low income households to move from rent-controlled apartments. Over time, having acquired the “right” to live in a rent-controlled apartment, some better-off households may choose not to move or buy a home, with the result being a lower rate of turnover in apartments, further restricting the supply of housing.

We’ll continue to watch the Berlin experiment with rent control in the year ahead. It’s likely to have important lessons for those contemplating rent control on this side of the Atlantic.

 

 

The growth of global neighborhoods

As the US grows more diverse, so too do its urban neighborhoods. A new paper–“Global Neighborhoods: Beyond the Multiethnic Metropolis”–published in Demography by Wenquan Zhang and John Logan traces out the changes in the racial and ethnic composition of US neighborhoods over the past three decades. Their chief finding: more and more Americans live in multi-ethnic “global neighborhoods”—places that have a significant representation of whites, African-Americans, Latinos and Asians. Between 1980 and 2010, the number of global neighborhoods in the US nearly tripled, while the number of predominantly white neighborhoods fell by half.

Count of Census Tracts, by Neighborhood Category, 1980 and 2010

Neighborhood Category    1980     2010    Pct. Change
All White              16,254    8,145           -50%
All Minority            3,762    6,408            70%
Semi-Global             4,495    9,346           108%
Global                  3,858   10,378           169%

Source: Zhang & Logan, Tables 3-6.
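The percentage changes in the table can be recomputed directly from the 1980 and 2010 tract counts shown above:

```python
# Recompute the table's "Pct. Change" column from the raw tract counts.

counts = {                       # category: (1980 tracts, 2010 tracts)
    "All White":    (16254, 8145),
    "All Minority": (3762, 6408),
    "Semi-Global":  (4495, 9346),
    "Global":       (3858, 10378),
}

pct_change = {cat: round(100 * (t2010 - t1980) / t1980)
              for cat, (t1980, t2010) in counts.items()}

for cat, pct in pct_change.items():
    print(f"{cat}: {pct:+d}%")   # e.g. "All White: -50%"
```

The rounded results (-50, +70, +108, +169 percent) match the table.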

What exactly is a “global neighborhood”?

Zhang and Logan define a global neighborhood as one that has a presence of all four major ethnic groups (white, black, Hispanic and Asian). Presence is defined as having an ethnic group represented at a level equal to at least one-quarter of its level in US metropolitan areas as a whole. In 2010, metro areas were 42 percent white, so the threshold for white presence was 10.5 percent. Similarly, the 2010 thresholds were 4.5 percent for black presence, 7.4 percent for Hispanic presence and 2.4 percent for Asian presence.  The study includes data for 342 metropolitan areas, all with populations of at least 50,000 persons.
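As a rough sketch, the 2010 presence rule can be written as a small classifier. The thresholds are the ones quoted above; the function name and the simplified category labels are ours, not the paper’s (Zhang and Logan’s actual typology is more detailed).

```python
# Sketch of the 2010 "presence" rule.  Thresholds are the ones quoted in
# the text; the simplified category labels are illustrative, not the
# paper's full typology.

THRESHOLDS = {"white": 0.105, "black": 0.045,
              "hispanic": 0.074, "asian": 0.024}

def classify_tract(shares):
    """Classify a tract given each group's share of its population (0-1)."""
    present = {g for g, t in THRESHOLDS.items() if shares.get(g, 0.0) >= t}
    if present == set(THRESHOLDS):
        return "global"          # all four groups clear their thresholds
    if "white" not in present:
        return "all-minority"
    if present == {"white"}:
        return "all-white"
    return "semi-global"

# A tract that is 30% white, 20% black, 35% Hispanic and 5% Asian clears
# all four thresholds:
print(classify_tract({"white": 0.30, "black": 0.20,
                      "hispanic": 0.35, "asian": 0.05}))  # global
```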


Despite fears that gentrification causes formerly minority-dominated neighborhoods to become “all white,” Zhang and Logan’s data show that this almost never happens. Of 3,762 all-minority neighborhoods in 1980 (i.e., census tracts where whites made up fewer than 15.8 percent of the population), only two tracts changed to all white in the succeeding 30 years (i.e., a tract where fewer than 7.4 percent were Hispanic, fewer than 4.5 percent were black and fewer than 2.4 percent were Asian).

Some of the growing neighborhood-level diversity of US cities and suburbs is baked into the demographic cake. Because the share of the nation’s population that is white, non-Hispanic is decreasing and the share that is Asian, African-American or Latino is increasing, the typical American neighborhood is automatically becoming more diverse. One of the significant findings of Zhang and Logan’s work is that neighborhoods are becoming diverse even faster than the demographic shift alone would explain.

Invasion-succession versus buffers

One dominant thesis of racial neighborhood change has been the so-called “invasion-succession” model. Neighborhood change happens when a neighborhood dominated by one group (usually whites) is “invaded” by a different group (usually blacks), and because of white flight, the neighborhood changes from predominantly white to predominantly black. This model implies that neighborhood change happens, but that integration is an unstable and temporary state, and that ultimately, segregation gets recapitulated with new geographic boundaries.

While the “invasion-succession” model may have done a reasonably good job of explaining neighborhood change in the 1950s, 60s and 70s, it now appears to be far less common. Earlier at City Observatory we reviewed a study looking at neighborhood change which found that once integrated, neighborhoods tend to stay that way.

Zhang and Logan confirm this general trend, and what’s more, offer an interesting insight into how racial buffering is easing the transition to more diverse neighborhoods. They point out that predominantly white neighborhoods with relatively high shares of Latinos and Asians stay diverse, but also attract more black residents over time.  As they explain:

Hispanics and Asians provide an effective social cushion and/or spatial separation between blacks and whites in integrated communities. The buffer in some way absorbs tensions and fosters acceptance between groups, making it possible for blacks and whites to share a neighborhood despite racial barriers in the society at large.

One trend continues to reinforce traditional patterns of segregation: the very low probabilities that majority minority neighborhoods transition to becoming more integrated. Of the predominantly non-white neighborhoods in 1980, more than 80 percent were still non-white neighborhoods in 2010.

 

Your guide to the debate over the Trump Infrastructure Plan

There’s a lot of ink being spilled — or is it pixels rearranged? — over the size, shape, merits and even existence of a Trump Administration infrastructure plan. Infrastructure was one of just a handful of substantive policy talking points in the campaign, and the President-elect reiterated this one on election night.  It also appears that this might be one area where there is some interest on the part of Democrats in Congress in working with the incoming administration.

Chaotianmen Bridge, Chongqing (Flickr: Robert Cortright)

While there’s one campaign-vintage white paper that sketches out a way that private investors might get tax credits for investing in infrastructure projects, and also some rumbling about using the taxes on repatriated corporate profits now parked overseas to avoid US taxation, there isn’t actually a definitive infrastructure proposal. So while we will continue to track this issue as it develops, for now we’ll just give you a quick guide to what is being said, by whom, about what might be in the offing.

So far, the only outline of a Trump Administration infrastructure plan is a month-old campaign document. Peter Navarro and Wilbur Ross have produced a sketchy, 10-page white paper asserting that an 82 percent tax credit for private investment in infrastructure could attract about $167 billion in private equity, and leverage enough debt to support $1 trillion in infrastructure spending.  It goes on to assert that the cost to the federal government would be offset by the tax revenues paid by workers and businesses, making the plan revenue neutral. The “revenue neutral” claim is almost certainly wrong, because it assumes that none of these workers would be employed in the absence of the plan, and also that workers would pay a 28 percent tax rate on all of their earnings (28 percent is the marginal rate of taxation for many blue collar workers, but not the average tax rate).
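The arithmetic behind that criticism is easy to lay out. The $167 billion, 82 percent, $1 trillion and 28 percent figures come from the white paper as described above; the wage share and the alternative tax-rate and incremental-employment figures are illustrative assumptions of ours, which is precisely the point: the revenue-neutrality result depends entirely on them.

```python
# Back-of-envelope on the Navarro-Ross revenue-neutrality claim.  The first
# four figures are from the white paper as reported; wage_share and the
# alternative rates/fractions below are illustrative assumptions.

equity = 167e9                    # private equity the credit hopes to attract
credit_cost = 0.82 * equity       # cost of the 82% tax credit (~$137B)

spending = 1e12                   # total leveraged infrastructure spending
wage_share = 0.5                  # assumed share of spending paid as wages

# White paper's implicit assumptions: every job is new, wages taxed at 28%
offset_claimed = spending * wage_share * 0.28 * 1.0

# A more skeptical run: 15% average tax rate, 30% of jobs truly incremental
offset_skeptical = spending * wage_share * 0.15 * 0.3

print(f"credit cost:      ${credit_cost / 1e9:.0f}B")
print(f"claimed offset:   ${offset_claimed / 1e9:.0f}B")
print(f"skeptical offset: ${offset_skeptical / 1e9:.1f}B")
```

On the white paper’s own assumptions, the offset roughly covers the cost of the credit; relax either assumption and a large revenue loss opens up.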

The most prominent critic of this proposal to date is Paul Krugman, writing in The New York Times, in an article entitled “Infrastructure Build or Privatization Scam.”  Krugman sees the proposal as dubious and rife with prospects for cronyism. Why, he asks, is it necessary to involve private investors at all–especially when public borrowing costs are so low, and the projects in question are public assets? In addition, requiring projects to repay investors from a stream of revenues effectively excludes the most needed and highest-leverage investments–like maintenance and repair–and pushes investment toward things we don’t need, like more toll highways. Finally, Krugman points out that little of the investment is likely to be “additional”–i.e., projects that would not have been undertaken anyway–which obliterates the claim that the proposal will be revenue neutral.

Similarly, former Clinton campaign adviser Ronald Klain thinks the Trump infrastructure plan is a trap.  In a Washington Post op-ed, Klain warns that “There’s no requirement that the tax breaks be used for incremental or otherwise expanded construction efforts; they could all go just to fatten the pockets of investors in previously planned projects.” In Klain’s view, the tax breaks would produce few new jobs and little new investment, but they would worsen the deficit, and then be used as an excuse to cut other domestic spending.

Writing at Vox, Brad Plumer concludes “Donald Trump’s infrastructure plan wouldn’t actually fix America’s infrastructure problems.”  Plumer offers a good introduction to many of the practical and political pitfalls associated with trying to move to a private investment model for infrastructure. To date, there’s been little actual experience with public-private-partnerships (PPPs), and many have failed. The need to generate project revenues to repay investors skews investment decisions in favor of the kinds of infrastructure that are already relatively well-funded and systematically overlooks other investments that may have larger social or environmental benefits.

The University of Minnesota’s David Levinson is highly skeptical of the utility of public-private-partnerships as a vehicle for addressing our most pressing transportation needs.  In a post at his Transportist blog, Levinson argues that the fiscal analysis of the tax credit overlooks opportunity costs, that tax credits wouldn’t leverage the substantial pool of tax-exempt capital (pensions, overseas investors) that owes no US taxes and so gets no benefit from credits, that many socially valuable infrastructure investments don’t generate the revenue that would attract private investment, and that a sale-and-lease-back arrangement might be a better means of privatizing assets than a complex tax credit scheme.

Democratic congressional leaders are succumbing to highway propaganda, at least in the view of Angie Schmitt and Ben Fried, writing at Streetsblog.  The key problem with our transportation system, they argue, is not so much a lack of resources as a systematic bias toward projects that expand highway capacity while neglecting maintenance. The so-called report cards produced by the American Society of Civil Engineers, and others, are a misleading basis for setting investment policy.

Yonah Freemark echoes many of the concerns raised by Krugman, but also questions whether we need a massive increase in transportation spending, especially if it goes to create new road capacity.  Even if transit investments continue to get 20 percent of federal transportation funding (and that’s a big “if”), the effect would be strongly biased toward subsidizing additional driving; as Freemark writes, the plan would be far from “a reaffirming of the status quo; it would represent a dramatic incentive to get many more people driving.” Since tax credits are paid for via reduced general fund revenues, the finance scheme amounts to a subsidy for driving paid for by the general population.

Politico takes the temperature of key Republican and Democratic leaders on Capitol Hill. Committee chairs Bud Shuster and John Thune both voiced concerns about “how to pay for it.” It also looks like the Trump Administration may be backing away from the campaign white paper; one set of advisers is calling for tapping tax revenue from the repatriation of corporate profits now parked overseas, while another is floating the idea of an infrastructure bank–which was Hillary Clinton’s campaign proposal.  The infrastructure debate may end up producing strange political bedfellows, with many Democrats eager to see more transportation spending, and many Republicans leery of the prospect.

While Republicans in Congress are stressing their fiscal skepticism over infrastructure, conservative economists are nearly apoplectic.  George Mason University economist Tyler Cowen sees the Trump proposal as a slightly re-branded version of the Obama stimulus package. In an opinion piece at Bloomberg–“The Trouble with Trump’s Infrastructure Plan”–Cowen argues that the GDP gains associated with greater infrastructure investment will be illusory, because more government spending will crowd out productive private investment, such as “[writing] business plans, building client lists, developing marketing strategies, cultivating customer relations,” and that the additional debt to finance infrastructure today will necessarily lower GDP growth in future years.

In a piece pitched as a response to Paul Krugman, Tim Worstall takes a different tack.  Writing for Forbes in “Paul Krugman’s Terrible Misunderstanding of Trump’s Infrastructure Plans,” Worstall sings the praises of utility privatization, pointing out that in the UK, private water operators are more productive and efficient than public utilities.  There’s a robust debate about the merits of privatization, but it’s hard to see why, if privatization is economically sensible, it would need an 82 percent tax credit for equity investment to pencil out. As Worstall concedes, “such a plan could degenerate into a mad scramble to collect the taxpayers’ cash.”

Economist Brad DeLong, looking for ways to push the incoming Trump Administration in a politically neutral, technocratic direction, thinks that a major investment in infrastructure could be productive common ground.  But DeLong shares Krugman’s concerns that the current proposal of a tax credit for private investment would do essentially nothing either to stimulate the economy or to address the nation’s infrastructure challenges.

…at the moment it does indeed look like money for nothing: have the government pay for projects most of which would have been built by privates anyway, and then entrench monopoly pricing of what ought to be free public-good infrastructure for a generation: a zero on the short-term Keynesian boost to employment and production, a zero on the medium-term Wicksellian rebalancing to allow the normalization of interest rates, and a zero on boosting America’s long-term potential by filling some of the infrastructure gap.

And rather than just focus on roads, DeLong also has in mind broadening the definition of infrastructure:

The first natural place for positive technocratic policy to focus on is, therefore, in making the very strong case for a real and substantial infrastructure construction-led fiscal expansion–and making sure that people remember that investing in the human capital of twelve year olds is a very durable piece of infrastructure indeed. The math that shows that at current interest rates borrow-and-build is indeed a no-brainer for the economy is math that is correct, and math that ought to be very familiar to Donald Trump. (emphasis added).

We’ll continue to follow this issue closely at City Observatory.  Stay tuned.


Cities and Elections

It’s election day, 2016. Here’s some of what we know about cities and voting.

Well, at last. Today is election day. While we’re all eagerly awaiting the results of the vote, we thought we’d highlight a few things we know about voting, especially as they relate to cities. It’s food for thought as we get ready to digest and understand the results of today’s elections.

It’s time. (Flickr: Amanda Wood).

Democrats and density

In the past few elections, there’s been an increasingly strong relationship between population density and the share of the vote going to the Democratic candidate. Dave Troy has plotted county level election returns from 2012 against population density. Low density counties voted overwhelmingly for Mitt Romney; higher density ones voted for Barack Obama. That same pattern is likely to be in evidence today.

Density and Voting

As a result, as Emily Badger wrote in The New York Times last week, the Republican party has essentially abandoned cities in presidential elections.

Homeowners are voters

Regular readers of City Observatory are very familiar with the homevoter hypothesis propounded by William Fischel, which observes that homeowners participate actively in the formation of local policies as a way of protecting and enhancing the value of their homes. The practical implication is that homeowners support density restrictions and other policies that tend to raise home values and rents. In contrast, renters are generally under-represented in the electorate, especially in purely local elections. More data on that point was presented recently by the website ApartmentList.com. According to tabulations of self-reported Census data, about 77% of homeowners voted in the 2012 general election, compared to only 58% of renters – in other words, homeowners were roughly a third more likely to make their voices heard, a gap of 19 percentage points. Part of the difference is explained by length of tenure–homeowners have generally lived in their houses longer than renters–but homeowners are more likely to vote than renters for any given length of tenure. Homeowners who have lived in their homes for 1-2 years are more likely to vote than renters who have lived in their homes for more than five years.
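The turnout comparison is simple arithmetic; here is a quick check using the two rates quoted above.

```python
# Relative turnout of homeowners vs. renters (2012 rates quoted above).
owner_turnout = 0.77
renter_turnout = 0.58

ratio = owner_turnout / renter_turnout           # how much likelier owners vote
gap_points = (owner_turnout - renter_turnout) * 100

print(f"Homeowners were {ratio - 1:.0%} more likely to vote")  # -> 33%
print(f"Turnout gap: {gap_points:.0f} percentage points")      # -> 19
```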

The gerontocracy of local elections

A new study from Portland State University takes a close look at the demographics of voter turnout in local elections. (Full disclosure: its lead authors include our friends and colleagues Phil Keisling and Jason Jurjevich). Their final report, “Who votes for Mayor?” provides a detailed look at turnout patterns in 50 of the nation’s largest cities. A particular virtue of this study is that it uses data from election records–more than 22 million in all–rather than after-the-fact surveys, which can be subject to mis-reporting (respondents may be reluctant to tell pollsters that they didn’t vote). Among their key findings: older people are much more likely to vote than younger ones, especially in purely local elections. In cities, the variation in turnout by age heavily skews who chooses mayors and other local leaders. In the typical local election, the median voter is a full generation older than the overall electorate. As a result, at the municipal level, we have a gerontocracy, rather than a fully functioning democracy.

Moving and voting

Recent survey data for the current election, collected in September, zeroed in on an interesting aspect of voter preference: whether someone lived in or near the place where they were born. Summarizing the results of the Atlantic/PRRI survey, Daniel Cox and Robert Jones examined the presidential preference of white voters based on how close they lived to where they were born. Whites who reported living in their childhood hometown favored Trump 57 to 31 percent; those who lived outside their home town, but within two hours favored Trump 50 to 41 percent, and those who lived more than two hours away favored Clinton 46 to 40 percent. Of course, migration is a non-random and self-selected behavior, and is strongly correlated with education. But a key point here is that those whites who’ve chosen to move and live in different places are statistically more likely to favor Hillary Clinton than Donald Trump.


In this election—as in every election—many key urban issues, including age, education, migration, density and homeownership—play important roles in shaping electoral outcomes. In the next few days we’ll be examining the results of the 2016 election to see what role each of these factors has played.


Affordable Housing: Not just for a favored few

As we all know, 2016 is the year that reality television made its way to the national political stage. Less well noticed is how another idea from reality television has insinuated its way into our thinking about housing policy.

From 2006 to 2011, ABC television featured a popular reality television show called “Extreme Makeover: Home Edition.” In the show, a team of designers led by Ty Pennington worked with a small army of construction workers to completely rebuild one family’s usually tiny and dilapidated house in the space of about 48 hours. The lucky families were chosen based on audition tapes submitted to the show’s producers, showing that the family had suffered some loss, tragedy, or hardship.


In many respects, Extreme Makeover was “good” reality television; arguably it did at least tell compelling stories about the hardships and misfortunes that have struck many families, and provided a kind of telegenic barn-raising that made a tangible difference to the lives of those families. It didn’t degrade the participating families, and might have helped educate some viewers about the plight of others less fortunate than themselves.

But in many ways, we’re employing the Extreme Makeover approach to housing policy. A growing number of housing programs aim to address the housing needs of some specific, worthy group. While that’s well-intended, it may be a serious misstep. The key reason is that the problem of housing affordability is one of scale: what’s needed is not fixing affordability for a relative handful of people, but making the kind of system-level changes that fix the underlying problem of constrained supply.

Unfortunately, too much of what gets labeled as housing affordability policies amount to token efforts to help a few favored groups. We have housing subsidies for the poor, but they reach fewer than a quarter of the eligible households. Inclusionary zoning programs provide so few units relative to the potential need that subsidized housing is allocated by an arcane lottery system that is so difficult to navigate that it gives well-educated applicants a big edge.

Two recent studies have criticized the tendency to carve out set-asides for favored groups for eligibility for subsidized housing. Writing last week at The American Prospect, Rachel Cohen questioned a California proposal to dedicate a portion of that state’s low income housing tax credits to provide affordable housing for teachers. No doubt, housing is so expensive in many parts of the state that teachers can’t afford to buy the typical home. But as Cohen points out, few teachers fall below the 60 percent of area median income threshold used to target low income housing tax credits.

In Minnesota, Myron Orfield has excoriated the use of these same tax credits–as well as other funding dedicated to affordable housing–to provide subsidized housing for artists. In a white paper entitled “The Rise of White-Segregated Subsidized Housing,” he argues that artists’ housing is not only expensive and opulent, especially relative to other public housing, but that it serves mostly white populations, and actually serves to intensify patterns of racial segregation (unlike other public housing, artists’ housing tends to get built in disproportionately white neighborhoods).

A good case can be made, of course, that more communities should be affordable to artists and teachers. But an equally strong case can be made that communities ought to be affordable to everyone who earns as much as artists and teachers, and perhaps even to those who earn less. Trying to fix this problem for one deserving group at a time strikes us as a “solution” that will never approach the scale of the problem.

Ultimately, focusing our attention on the worthiness of various different groups–artists, teachers, veterans and seniors–distracts attention from the underlying problem that we simply aren’t providing enough housing, in the right places, in the face of changing demand. The unseemly competition between these different groups just amplifies the zero sum nature of the current approach, without leading to reform of zoning laws or redressing the inequitable distribution of housing subsidies by income. This is one of those cases where fixing a problem for a few may mean making the problem worse for everyone else.


Lies, damn lies, and (on-line shopping) statistics.

Here’s an eye-catching statistic: “people in the US buying more things online than in brick-and-mortar stores.” This appears in the lead of a story published this week by Next City.

There’s one problem with this claim: it’s not remotely close to true.

One of the things we pay taxes for is the Census Bureau, which gathers copious amounts of data about us and our economy. For a decade now, they’ve been tracking e-commerce sales. According to their latest report, e-commerce retail sales in the second quarter of 2016 were $97 billion, equal to about 8.1 percent of total retail sales in the U.S.
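Those two Census numbers also pin down the size of the whole retail market, a useful sanity check. The arithmetic here is ours; the inputs are the Census figures above.

```python
# Implied total retail sales from the Census e-commerce figures above.
ecommerce_q2_2016 = 97.0   # e-commerce sales, Q2 2016, $ billions
share = 0.081              # e-commerce share of total retail sales

total_retail = ecommerce_q2_2016 / share
print(f"Implied total retail: ${total_retail:,.0f}B for the quarter")  # -> ~$1,198B

# 8.1 percent is indeed less than a twelfth (1/12 is about 8.3 percent)
assert share < 1 / 12
```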


So how could anyone come up with the claim that we’re buying more on-line than in physical stores? Everyone knows the classic game of telephone, where a series of people are asked to repeat a simple phrase in succession. Between the first telling and the sixth or seventh repetition, some key words have changed, and often the meaning has become garbled or unintelligible. That’s what’s seemed to have happened here.

If you click through to the source cited by Next City, you are taken to a short write-up by Fortune magazine of a proprietary study conducted for UPS by comScore, a web marketing research firm. Their headline: “Consumers are now doing most of their shopping online.” The article leads with the claim “For the first time ever, shoppers are going to the web for most of their purchases.” While it cites comScore and UPS as its sources, Fortune doesn’t bother to provide a link to either company’s research (although it does show their stock tickers).

But that’s no problem, we can easily google “comScore, UPS, shopping, survey.” What we then find is comScore’s press release describing the study, and a link to the UPS website with a PDF of the white paper reporting the survey’s results. (Thanks comScore & UPS: That’s good form!)

The headline of the press release makes it clear that its authors are making a much more limited claim: “UPS Study: Avid Online Shoppers Making More Than Half of Their Purchases With E-commerce.” The key word here is “avid.” When you read the UPS study, conducted by comScore, you discover exactly what they mean by avid online shoppers. First, the survey sample is drawn from comScore’s proprietary panel, which is composed of heavy internet users. Second, UPS and comScore restricted survey participation to those panel members who shopped online at least two times in the previous three months. Third, UPS and comScore weighted the sample of participants so that at least 40 percent of participants were those who had shopped online at least seven times in the past three months.  So, right off the bat, you have to recognize that this survey covers only tech-savvy people, has been skewed to exclude people who never or rarely shop online, and is weighted heavily toward respondents who do a lot of online shopping.

What’s more, the white paper is impressively vague about the scope of what’s being measured. Is it the dollar value of purchases? Is it the number of different products purchased? Is it only goods, or large goods, or durable goods? Unfortunately the white paper doesn’t include the survey instrument or basic tabulations of the answers to its questions, so there’s no way to tell for sure. It’s pretty clear from the text that the survey excluded grocery purchases, and it’s highly likely that it also excluded purchases of things like gasoline and cars. There’s no breakdown by product category (i.e. apparel, books, electronics, etc.), so it’s just impossible to tell what “a majority of purchases” means, even for this very select group.

There’s nothing wrong with that, if you’re trying to tell merchants about the behaviors of heavy on-line shoppers. But this kind of survey tells you nothing about whether this select sample is typical of all Americans, and there’s no way you can make a claim about whether this reflects what a majority of Americans are or aren’t doing. And, on top of this, we have clear evidence from the Census Bureau that on-line shopping accounts for less than a twelfth of all retail sales.

The growth of online shopping is likely to have major impacts on travel behavior, on urban form, and on employment. There’s no question that online shopping is growing (Census data show it growing nearly 16 percent annually, compared to about 2.3 percent for all retail sales). As we wrote last year, the growth of e-commerce could have a major positive effect on urban congestion, as it eliminates consumer shopping trips (typically by single occupancy vehicle) and improves the efficiency of delivery services (by increasing the number of deliveries per mile traveled). So it’s really important that we have a good understanding of what’s changing and where (and we’ll have more to say about that in the weeks ahead at City Observatory).
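Given those two growth rates, it’s straightforward to project how long online sales would take to become a majority of all retail. This is a compound-growth sketch that assumes, unrealistically, that both rates hold constant indefinitely:

```python
import math

# Years until online sales reach half of all retail, if the online share
# grows from 8.1 percent at the relative rate implied by 16 percent vs.
# 2.3 percent annual growth (the rates quoted above, held constant).
share_now = 0.081
online_growth = 0.16
total_growth = 0.023

# share(t) = share_now * ((1 + online_growth) / (1 + total_growth)) ** t
relative_growth = (1 + online_growth) / (1 + total_growth)
years = math.log(0.5 / share_now) / math.log(relative_growth)
print(f"Online reaches half of retail in about {years:.1f} years")  # -> ~14.5
```

Even under these generous assumptions, an online majority is roughly a decade and a half away, which underscores how implausible the “more than half” headline is today.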

In a simpler time, Paul Krugman once wrote that one ought not to publish statements that can be disproven in five minutes with a pocket calculator and a copy of the Statistical Abstract of the United States. Anybody who publishes a sweeping statistical claim like this one ought to have the ability to click through to the original source or just google a few statistics to verify they’ve got it right. But sadly, that’s not the standard here.


Cities and the price of parking

What the price of parking shows us about urban transportation 

Yesterday, we rolled out our parking price index, showing the variation in parking prices among large US cities.  Gleaning data from ParkMe, a web-based directory of parking lots and rates, we showed how much it costs to park on a monthly basis in different cities. There’s a surprising degree of variation:  while the typical rate is somewhere in the range of $200 a month, in some cities (New York) parking costs more than $700 a month, while in others (Oklahoma City) it’s less than $30 a month.

As Donald Shoup has exhaustively explained in his tome, The High Cost of Free Parking, parking has a tremendous impact on urban form. And while Shoup’s work focuses chiefly on the side effects of parking requirements and under-priced street parking, we’re going to use our index of parking prices to explore how market-provided parking relates to the urban transportation system.

In the United States, the majority of commuters travel alone by private automobile to their place of work.  But in some places–in large cities, and in dense downtowns–more people travel by transit, bicycle or walk to work.  It’s worth asking why more people don’t drive:  after all, the cost of car ownership is essentially the same everywhere in the US.  The short answer is that in cities, parking isn’t free. And when parking isn’t free, more people take transit or other modes of transportation.

To see just how strong an explanation that parking prices provide for transit use, we’ve plotted the number of transit trips per capita in each of the largest metropolitan areas against the typical price of a month of parking in the city center.  Each data point represents a single metropolitan area.  There’s a very strong positive correlation between transit rides per capita and parking rates.  Cities with higher parking rates have more transit rides per capita than cities with lower parking rates.  The statistical correlation between the two measures is extremely strong:  the coefficient of determination (R2) is .83, suggesting that parking rates statistically explain 83 percent of the variation in transit use among cities.
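For readers curious how a figure like that R² of .83 is derived: it comes from an ordinary least-squares fit of transit trips on parking prices. Here is a minimal pure-Python sketch using hypothetical stand-in data (not the actual City Observatory dataset):

```python
# How an R-squared like the one above is computed: fit a least-squares
# line of transit trips on parking prices, then measure the share of
# variance the line explains. Data points are hypothetical stand-ins.
prices = [30, 100, 200, 400, 700]   # monthly parking price ($)
trips  = [5, 40, 30, 120, 150]      # annual transit trips per capita

n = len(prices)
mean_x = sum(prices) / n
mean_y = sum(trips) / n

# Ordinary least squares slope and intercept
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(prices, trips))
sxx = sum((x - mean_x) ** 2 for x in prices)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# R-squared = 1 - (residual sum of squares / total sum of squares)
ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(prices, trips))
ss_tot = sum((y - mean_y) ** 2 for y in trips)
r_squared = 1 - ss_res / ss_tot
print(f"R-squared = {r_squared:.2f}")
```

With real data the same calculation would normally be done with a statistics package, but the definition is just explained variance divided by total variance.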

It’s worth noting that this relationship is based on extremely coarse data about both parking prices and transit use.  We’ve measured transit use for entire metropolitan areas (including dense centers and distant suburbs) and looked only at parking rates in and around the city hall of the largest city in each metropolitan area.  A more nuanced examination of parking rates and transit ridership (one, for example, that looked at parking rates and transit use in particular neighborhoods), might show an even stronger relationship.

What this points out is that private car commuting is extremely sensitive to the price commuters must pay.  For most commutes, drivers don’t have to pay for parking–their employers provide (often by regulatory fiat) “free” parking.  When confronted with paying the cost of parking (and the average is about $6 per day at a monthly rate), many more people choose to travel by other modes of transportation.

This suggests that there is much more opportunity to influence travel behavior by pricing than we commonly appreciate. In effect, what pricing of parking in some metropolitan areas is doing is correcting for the market failure of not pricing roads.

As we’ve frequently noted at City Observatory, we don’t directly charge for road use in the United States.  Motorists pay some road use fees, based almost entirely on fuel consumption (which, incidentally, don’t come close to covering the cost of the road system).  Importantly, the way we charge for roads through fuel taxes bears no relationship to the roads motorists actually use, or the time that they use them.  And, as a practical matter, the cost and capacity of the road system are largely shaped by peak hour travel in urban places.

It’s worth asking why private sector firms build and operate parking lots in some locations (and not others) and why car owners pay much higher rates to park in some cities than others.  An essential fact of private car travel is that it requires that owners have a place to store their vehicle at their origin and at their destination. In urban centers, there’s more demand for travel–and parking spaces–than can be met, with the effect that the price of parking is higher than elsewhere. In effect, private parking lots capture the value associated with peak period car travel to dense urban destinations. Because we don’t charge for the use of the roads during the peak hour, private lots are able to capture some of the economic rents associated with access to the urban center at the peak hour.

The high value that people attach to access to urban centers is attracting a disruptive new entrant to the urban transportation market:  ride-hailing services like Uber and Lyft.  Our data show that the growth of these services–as proxied by the Brookings Institution’s recent estimates of the growth of non-employer transportation service providers–is also closely correlated with high parking rates.  In the following chart, we show the correlation between city parking rates (on the vertical axis) and the number of transportation service non-employers per 100,000 metropolitan population.  As with transit trips, there’s a strong, positive correlation.  The coefficient of determination is .68, implying that parking prices statistically explain about 68 percent of the variation in the penetration of ride hailing services among metropolitan areas.

This makes perfect sense: the richest market for ride-hailing is going to be in those places where it is most inconvenient and expensive to park a car.  Ride-hailing is highly attractive if the alternative is to drive your own car and have to hunt for, drive to, and then pay for parking. Conversely, if parking is free and abundant at your destination, there’s much less incentive to use Uber or Lyft, particularly if you already own a private vehicle.

The strong relationship between parking prices and transit use, and between parking prices and the uptake of ride-hailing, has important implications for the future of urban transportation. First and foremost, it serves as a reminder that prices offer powerful incentives that shape travel behavior. Transit is most heavily patronized in those cities where motorists have to pay relatively high prices for parking, and least used where parking is free. Second, it suggests that the most lucrative markets for ride-hailing services will be in relatively dense places (with lots of potential customers) where parking is expensive or scarce (making ride-hailing more attractive). We would expect low density suburbs and rural areas to be the least attractive markets for ride-hailing services. Third, the price of parking currently operates as a kind of surrogate or shadow-price for roads in dense central cities. Fewer people drive, and more people take transit and other modes, because of the high cost of parking. But as ride-hailing services expand, the constraint that high parking prices impose on demand for car travel in central cities will disappear, with the effect that there will likely be much more demand for on-street travel. While city streets are un-priced, peak hour travel by Uber and Lyft is not; in fact, both operators utilize surge pricing. As a result, it seems likely that with the growth of ride-hailing, particularly services that use surge pricing, transportation providers will capture some of the economic rents associated with peak period congestion. Profits for this sector are built in part on capturing the scarcity value of urban streets, which are un-priced or under-priced for both vehicle movement and storage.

The price of parking is an underappreciated aspect of the urban transportation system. As we wrestle with the disruptions from ride-hailing services, and perhaps soon, autonomous vehicles, what happens to parking prices could have major impacts on our cities.


The new mythology of rich cities and poor suburbs

There’s a new narrative going around about place. Like so many narratives, it’s based on a perceptible grain of truth, but then has a degree of exaggeration that the evidence can’t support.

Portland’s Pearl District (Flickr: Can Eldem)

Cities, we are told, are becoming playgrounds of the rich. Last week, Quartz headlined Richard Florida’s recent talk about the future of cities as A world famous urbanist says New York becoming “gated suburb.” Florida is more nuanced in his talk–highlighting a handful of neighborhoods where rich families are living in large apartments (with garages!) on the city’s Upper East Side. But the nuance gets lost in the journalistic retelling, which focuses heavily on Florida’s warning that urban revival has a “dark side” that is creating “winner take all” cities.

It’s certainly true that we’ve witnessed a considerable rebound in the condition of America’s cities. After decades of decentralization and urban decline, things have started turning around. Population in a few cities began growing in the 1990s, and after 2000 even more cities moved forward.  In 2013, Brookings Demographer Bill Frey noted that cities had grown faster than their surrounding suburbs for two successive years and that this might constitute a “big city growth revival.”

The corollary of this narrative of city revitalization is that the suburbs are becoming poorer. The most definitive statement of this claim came in the Brookings Institution’s 2010 report, The Suburbanization of Poverty, which concluded: “By 2008, suburbs were home to the largest and fastest-growing poor population in the country.” The Brookings report showed that more poor people now lived in suburbs than in central cities and that suburban poverty was growing faster.

Statistically, both those statements are completely true. But let’s spend a minute unpacking that analysis. First, it helps to know how Brookings defines “city” and “suburb.” It defines city as the largest municipality in a metropolitan area, and “suburb” as virtually everything else. It turns out that by that definition roughly 70 percent of the metro population now lives in suburbs. So it’s hardly surprising that a majority of the poor no longer live in central cities.

The second part of the claim is that poverty is growing “faster” in suburbs. While that’s true, it’s from a far smaller base. But despite faster growth, poverty rates—the share of people living in suburbs who are below the poverty line—are far lower than they are in cities.

What’s more, dividing all urban space into just two categories (city and suburb) and reporting totals for each makes it seem like poverty is somehow increasing and evenly spread in every suburb. But that’s not true. Many of the poor suburbs are older, first-tier towns just outside the larger central city. For example, Camden and Hoboken, New Jersey; East Hartford, Connecticut; and Fall River, Massachusetts–all struggling older cities–are technically classified as “suburbs” in the Brookings typology.

Alan Ehrenhalt made these kinds of claims the theme of his book, “The Great Inversion.” In it he argued that the half-century-long pattern of wealthy people living in the suburbs and the poor being concentrated in the central city was inverting.

As a description of the direction of change, these stories are right: many city neighborhoods are attracting better-educated and higher-income residents. And some suburbs, usually older, blue-collar ones, are seeing a growing number of families living in poverty.

Wildly overstating the trend

But the narrative of “rich cities, poor suburbs” represents a vast overstatement of the scale of these changes.

The magnitude of these changes hasn’t yet come close to fundamentally altering the pattern of income and urbanization in the US. It is still the case that the poor are disproportionately found in or near the city center, and the wealthy live in the suburbs.

Rather than using a crude binary classification of cities and suburbs, with cleavages drawn at arbitrary political boundaries, it is much more illuminating to look at the exact pattern of correlations between neighborhood income and distance to the central business district. The University of Virginia’s Luke Juday has mapped these data to illustrate the relationship between centrality (distance to the center of the central business district) and income. (It is available on the web at http://statchatva.org/changing-shape-of-american-cities/.) Helpfully, Juday has made this calculation for both 1990 and 2012 using Census data.

Here’s what the poverty gradient looks like for the average of the 50 largest U.S. metropolitan areas. The vertical axis shows the poverty rate (higher is poorer) and the horizontal axis shows the distance in miles between each neighborhood and the center of the CBD. (Values near zero are neighborhoods in and near downtown; higher values represent the more distant suburbs.)

Data for the 50 largest US metro areas, 1990 (orange), 2012 (brown)

Here’s what these data show. First and foremost, poverty rates are highest in the center and lowest toward the suburban fringe. The farther you get from the center, the lower the poverty rate. So despite talk of “gated cities” and “poor suburbs,” the so-called great inversion simply hasn’t happened yet.  If you look closely at the difference between the poverty line for 1990 (orange) and 2012 (brown), you’ll see there is a difference. Poverty rates are now somewhat lower in the center than before, and somewhat higher in the suburbs than before.

It will be many decades before city poverty declines to suburban levels

It’s useful to do the mental exercise of estimating how long, at current rates of change, it would take for poverty rates to equalize across the metropolitan landscape (i.e., for the line to go from sloping downward left-to-right to nearly flat, meaning equal rates of poverty everywhere). And note: this would not be an imagined Paris-style inversion, where the rich live in the center and the poor in the suburbs, but simply an equal level of poverty across the metro region.

In the 22 years between 1990 and 2012, the central city poverty rate (1 mile from the center of the CBD) declined from 26 percent to 25 percent, while the poverty rate 20 miles away in the suburbs increased from 7 percent to 9 percent. The gap between center and periphery thus narrowed from 19 percentage points to 16, closing by 3 percentage points over 22 years. At that pace, roughly 1.4 percentage points per decade, it would take more than a century for poverty levels to equalize between the urban core and twenty miles out.
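The back-of-the-envelope extrapolation above can be checked with a few lines of arithmetic. The rates are the averages cited from Juday’s data; the straight-line-trend assumption is ours, a simplification rather than a forecast:

```python
# Linear extrapolation of the center-suburb poverty gap, using the
# figures cited in the text. Assumes the 1990-2012 trend continues
# unchanged, which is an illustrative simplification.
center_1990, center_2012 = 26.0, 25.0   # poverty rate (%), 1 mile from CBD
suburb_1990, suburb_2012 = 7.0, 9.0     # poverty rate (%), 20 miles out

gap_1990 = center_1990 - suburb_1990    # 19 percentage points
gap_2012 = center_2012 - suburb_2012    # 16 percentage points

rate_per_year = (gap_1990 - gap_2012) / (2012 - 1990)   # ~0.14 points/year
years_to_close = gap_2012 / rate_per_year               # ~117 years

print(f"Gap closes at {rate_per_year * 10:.2f} points per decade")
print(f"Equalization around the year {2012 + round(years_to_close)}")
```

Run as written, this puts equalization around the year 2129, more than a century out, consistent with the point in the text.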

Unless there’s a tremendous acceleration of this rate of change, an actual inversion is several decades away. And the idea that these lines would invert, i.e., slope upward to the right, meaning that poverty was higher in the suburbs than in the center, shows almost no sign of happening.

To be sure, a few neighborhoods have seen dramatic change, but to conclude that entire cities are becoming “gated suburbs” is a wild exaggeration of what is, so far, a very modest trend that has slightly ameliorated the centralized pattern of poverty in the US. The overall pattern of poverty in New York has hardly changed since 1990; the highest poverty rates are found between 5 and 10 miles from the city center, and are more than triple the poverty rates in suburbs more than 15 miles away.

While the high profile gentrification of some urban neighborhoods attracts widespread media attention, the real story of neighborhood change in the United States is the persistence and spread of concentrated poverty. Over the last four decades, only about one in twenty urban high poverty neighborhoods rebounded—meaning that they went from a poverty rate of more than 30 percent in 1970 to less than 15 percent in 2010. (Fifteen percent is roughly the national average poverty rate). In that same time, the number of high poverty neighborhoods tripled, and the number of poor people living in them doubled. And these neighborhoods of concentrated poverty—which Raj Chetty’s work shows are toxic to the children growing up there—are disproportionately concentrated in central cities.

How can we harness this change to tackle real problems?

Rather than raise the alarm about what is in the vast majority of cities a very slow-moving non-crisis, our energy might be better spent thinking about how we might leverage the growing interest in urban living into a force that will undo the pattern of income segregation that has characterized the last half century or so of suburbanization — what Robert Reich called the secession of the successful.

Cities have been plagued for decades by desertion and disinvestment. The middle income families that could provide the fiscal and civic support for a vital city have been exiting. Now that some younger people are starting to come back to invest in city neighborhoods, commit to city schools, and exercise citizenship, there’s a huge opportunity to leverage this momentum to address the city’s poverty and segregation problems.

There are some practical policy steps that every city could take to make sure that the benefits of revitalization are widely shared. For one thing, revitalization means new jobs in or near places that have long been said to suffer from a “spatial mismatch.” Training and placing local residents for jobs in everything from construction (building and rehabilitating housing) to working in the growing retail and service businesses that are expanding in cities would directly address economic needs. The good news, as we’ve shown at City Observatory, is that the decentralization of job growth that has proceeded for decades is at an end, and jobs are moving back into cities.

Cities can also tap the added investment associated with revitalization to create more affordable housing in revitalizing neighborhoods. That’s exactly what Portland has done—dedicating about a third of the tax increment revenues from new housing development to pay for subsidized housing in urban renewal areas. The city’s tony new Pearl District is home to galleries, restaurants, theaters, high end condos and new market rate apartments. It also has more than 2,300 units of affordable housing, subsidized by tax increment financing. The result, far from being a “gated suburb,” is a lively, walkable, mixed-income neighborhood with more economic diversity in a small area than any other part of the region.

There’s no reason why cities can’t use the economic momentum created by the interest in city living to build more affordable housing in revitalizing neighborhoods and create the kind of just, inclusive communities that we all seem to think would be a good idea. If we view economic integration as an important objective, the trends we see in cities ought to be regarded as shining examples of opportunity, rather than an inevitable “dark side” of the urban renaissance.

The apocalyptic exaggeration of nascent trends generates headlines but it’s a poor basis for making sensible policy. For too long, urban policy in the United States consisted of little more than triage and managed decline. If we’re really optimistic about cities, then we ought to be focusing our attention on constructive ways to manage this historic opportunity.

The most interesting neighborhood in the world

Where are the most interesting streetscapes and popular destinations in your city? Even among your friends and colleagues, there might be some lively disagreement about that question. But recently, search giant Google weighed in on this question when it overhauled Google Maps this summer. Now it has a new feature, a creamsicle-orange shading in certain city neighborhoods, that it calls “areas of interest.” But what makes a neighborhood interesting? And do Google’s new peachy orange blobs correspond to anyone’s idea of what constitutes interesting?

Reputedly he is, or at least once was, the most interesting man in the world. Opinions differ.

The addition was part of a graphic facelift for Google Maps, which was generally applauded in the design community.  The new maps are a bit lighter, more prominently include neighborhood names, and highlight notable landmarks. Freeways and major arterials, parks, and the new peachy areas of interest are the outstanding features on these maps.

google_areas_of_interest
But not everyone was enamored of the new orange blotches. Writing at CityLab, Laura Bliss detected a bias. Could it be, she asked, that Google was only interested in areas with certain income levels, ethnic compositions, and levels of internet access? Examining data for selected neighborhoods in Washington, Los Angeles, and Boston, she argued that low-income neighborhoods of color tended to be less likely to get Google’s peachy designation.

For example, while Westlake, a neighborhood toward the east side of Los Angeles, is a dense, relatively low-income, predominantly Latino area with many restaurants, businesses, and schools, only a few lots are highlighted in orange. In contrast, the mostly residential, mostly white neighborhood of Sawtelle, on the wealthier west side of Los Angeles, includes Wilshire and Santa Monica boulevards and a wide residential area, but “nearly the entire area is shaded orange, for no clear reason.”

It’s a fair point to suggest that not everyone will find the same set of destinations “interesting,” and it’s likely, given capitalism, demographics and math, that any algorithm-based means of identifying interesting areas will tend to select places that appeal to the mass market and the majority, and may leave out or fail to detect places that appeal to subgroups of the population. And the fact that Google, while acknowledging that the presence of commercial activity influences its scoring, has been mostly vague about how it identifies areas of interest only adds to the concern.

Earlier this year, at City Observatory, we set about tackling a similar question, using data on the location of customer-facing retail and service businesses to create a Storefront Index. Essentially, we used a business directory database to map the locations of millions of retail and service businesses, in the process identifying places with strong clusters of these businesses that form the nuclei of walkable areas. The special sauce in the index is a nearest-neighbor algorithm: we map a business only if it is within 100 meters of another storefront business.
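A minimal sketch of that filtering rule, assuming simple latitude/longitude points. The function names and coordinates here are illustrative only; City Observatory’s actual implementation may differ:

```python
# Sketch of the Storefront Index nearest-neighbor filter: keep a business
# only if another storefront lies within 100 meters of it.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))  # Earth radius ~6,371 km

def storefront_filter(businesses, threshold_m=100):
    """Return only businesses with at least one neighbor within threshold_m.

    O(n^2) for clarity; a real implementation would use a spatial index."""
    kept = []
    for i, (lat1, lon1) in enumerate(businesses):
        for j, (lat2, lon2) in enumerate(businesses):
            if i != j and haversine_m(lat1, lon1, lat2, lon2) <= threshold_m:
                kept.append((lat1, lon1))
                break
    return kept

# Two nearby shops (~78 m apart) survive; an isolated one is dropped.
shops = [(45.5231, -122.6765), (45.5238, -122.6765), (45.6000, -122.6000)]
print(storefront_filter(shops))
```

The effect is to erase isolated businesses from the map, so that only contiguous storefront clusters, the nuclei of walkable districts, remain visible.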

Because our algorithm is transparent (you can see each dot on the map) and because we’ve made our methodology public (details here), we thought it would be interesting to compare the Storefront Index clusters with Google’s Areas of Interest.  And in the process, perhaps we can marshal some evidence that will bear on Laura Bliss’s concern that there’s some latent bias hiding in the Google approach.

We’ve overlaid our storefront index map on Google Maps, so you can see how closely the two concepts align.  We haven’t undertaken any kind of statistical analysis, but a casual visual inspection shows that most areas of interest do in fact have high concentrations of storefronts.  Our City Observatory colleague, Dillon Mahmoudi has mapped storefronts in the 50 largest US metropolitan areas, and you can use this map to see how storefront clusters correspond to areas of interest. Here’s downtown Los Angeles.  (Click on the image to visit our interactive web page, with maps of other metropolitan areas).

Storefront Index clusters overlaid on Google’s areas of interest, downtown Los Angeles

Let’s take a closer look at a couple of the neighborhoods that Laura Bliss felt were slighted by Google Maps. The first row shows two neighborhoods in Los Angeles, the second row two neighborhoods in Boston. The neighborhoods on the left had very few and small areas of interest according to Google (and were perhaps under-appreciated, according to Bliss); the ones on the right have relatively large shaded areas of interest. The dots on each map correspond to our measure of storefronts: clusters of customer-facing retail and service businesses. Both of the “slighted” neighborhoods do have some clusters of storefront businesses (though their numbers are smaller, and their concentrations less dense, than in the corresponding “favored” neighborhoods in the right-hand column). While we’ve come up well short of reverse-engineering Google’s algorithm, these data do suggest that storefronts are a key driver of areas of interest.

Slighted? / Peachy?
Westlake / Sawtelle (Los Angeles)
Ashmont / Newbury (Boston)

It’s a fair question to ask whose preferences are reflected in any description of an “area of interest.” Given the diversity of the population and the heterogeneity of tastes and interests, what is interesting to some people will be banal or off-putting to others. Or maybe it’s a semantic problem: by describing some areas as “interesting,” Google may be implicitly characterizing other areas as “uninteresting.” Many of these concerns could be assuaged, we think, if Google were a little more transparent about its basis for designating these areas, and if it called them by a different, more narrowly descriptive name, like “most searched” or “most popular.”

Ultimately, the solution to the problem Laura Bliss has identified may be democratization and competition. The more widely data (everything geolocated on the web, including Google maps and listings, tweets, user reviews, and traffic data) are available to end users, and the more different people craft their own maps, the better we may be able to create images that reflect the diversity of interests of map users.

 

The price of parking

How much does it cost to park a car in different cities around the nation?

Today, we’re presenting some new data on a surprisingly under-measured aspect of cities and the cost of living: how much it costs to park a car in different cities. There are regular comparisons of rents and housing costs between cities. The Bureau of Economic Analysis reports on regional price variations among states. But the price of parking falls into a kind of unlit corner of the statistical world.

Parking is central to the operation of our automobile dominated transportation system. There are more than 260 million cars and trucks in the United States, and most cars sit parked about 95 percent of the time.

parking_reflexblue
It isn’t free, in any sense of the word. (Flickr: reflexblue)

While we have copious data about cars—the number registered, the number of gallons of gasoline they burn (over 140 billion), the number of miles they travel (over 3 trillion)—we actually know precious little about the scale of the nation’s parking system. The best estimates suggest that there are somewhere between 722 million and more than 2 billion parking spaces in the United States.

The cost of constructing all of this parking is considerable. Surface parking spaces cost about $5,000 to $10,000 each to construct (including the value of the land they occupy). Structured parking costs between $25,000 and $50,000 per space. And while these spaces are expensive to build, their actual users are seldom charged a price for using them.

The most common places where parking is priced in the marketplace are in city centers. Private owners of parking lots and structures charge hourly, daily and monthly rates to their users.  Cities collect revenue from metered on-street parking spaces.

But because all of these markets are intensely local, there’s effectively no national data on parking prices. In the absence of government statistics on parking prices, we turned to the next logical place: the Internet.

ParkMe is a web-based service that provides users with directions to parking structures and lots. And, importantly for our purposes, it gathers and reports data on the price of parking in cities around the country. Here’s a snapshot of a typical search using ParkMe. We’ve used it to look for monthly parking available in downtown Seattle. The site shows the least expensive monthly parking rates within easy walking distance of the Seattle public library. Expect to pay around $200 a month for a parking space in this part of Seattle.

parkme_seattle

We used the website to search for parking in each of the nation’s largest cities. For purposes of constructing reasonable comparisons among cities, we looked for parking lots and structures near the City Hall of the largest city in each of the 50 largest metropolitan areas. We identified the five listed parking lots closest to City Hall and recorded the monthly price of parking for each lot. Then we took the average of the five prices. The results are shown below:

 

 

There’s a huge variation among cities in the price of parking. In the largest, densest cities, parking is the most expensive. New York tops the list: parking near City Hall costs a whopping $770 per month. But in many other cities, the monthly cost of parking is much less. In downtown Oklahoma City, parking costs only about $25 per month, or about a dollar a day. For the forty-six cities for which we were able to obtain data (ParkMe didn’t have data for all of the 50 largest metros), the median price of monthly parking in the city center was about $120. If you’re commuting to work 20 or so days a month, that works out to a daily cost of about $6.
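The comparison method described above, averaging the five listed lots nearest City Hall, can be sketched in a few lines. The distances and prices below are invented placeholders for illustration, not actual ParkMe listings:

```python
# Average monthly price of the n lots closest to City Hall.
# The (distance, price) pairs are hypothetical example data.
def average_nearest(lots, n=5):
    """lots: list of (distance_in_meters, monthly_price_dollars) tuples."""
    nearest = sorted(lots)[:n]          # sort by distance, keep the n closest
    return sum(price for _, price in nearest) / len(nearest)

lots = [(120, 210), (200, 180), (260, 250), (300, 195), (410, 230), (900, 90)]
print(average_nearest(lots))            # the distant cheap lot is excluded
```

Averaging several nearby lots, rather than quoting a single one, smooths out the lot-to-lot price variation that is common even within a few blocks of downtown.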

It’s interesting to look at the geographic patterns of variation in parking prices. The highest prices are in cities in the Northeast and on the West Coast. In the heartland and in the South, parking prices are generally much lower.

To our knowledge this is the first comprehensive comparison of inter-city differences in parking prices. Even Donald Shoup’s magisterial and encyclopedic treatment of the issue, The High Cost of Free Parking, doesn’t report such a list. The price of parking has a lot to do with travel behavior, and with urban form. In tomorrow’s commentary, we’ll present some initial findings on the relationship of parking prices to other aspects of the urban environment.

 

Memo to Stockholm

Next Monday, very early, before anyone in North America is out of bed, the Royal Swedish Academy of Sciences will announce the name of the 2016 Nobel Laureate in economic sciences. No doubt the decision has already long since been made by the prize committee. But if they’re still undecided, we have a suggestion.

Paul Romer at TED University. TED2011. February 28 – March 4, Long Beach, CA. Credit: James Duncan Davidson / TED

It’s hard to find a field more staid and often impenetrable than economics. Just explaining the contributions of a Nobel laureate in the field is a taxing task for journalists, and seldom of huge interest to the general public. Plus, in economics at least, the Nobel seems a bit like the profession’s equivalent of Hollywood’s Irving G. Thalberg Memorial Award, something that recognizes lifetime achievement and is handed out to venerable scholars, decades after their most productive years. Since the prize isn’t awarded posthumously, there’s always an eye on the older members of the profession. There are a variety of predictions of who might win the prize. Thomson Reuters uses an analysis of academic citations; its favorite is macroeconomist Olivier Blanchard.

In our view, the academy might want to closely consider giving the award to Paul Romer, recently appointed to be the chief economist for the World Bank, for two reasons.

First, in a series of papers published a couple of decades ago, Romer was responsible for some of the key breakthroughs in what is called “New Growth Theory,” which re-writes the mechanics of long-term economic growth in a fundamental and optimistic way. We described the key insights from these theories a couple of months ago at City Observatory. Romer has long been short-listed for the prize on account of this work, awaiting, it seems, only sufficient quantities of gray hair to take his turn.

Second, in the past few weeks, Romer has turned the economic world on its head with a scathing critique of deep flaws in the past two decades of macroeconomic theorizing. In a paper entitled, “The Trouble with Macroeconomics,” Romer indicts the state of macroeconomics, and its growing detachment from the real world. The abstract of this paper reads as follows:

For more than three decades, macroeconomics has gone backwards. The treatment of identification now is no more credible than in the early 1970s but escapes challenge because it is so much more opaque. Macroeconomic theorists dismiss mere facts by feigning an obtuse ignorance about such simple assertions as “tight monetary policy can cause a recession.” Their models attribute fluctuations in aggregate variables to imaginary causal forces that are not influenced by the action that any person takes. A parallel with string theory from physics hints at a general failure mode of science that is triggered when respect for highly regarded leaders evolves into a deference to authority that displaces objective fact from its position as the ultimate determinant of scientific truth.

You may never read economics papers, but you should read this one. The paper bluntly calls much of the published work in macroeconomics “unscientific”; Romer re-labels some widely used economic terms as “phlogiston,” “aether,” and “caloric,” and describes the entire field as “post-real” macroeconomics.

In the academic world, Romer’s paper is the equivalent of Martin Luther nailing his 95 theses to the door of the Castle Church in Wittenberg (which, incidentally, happened 499 years ago this month). As Romer himself has pointed out, few of his criticisms are new, but to date they’ve been stated largely in oblique terms and obscure quarters. The paper has generated considerable comment and controversy, and Romer has responded to criticism in a measured but forceful way. But now this debate is very much out in the open.

It may seem like macroeconomics and the Nobel Prize in economics are a bit far afield for an urban policy focused website like City Observatory. They aren’t. Getting the macroeconomy right is an essential precondition for healthy cities. In recent years, austerity policies, predicated on the flawed theories of the real business cycle and the advice of their promulgators, have led national governments to throttle back economic growth. In the decade prior, flawed policies and a blind faith in the power of the market to discipline financial institutions produced both a wasteful bubble of housing investment and a subsequently devastating economic collapse. Macroeconomics matters deeply to cities. And as we’ve written, Romer’s work has drawn a clear line between the institutions and innovation of cities and the process of long-term growth.

Speaking truth to power, in terms that cannot be misunderstood

Romer’s critique also raises an important question about tone and rhetoric. He’s taken the extraordinary step of bluntly and personally challenging some of the acknowledged leaders in the field (including Nobel laureates). He has done so at no small risk to his own career. But as he argues, in the spirit of Voltaire, our primary allegiance has to be to the truth, and not to the favorable opinions of a cadre of peers.

Especially this year, when the rhetoric of the nation’s presidential campaign has fallen to an all-time low, it may seem like a bad time to celebrate apparent incivility. Academic debates, at least those that appear in print, are supposed to be muted and polite disagreements rather than brawls. In practice, they are so opaque and inaccessible that few outside the profession can even tell there’s a disagreement. For all scientists, but especially economists, it’s more important that they write in a way that cannot possibly be misunderstood, even at the cost of offending someone.

Another Nobel laureate in economics, Paul Krugman, reached a similar conclusion:

Outsider positions, like that of being an iconoclastic columnist at the New York Times, require a lot of effort to get peoples’ attention. It wasn’t nice to characterize the doctrine of expansionary austerity as belief in the confidence fairy, but I do believe that it focused the discussion in a way that a less caustic approach would not have achieved.

And one more point: writing effectively requires that you have a voice, that the passion shows — and too much self-censorship can get in the way, making the writing dull and stiff.   . . . pretending to respect views that you don’t isn’t, and shouldn’t, be part of the job description for economists trying to grapple with these important issues.

The Nobel prize sends a signal to the world, and to the scientific community, about what matters. This is a unique opportunity to do just that.

Paul Romer is awarded the Economics Nobel

Why the leading economist of innovation sees a central role for cities

Two years ago, in 2016, we did our best to nudge the Royal Swedish Academy of Sciences to award the Nobel Prize in economic sciences to Paul Romer. It turns out we were just a couple of years early. Yesterday, the Academy announced Romer as the 2018 economics laureate (along with William Nordhaus).

This should be of far more than academic interest to urbanists. As we argued in our missive two years ago, Romer’s work has a lot to say about cities. With typical modesty, Romer’s response to winning the prize was to move a short, non-technical post summarizing his work to the top of his personal blog. It features a prominent role for cities as engines of innovation, progress and wealth creation:

One of the biggest meta-ideas of modern life is to let people live together in dense urban agglomerations. A second is to allow market forces to guide most of the detailed decisions these people make about how they interact with each other. Together, the city and the market let large groups of people cooperate by discovering new ideas, sharing them, and learning from each other. The benefits can show up as a new design for a coffee cup or wages for a worker that grow with experience acquired in jobs with a sequence of employers. People living in a large city cooperate with residents there and, through many forms of exchange, with residents in other cities too. Cities connect us all together. China’s growth reflects its rapid embrace of these two big meta-ideas, the market and the city.

We won’t try to summarize all of Romer’s work here.  Suffice it to say, he deserves enormous credit for focusing our attention on the importance of creating new ideas as a principal means of driving economic growth and improvements in well being. As the quote above illustrates, he sees a central role for institutions, like cities, in creating the environment in which knowledge creation can flourish. And he’s proven many times that his is a clear and outspoken voice for realizing the potential his theories describe.

Paul Romer at TED University. TED2011. February 28 – March 4, Long Beach, CA. Credit: James Duncan Davidson / TED

First, in a series of papers published a couple of decades ago, Romer was responsible for some of the key breakthroughs in what is called “New Growth Theory,” which re-writes the mechanics of long-term economic growth in a fundamental and optimistic way. We described the key insights from these theories a couple of months ago at City Observatory. Romer has long been short-listed for the prize on account of this work, awaiting, it seems, only sufficient quantities of gray hair to take his turn.

Second, in the past few weeks, Romer has turned the economic world on its head with a scathing critique of deep flaws in the past two decades of macroeconomic theorizing. In a paper entitled, “The Trouble with Macroeconomics,” Romer indicts the state of macroeconomics, and its growing detachment from the real world. The abstract of this paper reads as follows:

For more than three decades, macroeconomics has gone backwards. The treatment of identification now is no more credible than in the early 1970s but escapes challenge because it is so much more opaque. Macroeconomic theorists dismiss mere facts by feigning an obtuse ignorance about such simple assertions as “tight monetary policy can cause a recession.” Their models attribute fluctuations in aggregate variables to imaginary causal forces that are not influenced by the action that any person takes. A parallel with string theory from physics hints at a general failure mode of science that is triggered when respect for highly regarded leaders evolves into a deference to authority that displaces objective fact from its position as the ultimate determinant of scientific truth.

You may never read economics papers, but you should read this one. The paper bluntly calls much of the published work in macroeconomics “unscientific”; Romer re-labels some widely used economic terms as “phlogiston,” “aether,” and “caloric,” and describes the entire field as “post-real” macroeconomics.

In the academic world, Romer’s paper is the equivalent of Martin Luther nailing his 95 theses to the door of the Castle Church in Wittenberg (which, incidentally, happened 499 years ago this month). As Romer himself has pointed out, few of his criticisms are new, but to date they’ve been stated largely in oblique terms and obscure quarters. The paper has generated considerable comment and controversy, and Romer has responded to criticism in a measured but forceful way. But now this debate is very much out in the open.

It may seem like macroeconomics and the Nobel Prize in economics are a bit far afield for an urban policy focused website like City Observatory. They aren’t. Getting the macroeconomy right is an essential precondition for healthy cities. In recent years, austerity policies, predicated on the flawed theories of the real business cycle and the advice of their promulgators, have led national governments to throttle back economic growth. In the decade prior, flawed policies and a blind faith in the power of the market to discipline financial institutions produced both a wasteful bubble of housing investment and a subsequently devastating economic collapse. Macroeconomics matters deeply to cities. And as we’ve written, Romer’s work has drawn a clear line between the institutions and innovation of cities and the process of long-term growth.

Speaking truth to power, in terms that cannot be misunderstood

Romer’s critique also raises an important question about tone and rhetoric. He’s taken the extraordinary step of bluntly and personally challenging some of the acknowledged leaders in the field (including Nobel laureates). He has done so at no small risk to his own career. But as he argues, in the spirit of Voltaire, our primary allegiance has to be to the truth, and not to the favorable opinions of a cadre of peers.

Especially this year, when the rhetoric of the nation’s presidential campaign has fallen to an all-time low, it might seem like a bad time to celebrate apparent incivility. Academic debates–at least those that appear in print–are supposed to be muted and polite disagreements, rather than brawls. In practice, they are so opaque and inaccessible that few outside the profession can even tell that there’s a disagreement. For all scientists, but especially economists, it’s all the more important that they write in a way that cannot possibly be misunderstood, even at the cost of offending someone.

Another Nobel Laureate in Economics, Paul Krugman, reached a similar conclusion:

Outsider positions, like that of being an iconoclastic columnist at the New York Times, require a lot of effort to get peoples’ attention. It wasn’t nice to characterize the doctrine of expansionary austerity as belief in the confidence fairy, but I do believe that it focused the discussion in a way that a less caustic approach would not have achieved.

And one more point: writing effectively requires that you have a voice, that the passion shows — and too much self-censorship can get in the way, making the writing dull and stiff. . . . pretending to respect views that you don’t isn’t, and shouldn’t, be part of the job description for economists trying to grapple with these important issues.

The Nobel prize sends a signal to the world–and to the scientific community–and to urbanists, about what matters.

The price of autonomous cars: why it matters

If you believe the soothsayers–including the CEO of Lyft–our cities will soon be home to swarms of autonomous vehicles that ferry us quietly, cleanly and safely to all of our urban destinations. The technology is developing–and rolling out–at a breakneck pace. Imagine some combination of Uber, electrically powered cars, and robotic control. You’ll use your handheld device to summon a robotic vehicle to pick you up, then drop you off at your destination. Vast fleets of these vehicles will flow through city streets, meeting much of our transportation demand and reducing private car ownership. Big players in the automobile and technology industries are making aggressive bets that this will happen. But the big question behind all this, as we asked in part one of this series yesterday, is “how much will it cost?”

While the news that Uber is now street-testing self-driving cars in Pittsburgh–albeit with full time human supervisors–has heightened expectations that a massive deployment is just around the corner, some are still expressing doubts.  The Wall Street Journal points out that the initial deployment of autonomous vehicles may be restricted to well-mapped urban areas, slow speeds (under 25 miles per hour) and good weather conditions.  It could be twenty years before we have “go anywhere” autonomous vehicles.

And those contemplating the widespread availability of self-driving cars are predicting everything from a new urban nirvana to a hellish exurban dystopia. The optimists see a world where parking spaces are beaten into plowshares, the carnage from car crashes is eliminated, greenhouse gas emissions fall sharply, and the young, the old, and the infirm (those who can’t drive) have easy access to door-to-door travel. The pessimists visualize a kind of exurban dystopia, with mass unemployment for those who now make their living driving vehicles, and where cheap and comfortable autonomous vehicles facilitate a new wave of population decentralization and sprawl.

To an economist, all of these projections hinge on a single fact about autonomous vehicles that we don’t yet know:  how much they will cost to operate.  If they’re cheap, they’ll be adopted more quickly and widely and have a much more disruptive effect.  If they’re more expensive than private cars or transit or biking or walking, they’ll be adopted more slowly, and probably have less impact on the transport system. (It’s worth noting that despite their notoriety, today’s Uber and Lyft ridesharing services have been used by less than 15 percent of the population).  Whether autonomous vehicles become commonplace–or dominant–or whether they remain a niche product, for a select segment of the population or some restricted geography, will depend on how much they cost.

As we reported yesterday, the consensus of estimates is that fleets of autonomous vehicles would likely cost between about 30 and 50 cents per mile to operate sometime in the next one to two decades. That’s potentially a good deal cheaper than the 50 to 85 cents average operating cost for a conventional privately owned vehicle. All of these estimates assume that the hardware and software for navigation and vehicle control, including computers, sensors and communications, though expensive today, will decline in cost as the technology matures. Some of the savings come from a combination of electric propulsion and, perhaps, smaller, purpose-built “pod” vehicles. But most of the savings come from greater utilization. Privately owned cars, it is frequently noted, sit idle about 90 percent of the time. In theory, at least, fleets of autonomous vehicles would be in nearly constant motion, taking up less space for storage and doing more work.
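The utilization argument can be made concrete with a toy calculation. The figures below–annual fixed costs and annual mileage for each kind of vehicle–are illustrative assumptions, not numbers from the studies cited here; only the logic matters: costs that accrue per year rather than per mile shrink, on a per-mile basis, as a vehicle works harder.

```python
# Sketch: how utilization drives down per-mile fixed costs.
# All dollar and mileage figures are illustrative assumptions.

def fixed_cost_per_mile(annual_fixed_cost, miles_per_year):
    """Spread time-based costs (depreciation, insurance, storage) over annual miles."""
    return annual_fixed_cost / miles_per_year

# A privately owned car: assume ~$5,000/year in fixed costs, 12,000 miles/year.
private = fixed_cost_per_mile(5_000, 12_000)

# A fleet AV worked hard: assume higher fixed costs (~$8,000/year, reflecting
# faster depreciation) but ~40,000 miles/year of service.
fleet = fixed_cost_per_mile(8_000, 40_000)

print(f"private: ${private:.2f}/mile, fleet: ${fleet:.2f}/mile")
```

Under these assumptions the fleet vehicle’s fixed cost per mile is roughly half the private car’s, which is the core of the utilization story.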

(Chart: estimates of autonomous vehicle operating costs per mile)

Peak demand and surge pricing

A couple of things to keep in mind as we ponder the meaning of these estimates:  First, cost is not the same as price. While these figures represent what it might cost fleet owners to operate such vehicles, the prices they charge customers will likely be higher, both because they’ll want a profit, and because travel demand at some peak times (and locations) will exceed capacity.  

And that’s the big obstacle to realizing the theoretical higher utilization of autonomous vehicles. Demand for travel isn’t spread evenly throughout the day. Many more of us want to travel at certain times (especially early in the morning and late in the afternoon), and the presence of these peaks, as we all know, is the defining feature of our urban transportation problem. Whiz-bang technology or not, there simply won’t be enough autonomous vehicles to handle demand at the peak hour, for two reasons. First, fleet operators won’t want to own enough vehicles to meet the peak, since those vehicles would sit idle the rest of the time. Second is what Jarrett Walker has called the “geometry” problem: there simply isn’t enough room on city streets and highways to accommodate all the potential peak travelers if each is in a personal vehicle.

Consider a practical example. One prominent study, by Columbia University’s Earth Institute, predicts that it would be possible to run autonomous vehicles in Manhattan for 40 cents per mile. That’s far cheaper than current modes of travel–including taxi, ridesharing, private cars and even the subway or bus for trips of less than five miles–so it’s likely that many more people will want to take advantage of autonomous vehicles than there will be vehicles to accommodate them. So, at the peak, autonomous vehicles will undoubtedly charge a surge fare, just as Uber and Lyft do now.

The competitive challenge to transit, especially off-peak

Most of the estimates presented here suggest fully autonomous vehicles will be cheaper than privately owned conventional vehicles. It’s also likely that they will be less expensive than transit for many trips. In many cities the typical bus trip is only 2 or 3 miles long; if the price of an autonomous vehicle is less than 50 cents per mile, the cost of such a trip (door-to-door, in a non-shared vehicle) will be less than the transit fare. Autonomous vehicles could easily cannibalize much of the transit market, especially in off-peak hours.
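A quick sketch of that short-trip comparison. The 50-cents-per-mile price and the $2.50 flat fare are assumptions for illustration, not figures from any particular operator or transit agency.

```python
# Illustrative comparison: a door-to-door AV trip priced per mile
# vs. a flat transit fare. Both prices are assumed, not sourced.

AV_PRICE_PER_MILE = 0.50   # assumed per-mile AV price
TRANSIT_FARE = 2.50        # assumed flat transit fare

def av_trip_cost(miles):
    return miles * AV_PRICE_PER_MILE

for miles in (2, 3, 5):
    cheaper = "AV" if av_trip_cost(miles) < TRANSIT_FARE else "transit"
    print(f"{miles}-mile trip: AV ${av_trip_cost(miles):.2f} vs fare ${TRANSIT_FARE:.2f} -> {cheaper}")
```

Under these assumptions the AV undercuts the flat fare on any trip shorter than five miles, which is exactly the range where most bus trips fall.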

And because they can charge fares much higher than costs at the peak, operators will likely discount off-peak fares to below cost.  That may mean at non-peak times, autonomous vehicles may be available to travelers at prices lower than the estimates shown here.  Simply put, as long as operators cover their variable costs–which are likely to be electricity and tires–they needn’t worry about covering their fixed costs (which can be paid for from peak period profits).

Behavioral effects of per-mile pricing

The silver lining here–if there is one–is that the kind of per-mile pricing fleet vendors are likely to employ for autonomous vehicles will send much stronger signals to consumers about the effects of their travel decisions than our current, mostly flat-rate travel pricing. Today, most households own automobiles and pay the same fixed costs (car payments, insurance) whether or not they use their vehicle for an additional trip. Because the marginal cost of a trip is often perceived to be just the cost of fuel (perhaps 15-20 cents per mile), households use cars for trips that could easily be taken by other modes. That calculus changes if each trip has a separate additional cost–and consumers are likely to alter their behavior accordingly. Per-mile pricing will make travelers more aware of, and likely more sensitive to, the tradeoffs of different modes and locations. The evidence from evaluations of car-sharing programs, like Zipcar, shows that per-mile pricing tends to lead many households to reduce the number of cars they own–or give up car ownership altogether.
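The flat-rate-versus-per-mile argument can be sketched numerically. The perceived 18-cents-per-mile fuel cost sits in the 15-20 cent range mentioned above; the full-cost and fleet-price figures are hypothetical assumptions for illustration.

```python
# Sketch of the pricing signal: a car owner "feels" only fuel as the marginal
# cost of an extra trip, while a per-mile fleet price exposes something much
# closer to the full cost. All prices are illustrative assumptions.

PERCEIVED_FUEL_COST = 0.18   # $/mile, what an owner perceives per extra trip
FULL_COST = 0.60             # $/mile, assumed fixed + variable costs combined
FLEET_PRICE = 0.50           # $/mile, hypothetical per-mile AV price

trip_miles = 4
print(f"owner's perceived cost: ${PERCEIVED_FUEL_COST * trip_miles:.2f}")
print(f"true cost of owning:    ${FULL_COST * trip_miles:.2f}")
print(f"per-mile fleet price:   ${FLEET_PRICE * trip_miles:.2f}")
```

A traveler quoted the full per-mile price for a four-mile trip faces several times the cost they perceive as a car owner, which is why per-mile pricing should make alternatives like walking, biking, or transit look more attractive at the margin.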

The price of disruption

If these cost estimates are correct, and if autonomous cars are actually feasible any time soon, the lower cost of single occupancy vehicle travel and a different pricing scheme will likely trigger major changes in travel behavior. At the same time, other institutions, like road-building agencies and transit providers, may see a major disruption of their business models. A move to electric cars threatens the principal revenue source of road-building agencies, the gas tax. And an overall decline in vehicle ownership, coupled with more intense peak demand, could be a state or city transportation department’s fiscal nightmare. Whether that happens depends a lot on whether these forecasts of relatively inexpensive autonomous vehicles pan out.


What price for autonomous vehicles?

It’s easy to focus on the technology, but pricing will determine autonomous vehicles’ impact.

Everyone’s trying hard to imagine what a future full of autonomous cars might look like. Sure, there are big questions about whether a technology company or a conventional car company will succeed, whether the critical factor will be manufacturing prowess or software sophistication, and all manner of other technical details.


But for economists — and also for urbanists of all stripes — a very big question has to be:  How much will autonomous cars cost?  We’re going to tackle this important question in two parts.  Part one–today–assembles some of the estimates that have been made.  We’ll aim to ballpark the approximate cost per mile of autonomous vehicles.  In part two–tomorrow–we’ll consider what this range of estimates implies for the future of urban transportation, and for cities themselves, because transportation and urban form are so closely interrelated.

So here is a preliminary list of estimates of the cost per mile of operating autonomous vehicles. We’ve reproduced data from a number of sources, including universities, manufacturers, and consulting firms. It’s difficult to make direct comparisons between these estimates, because they not only employ different assumptions, but also forecast costs for different future years (with unstated assumptions about inflation). There’s also significant disagreement about the cost of operating current vehicles, with estimates ranging from 59 to 84 cents per mile. (For this commentary, we’ve assembled these estimates without undertaking our own analysis of their accuracy or reliability; we encourage interested readers to click through, read each of these studies, and draw their own conclusions about their utility.)

Ford: $1.00 per mile.
Ford (2016) thinks it can reduce the cost of highly automated vehicles to about $1.00 per mile, making them highly competitive with taxis, which it estimates cost $6.00 per mile.

Rocky Mountain Institute: 51 cents per mile (2025), 33 cents per mile (2040).
Rocky Mountain Institute (2016) estimates that in 2018, autonomous vehicle costs will be roughly competitive with current vehicles (about 84 cents per mile), but will decline steadily, to 51 cents per mile by 2025 and 33 cents per mile by 2040.

Morgan Stanley: 50 cents per mile (2030).
Morgan Stanley (2016) estimates autonomous vehicles will cost about 50 cents per mile by 2030, compared to about 74 cents per mile for privately owned standard vehicles.

KPMG: 43 cents per mile.
KPMG (2016) estimates costs of 43 cents per mile in total. It estimates current cars have variable costs of 21 cents per mile and fixed costs of 61 cents per mile for a total of 84 cents per mile. KPMG estimates new shared AVs would cost 17 cents per mile variable and 26 cents per mile fixed (43 cents per mile total), with a $25,000 car fully depreciated in 3 years of being driven about 40,000 miles per year.

Deloitte: 31 to 46 cents per mile.
Deloitte (2016) estimates costs of 46 cents to as little as 31 cents per mile for autonomous vehicles; the lower estimate corresponds to low-speed, purpose-built pods.

Barclays: 29 cents per mile (2040).
Barclays (2016) estimates the cost of autonomous vehicles at 29 cents per mile by 2040, compared to about 66 cents per mile for conventional, privately owned vehicles today.

Columbia University Earth Institute: 15 to 41 cents per mile.
Columbia University Earth Institute (2013) estimates costs of autonomous vehicles would be about 41 cents per mile for full-sized vehicles, and could be as little as 15 cents per mile for purpose-built low-speed vehicles. This compares to costs of 59 to 75 cents per mile for conventional privately owned automobiles.

The estimates for future costs range from as much as a dollar per mile (Ford’s near-term estimate of its cost of operation for what it refers to as “highly automated vehicles”), to an estimate of 15 cents per mile a decade or more from now for the operation of small purpose-built low-speed urban “pods,” like Google’s prototype autonomous vehicle. Overall, the estimates imply that fleets of autonomous vehicles could be operated in US cities in the next decade or two for something between 30 and 50 cents per mile.

And, for a variety of reasons–which we’ll explore in more detail tomorrow–the deployment of autonomous vehicles is much more likely to occur in cities. The critical factor is that market demand will be strongest in cities. According to the Wall Street Journal, autonomous vehicles will initially be restricted to low speeds, avoid bad weather, and stay within carefully circumscribed territories (given the cost and complexity of constructing the detailed maps autonomous vehicles need to navigate streets), all factors that point to cities.

These estimates hinge on a number of important assumptions about operating costs. The highest estimates generally assume automating something resembling existing vehicles; operating costs are assumed to be lower with electric propulsion and smaller vehicles. A key cost driver is vehicle utilization and lifetime: fleets of autonomous vehicles are assumed to be used much more intensively than today’s privately owned cars, with a big reduction in capital cost per mile traveled.

There are some other big assumptions about whole categories of costs, and about the policy environment looking forward. Todd Litman raises the concern that autonomous vehicles will require relatively high expenditures for cleaning, maintenance and vandalism repair, as much as hundreds of dollars per week. It’s also not clear that any of the estimates for the costs of operating electric vehicles includes any kind of road user fee to replace the gas tax revenues now paid by internal combustion powered vehicles.

Despite these uncertainties, the available estimates suggest that successful autonomous vehicles could be substantially cheaper than today’s cars. And if they’re available on-demand and a la carte–freeing users from the costs of ownership, parking, maintenance and insurance–this may engender large changes in consumer and travel behavior. Tomorrow, we’ll explore what these effects might be.

 


 

Cities are powering the rebound in national income growth

Behind the big headlines about a national income rebound: thriving city economies are the driver.

As economic headlines go, it was pretty dramatic and upbeat news: The US recorded a 5.2 percent increase in real household incomes, not only the first increase since 2007, but also the biggest one-year increase ever recorded. It’s a signal that the national economy is finally recovering from the Great Recession, the worst and most prolonged economic downturn in eight decades.

Fittingly, The Wall Street Journal headline proclaimed the good news:

(Image: Wall Street Journal headline announcing the income gains)

But dig deeper into the data, and there’s an even more interesting development: the big growth in US incomes was powered by the growth of incomes in cities. The following chart shows the inflation-adjusted change in incomes between 2014 and 2015 for the nation’s cities, suburbs and rural areas. The key numbers here are seven, four and two: the average city household’s income grew seven percent, the average suburban household’s income grew four percent, and the average rural household’s income declined by two percent. (NOTE: This two percent decline appears to be an error based on changed geographic definitions of what constitutes rural areas; see our comment below.) The more urban you were in 2015, the faster your income rose.

(Chart: real median household income change, 2014-2015, by city, suburban, and rural residence)

Source:  Census Bureau, Income and Poverty in the United States: 2015

For those who follow this data closely, this is yet another strong piece of evidence that the US national economy is being powered by what’s happening inside cities.  If the nation’s incomes had grown only as fast as those in rural and suburban areas, the national income increase would have been cut roughly in half, to an underwhelming 2.5 percent. The gain in city incomes hasn’t escaped the attention of other analysts. At Vox, Tim Lee flagged the disparity between city and suburban and rural income gains, summarizing it as “a fundamentally urban recovery.”
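As a rough illustration of that decomposition, here’s a weighted-average sketch. The growth rates are the seven, four, and minus-two percent figures from the Census report discussed above; the population weights are assumptions chosen for illustration (picked so the weighted average lands near the reported 5.2 percent), not actual Census shares, so the counterfactual below won’t match the 2.5 percent figure in the text exactly.

```python
# Back-of-the-envelope: national income growth as a weighted average of
# city, suburban, and rural growth rates. Weights are assumed, not sourced.

growth = {"city": 0.07, "suburb": 0.04, "rural": -0.02}   # from the Census figures
weights = {"city": 0.50, "suburb": 0.45, "rural": 0.05}   # illustrative shares

national = sum(growth[g] * weights[g] for g in growth)
print(f"weighted national growth: {national:.1%}")

# Counterfactual: if city households had only matched the suburban rate,
# the same weights imply a noticeably smaller national gain.
counterfactual = national - (growth["city"] - growth["suburb"]) * weights["city"]
print(f"if cities grew at the suburban rate: {counterfactual:.1%}")
```

The point of the exercise is not the exact numbers but the mechanism: with city households growing fastest, the national average leans heavily on the urban component.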

As we pointed out last year, urban centers are, for the first time in decades, gaining jobs faster than their surrounding peripheries.  Measured by job growth, large metropolitan areas–those with a million or more population–have grown much faster than smaller metros and rural areas. The shift to the center is also reflected in housing prices; homes in vibrant urban centers have registered significant increases relative to the price of suburban homes.

There’s an unfortunate tendency to portray this data in a “winners” and “losers” frame: Vox headlines its story as cities getting richer and rural areas getting left behind. But really what’s at work here is a fundamental shift in the forces that are propelling national economic growth. The kinds of industries that are growing today, in technology, software and a range of high value services, are industries that depend on the talent, density and vibrancy of city economies for their success. It’s not that we’ve somehow simply reallocated some activities that could just as easily occur in rural areas to cities; much of this growth is uniquely the product of urban economies.

A particularly misleading connotation of the word “recovery” is that it seems to suggest that in the wake of a recession, economies rebound simply by restoring exactly the kinds and patterns of jobs and industries they lost. What really happens is what Joseph Schumpeter famously called “creative destruction”: the economy grows by creating new ideas, jobs, and industries, often in new locations. As we shift increasingly to a knowledge-driven economy, that process is occurring most and fastest in the nation’s cities, where talented workers are choosing to live, and where businesses seeking to hire them are starting, moving and expanding.

This is not your father’s or your grandmother’s recovery. The US economy is changing in a fundamental way, to be more urban-centered and urban-driven. It’s an open question whether we’ll recognize that this is now the dynamic that drives the national economy, and fashion policies that capitalize on cities as a critical source of economic strength.

A few technical notes

The data for these estimates come from the Current Population Survey which is used to generate national estimates, rather than the more fine grained geographies reported in the American Community Survey. For its annual report on income and poverty, the Census Bureau provides only a limited geographic breakdown of income data. Specifically, they report the differences in income and poverty for metropolitan and non-metropolitan areas, and within metropolitan areas, the differences between “principal cities”–generally the largest and first named city in a metro area–and the remainder of the metropolitan area.  Although city boundaries are less than ideal for making geographic comparisons at the national level, it is a rough, first-order way of charting the different trajectories of cities and suburbs.

When the 2015 ACS data becomes available later this year, we and others will want to examine it more closely to better understand the broad trends in this week’s report.

UPDATE: September 19:  Census Report likely under-estimated rural income growth

The New York Times Upshot points out that the reported decline in incomes in rural areas is probably an error, due to the changing definition of what constitutes “rural” areas in the Current Population Survey.

 

 

Counting women entrepreneurs

Entrepreneurship is both a key driver of economic activity and an essential path to economic opportunity for millions of Americans. For much of our history, entrepreneurship has been dominated by men. But in recent decades, women have overcome many of the social and other obstacles to entrepreneurship, and as a result the number of women active in starting and growing their own businesses has been increasing.

A new survey, conducted by the Census Bureau in cooperation with the Ewing Marion Kauffman Foundation, provides a rich source of data about the economic contributions of women-owned businesses. The Annual Survey of Entrepreneurs is the first iteration of a survey that asks detailed questions about key demographic characteristics of business owners, including gender, race and ethnicity, and veteran status. And unlike other business data, the entrepreneurship survey reports data by age of business, allowing us to examine separately the economic contributions of newly formed businesses.

The survey focuses on businesses with paid employees, and so generally excludes self-employed individuals working on their own. In 2014, the survey reports that there were more than 5.4 million businesses with a payroll in the United States. Of these, about 270,000 businesses were public corporations (or other business entities for which the gender or other demographic characteristics of owners could not be ascertained). These businesses employed almost 60 million workers (52 percent of total payroll employment).  The remaining 5.1 million firms with identifiable owners employed about 55 million workers.  The survey concludes that nearly 1.1 million businesses, or 20.4 percent of those with individually identifiable owners, were owned exclusively by women and employed about 8.5 million workers.  About 10.8 percent of these women-owned businesses had started in the past two years, compared to about 8.9 percent of all employer firms.  Women-owned businesses are found in all economic sectors, but are disproportionately represented in education, health and social services, where they comprise about 28 percent of all employer businesses.

The report also offers data on business ownership patterns for the 50 largest US metropolitan areas.   We thought it would be interesting to see how different areas ranked in terms of the share of all businesses with employment that were owned by women.

Here’s a listing of the number of women-owned businesses, and the share of total businesses owned by women, for these fifty metropolitan areas.

 

Among the cities with the highest proportions of women-owned businesses with a payroll are Denver, Atlanta and Baltimore, with nearly 1 in 4 businesses (for which demographic characteristics of owners could be identified) being owned by women.  The metropolitan areas with the lowest fraction of women-owned businesses include Salt Lake City, Memphis, and Birmingham, where only about 17-18 percent of businesses are owned by women.

When we map the fraction of women-owned businesses, some geographic patterns become apparent.  In general, the proportion of women businesses is higher in Western metropolitan areas, and in many Southern metropolitan areas, particularly in Florida, Texas and Georgia.  In the Northeast, Midwest and in much of the South, the share of women-owned businesses tends to be much smaller. Washington and Baltimore appear to be outliers in their geographic region, as do St. Louis and Kansas City. From Philadelphia to Boston, the Northeast corridor has below average shares of women-owned businesses.

In addition to identifying the gender of business owners, the survey also provides insight into other ownership characteristics, including race and ethnicity; we’ll examine some of these findings in a future commentary. The Census Bureau plans to conduct its new survey of entrepreneurs on an annual basis. This promises to be a useful way of benchmarking efforts to draw more Americans of every stripe into business ownership.

McMansions Fading Away?

Just a few months ago we were being told–erroneously, in our view–that the McMansion was making a big comeback. Then, last week, there was a wave of stories lamenting the declining value of McMansions. Bloomberg published: “McMansions define ugly in a new way: They’re a bad investment–Shoddy construction, ostentatious design—and low resale values.”  The Chicago Tribune chimed in: “The McMansion’s day has come and gone.” Whither these monster homes?

Even “Downton Abbey” is past its heyday (Highclere Castle)

First, as we’ve noted, it’s problematic to draw conclusions about the state of the McMansion business by looking at the share of newly built homes of 4,000 square feet or larger (one of the standard definitions of a McMansion). The problem is that in weak housing markets (such as the one we’ve been experiencing for the better part of a decade in the wake of the collapse of the housing bubble) the demand for small homes falls far more than the demand for large, expensive ones. So the share of big homes increases (as does the measured median size of new homes). And indeed, that’s exactly what happened post-2007: the number of new smaller homes fell by 60 percent, while the number of new McMansions fell by only 43 percent, so big homes became a bigger share (of a much smaller housing market).  Several otherwise quite numerate reports gullibly treated this increased market share as evidence of a rebound in the McMansion market; it isn’t.
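This composition effect is easy to verify with a little arithmetic. Here’s a sketch using the cited percentage declines; the baseline counts are hypothetical, chosen only to illustrate the mechanism:

```python
# Illustrative pre-2007 counts (hypothetical), with the cited declines:
# small-home construction falls 60%, McMansion construction falls 43%.
small_before, big_before = 1000.0, 100.0
small_after = small_before * (1 - 0.60)   # 400
big_after = big_before * (1 - 0.43)       # 57

share_before = big_before / (small_before + big_before)
share_after = big_after / (small_after + big_after)

print(f"McMansion share before: {share_before:.1%}")  # 9.1%
print(f"McMansion share after:  {share_after:.1%}")   # 12.5%
# The McMansion share rises even though McMansion construction fell sharply,
# because small-home construction fell even faster.
```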

We proposed a McMansions-per-millionaire measure as a better way of gauging the demand for these structures, and showed that the ratio of big new houses to multi-millionaire households did indeed peak in 2002, and has failed to recover since. We built about 16 McMansions per 1,000 multi-millionaires in 2002, and only about 5 in 2014.

Another way of assessing the market demand for behemoth homes is by looking at the prices they command in the market. What triggered these recent downbeat stories about McMansions was an analysis entitled “Are McMansions Falling Out of Favor?” by Trulia’s Ralph McLaughlin, looking at the comparative price trajectories of 3,000 to 5,000 square foot homes built between 2001 and 2007 and all other homes in each metropolitan area.  McLaughlin found that since 2012, the premium that buyers paid for these big houses fell pretty sharply in most major metropolitan markets around the country.  Overall, the big house premium fell from about 137 percent in 2012 to 118 percent this year.


In a way, this shouldn’t be too surprising. Part of the luster of a McMansion is not just its size, but its newness. Like new cars, McMansions may have their highest value when they leave the showroom (or the “Street of Dreams” moves on). According to the Chicago Tribune’s reporting on this story, apparently today’s McMansion buyer wants dark floors, gray walls, and white kitchen cabinets, very different materials and color schemes than last decade’s big houses. As they age, we would expect all vintage 2005 houses to depreciate, relative to the market. This gradual decline in value is essential to the process of filtering–housing becomes more affordable as it ages. (And at some point, usually many decades later, when the surviving old homes acquire the cachet of “historical” — they may begin appreciating again, relative to the rest of the housing stock).

There’s another factor working against the McMansion, in our view. These large homes have generally been built on the periphery of the metropolitan area, in suburban or exurban greenfields. As we’ve shown, the growing demand for walkability and urban amenities has meant an increase in prices for more central housing relative to more distant locations. It’s likely that this trend is also hastening the erosion of the big house premium.

Finally, there is a financial angle here, too. McMansions were at the apex of the housing price appreciation frenzy of the bubble years. You took the sizable appreciation in your previous house, and rolled it over into an even larger house–hoping to reap further gains when it appreciated. The move-up and trade-up demand that fueled McMansion demand has mostly evaporated. Despite gains in recent months, nominal home values in most markets haven’t recovered to pre-recession levels, and adjusted for inflation, many home owners have yet to see a gain on their real estate investment. According to Zillow, the effective negative equity rate (homeowners who have less than 20 percent equity in their homes) was 35 percent.

There will always be people with more money than taste, so there will always be a market for McMansions (or whatever fashion they might evolve into next). But many of the market factors that combined to boost their fortunes a decade ago have changed. Consumers now know that home prices won’t increase without fail and the interest in ex-urban living has waned. Homeownership overall is down, and much of the growth in homeownership will be among older adults (who probably won’t be up-sizing).

Where are African-American entrepreneurs?

Entrepreneurship is both a key driver of economic activity and an essential path to economic opportunity for millions of Americans. Historically, discrimination and lower levels of wealth and income have been barriers to entrepreneurship by African-Americans, but that’s begun to change. According to newly released data from the Census Bureau, it’s now estimated that there are more than 108,000 African-American-owned businesses with a payroll in the U.S.

The new survey, conducted by the Census Bureau in cooperation with the Ewing Marion Kauffman Foundation, provides a rich source of data about the economic contributions of African-American-owned businesses. Called the Annual Survey of Entrepreneurs, this is the first iteration of a survey that asks detailed questions about key demographic characteristics of business owners, including gender, race and ethnicity, and veteran status. And unlike other business data, the entrepreneurship survey reports data by age of business, allowing us to examine separately the economic contributions of newly formed businesses.

The survey focuses on businesses with paid employees, and so generally excludes self-employed individuals working on their own. In 2014, the survey reports that there were more than 5.4 million businesses with a payroll in the United States. Of these, about 270,000 businesses were public corporations (or other business entities for which the gender or other demographic characteristics of owners could not be ascertained). These large corporate businesses employed almost 60 million workers (52 percent of total payroll employment).  The remaining 5.1 million firms with identifiable owners employed about 55 million workers.

The survey concludes that about 108,000 businesses, or roughly two percent of those with individually identifiable owners, were owned exclusively by African-Americans. Together these businesses employed more than 1 million workers nationally.  On average, African-American-owned businesses are younger than other businesses; about 14.1 percent of them had started in the past two years, compared to about 8.9 percent of all employer firms. African-American-owned businesses are found in all economic sectors, but are disproportionately represented in health and social services.  About 28 percent of African-American-owned businesses are engaged in health and social services, compared to about 12 percent of all individually owned businesses.

The report also offers data on business ownership patterns for the 50 largest US metropolitan areas.   We thought it would be interesting to see how different areas ranked in terms of the share of all businesses with employment that were owned by African-Americans.

Here’s a listing of the number of African-American owned businesses per 1,000 African-Americans in the population in each of the fifty largest US metropolitan areas. Think of this as an indicator of the likelihood that an African-American owns a business with a payroll in each of these places. Overall, about three in one thousand African-Americans in these fifty large metropolitan areas own a business.
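For readers who want to replicate the indicator, here’s a minimal sketch of the per-capita calculation described above (the example counts are hypothetical, not the report’s actual figures):

```python
def owners_per_thousand(owned_firms: int, population: int) -> float:
    """Employer-business owners per 1,000 residents of a given group.

    This is the likelihood indicator described in the text: firms with
    a payroll owned by members of a group, scaled by that group's
    population in the metro area.
    """
    return 1000 * owned_firms / population

# A hypothetical metro with 1,500 African-American-owned employer firms
# and 500,000 African-American residents:
print(owners_per_thousand(1500, 500_000))  # 3.0 per 1,000 -- about the
                                           # overall average cited above
```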

Among the cities with the highest proportions of business owners among the African-American population are San Jose, St. Louis, Denver and Seattle. Each of these cities has about six or seven African-American entrepreneurs per 1,000 African-American residents. San Jose is famously the capital of Silicon Valley, which may explain why such a relatively high fraction of its African-American residents own businesses with a payroll. In contrast, Louisville, Buffalo, Memphis and Cleveland have much lower rates of African-American entrepreneurship; each of these metro areas has fewer than two African-American entrepreneurs per 1,000 African-American residents.

Another way to think about these data is to compare the share of the population in each metropolitan area that is African-American with the share of entrepreneurs who are African-American. The following chart shows this information. As one would expect, as the share of the African-American population increases, so too does the fraction of entrepreneurs who are African-American. There are some clear outliers. As shown on the chart, St. Louis has somewhat more African-American entrepreneurs than one would expect, given the size of its African-American population, and conversely, New Orleans has fewer. But on average, entrepreneurship is much less common among African-Americans than among the overall population, in every metro area. On average, the African-American share of entrepreneurs is about one-fifth the African-American share of the population of a given metropolitan area.

In a previous post, we examined the geography of women-owned businesses.   The Census plans to conduct its new survey of entrepreneurs on an annual basis. This promises to be a useful way of benchmarking efforts to draw more Americans of every stripe into business ownership.

Who patronizes small retailers?

 

Urban developers regularly wax eloquent over the importance of local small businesses.  But ultimately, businesses depend on customer support. So, in what markets do customers routinely support small businesses? Getting data that reflects on this question is often very difficult. A new source of “big data” on consumer spending patterns comes from the JPMorgan Chase Institute, which uses anonymized credit and debit card data from more than 16 billion transactions by the bank’s 50 million customers to measure consumer spending patterns across the United States.  Their “Local Consumer Commerce Index” reports detailed data on spending patterns in 15 major metropolitan areas across the country.

Small Business (Flickr: La Citta Vita)

The company’s proprietary credit and debit card data aren’t complete or perfect, of course. To the extent there are demographic variations in the bank’s market share in different metropolitan areas, or different patterns of credit and debit card use compared to cash purchases (or checks), these data won’t be completely representative. But they do represent a sizable sample of consumer spending, and the Institute’s analyses show that they are roughly congruent with government measures of retail sales activity. The data cover many daily purchases of a wide range of non-durable goods and services; it’s likely that they under-report purchases of major durables, like cars and appliances, which are frequently financed through bank or store credit, rather than purchased with credit or debit cards.

The Institute is now publishing a monthly analysis of its index data that looks at changes in retail sales by metro market, by age, by income group, and by major product category (restaurants, fuel, etc). The report also estimates how much people spend in their home metropolitan area, as opposed to purchases in other metropolitan areas.

The Institute also classifies purchases according to size of business. We mined these data–which the Institute makes freely available here–to examine what fraction of consumer spending in each covered metropolitan market goes to “small” businesses.  The JPMC Institute classifies as “large” all those firms who have a market share of 8 percent or greater in a particular product category, and then divides the remaining businesses into “medium” and “small” establishments.
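As a rough sketch, the Institute’s size classification might look something like this. The 8 percent market-share threshold for “large” firms comes from the text; the cutoff separating “medium” from “small” is our own placeholder, since the article doesn’t specify it:

```python
def classify_firm(category_market_share: float,
                  medium_cutoff: float = 0.01) -> str:
    """Classify a firm by its market share within a product category.

    Firms at or above an 8% category share are 'large' (the JPMC
    Institute rule cited above); the boundary between 'medium' and
    'small' is a hypothetical placeholder.
    """
    if category_market_share >= 0.08:
        return "large"
    elif category_market_share >= medium_cutoff:
        return "medium"
    return "small"

print(classify_firm(0.12))   # large
print(classify_firm(0.03))   # medium
print(classify_firm(0.002))  # small
```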

So what do these data tell us about where consumers are most likely to patronize smaller businesses?

First, there’s considerable variation among metropolitan areas.  Overall, small businesses account for about 32.6 percent of retail sales, according to the Institute’s estimates.  In New York City (think bodegas and boutiques) small establishments account for 36 percent of sales.  In Columbus, the comparable figure is 23 percent.  Here are the data for the 15 metropolitan areas covered in the JPMorgan Chase Institute’s study:

Second, the Institute reports that its own tabulations of retail spending data show that people who live in urban centers spend a larger fraction of their retail dollar at smaller businesses than those who live in suburbs.  They conclude: “central cities uniformly have more spending at small and medium enterprises than do their surrounding metropolitan areas.”  Their data show that purchases at small and medium sized firms are 10 to 15 percentage points higher in central cities than in their surrounding suburbs.

The JPMC Institute data are an interesting and useful new window into consumer spending patterns. You can learn more about the data, and read the insights from the Institute’s analysts in their report that describes the methodology and key findings:  https://www.jpmorganchase.com/corporate/institute/document/jpmc-institute-local-commerce-report.pdf.

 

The Economic Value of Walkability: New Evidence

One of the hallmarks of great urban spaces is walkability–places with lots of destinations and points of interest in close proximity to one another, buzzing sidewalks, people to watch, interesting public spaces–all these are things that the experts and market surveys are telling us people want to have.

Walkable places. (Flickr: TMImages PDX)

It’s all well and good to acknowledge walkability in the abstract, but tough-minded economists (and those with an interest in public policy) really want to know: what’s it worth?  How much, in dollars-and-cents terms, do people value walkable neighborhoods?  Thanks to the researchers at Redfin, we have a new set of estimates of the economic value of walkability.

Redfin used an economic tool called “hedonic regression” to examine more than a million home sales in major markets around the country, and to tease out the separate contributions of a house’s lot size, age, number of bedrooms and bathrooms, square footage and neighborhood characteristics (like average income). In addition, the Redfin model included an examination of each property’s Walk Score.  Walk Score is an algorithm that estimates the walkability of every address in the United States on a scale of 0 to 100 based on its proximity to a number of common destinations like schools, stores, coffee shops, parks and restaurants.
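For readers unfamiliar with the technique, here’s a toy hedonic regression on synthetic data: regress log sale price on house attributes plus Walk Score, and read off the walkability coefficient. This is only an illustration of the method, not Redfin’s actual model, whose specification and data aren’t given in the article:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic house attributes (all hypothetical).
sqft = rng.uniform(800, 4000, n)
bedrooms = rng.integers(1, 6, n)
age = rng.uniform(0, 80, n)
walk_score = rng.uniform(0, 100, n)

# Synthetic log prices with a built-in walkability premium of
# 0.3% of value per Walk Score point, plus noise.
log_price = (11.0 + 0.0004 * sqft + 0.05 * bedrooms - 0.002 * age
             + 0.003 * walk_score + rng.normal(0, 0.1, n))

# Ordinary least squares via numpy: intercept plus four regressors.
X = np.column_stack([np.ones(n), sqft, bedrooms, age, walk_score])
coef, *_ = np.linalg.lstsq(X, log_price, rcond=None)

print(f"Estimated Walk Score coefficient: {coef[4]:.4f}")  # ≈ 0.003
# i.e., the regression recovers the ~0.3%-per-point premium we built in,
# holding size, bedrooms and age constant -- the essence of the hedonic
# approach.
```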

What they found is that increased walkability was associated with higher home values across the country. On average, a one point increase in a house’s Walk Score was associated with a $3,000 increase in the house’s market value. But their findings have some important nuances.

First, the value of walkability varies from city to city. It’s much more valuable in larger, denser cities, on average, than in smaller ones. A one point increase in Walk Score is worth nearly $4,000 in San Francisco, Washington and Los Angeles, but only $100 to $200 in Orange County or Phoenix.

Second, the relationship between walkability and home value isn’t linear: a one point increase in Walk Score for a home with a very low score doesn’t have nearly as much impact as the same increase for a home with a high Walk Score.  This suggests that there is a kind of minimum threshold of walkability.  For homes with Walk Scores of less than 40, small changes in walkability don’t seem to have much effect on home values. In their book Zillow Talk, Spencer Rascoff and Stan Humphries reached a similar conclusion by a somewhat different statistical route, finding that the big gains in home value were associated with changes toward the high end of the Walk Score scale.
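The threshold behavior described above can be sketched with a toy function. The functional form and numbers are invented for illustration; only the shape (little effect below a Walk Score of about 40, accelerating gains above it) comes from the text:

```python
def walkability_premium(walk_score: float) -> float:
    """Hypothetical % price premium over a car-dependent but otherwise
    identical home. Below the ~40 threshold, extra points add nothing;
    above it, each point adds progressively more (a quadratic ramp)."""
    if walk_score < 40:
        return 0.0
    return ((walk_score - 40) / 10) ** 2  # premium in percent

for ws in (30, 50, 70, 90):
    print(ws, f"{walkability_premium(ws):.0f}%")
# A one-point gain near 90 is worth far more than one near 40 -- the
# nonlinearity both Redfin and Zillow Talk describe.
```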

For their benchmark comparison of different cities, Redfin computed how much a home’s value might be expected to increase if it went from a Walk Score of 60 (somewhat walkable) to a Walk Score of 80 (very walkable). The results are shown here.

Walk_Score_6080

Among the markets they studied, the average impact of raising a typical home’s Walk Score from 60 to 80 was to add more than $100,000 to its market value. In San Francisco, the gain is $188,000; in Phoenix, only a tenth that amount.

Redfin’s estimates parallel those reported by their real estate data rivals at Zillow. Rascoff and Humphries looked at a different set of cities, and examined the effect of a 15-point increase in Walk Score.  They found that this increased home values by an average of 12 percent, with actual increases ranging from 4 percent to 24 percent.

We think the new Redfin results come with one important caveat. We know from a wide variety of research that proximity to the urban core tends to be positively associated with home values in most markets. And it turns out that there is some correlation between Walk Scores and centrality (older, closer-in and denser neighborhoods tend, on average, to have higher Walk Scores). Redfin’s model didn’t adjust its findings for distance to the central business district. What this means is that some of the effect the model attributes to Walk Score may be capturing the value of proximity to the city center, rather than walkability alone.  So as you read these results, you might want to think of them as representing the combined effect of central, walkable neighborhoods.  (Our own estimates, which controlled for centrality, still showed a significant, positive impact for walkability on home values.)

The Redfin study adds to a growing body of economic evidence that strongly supports the intuition of urbanists and the consumer research: Americans attach a large and apparently growing value to the ability to live in walkable neighborhoods.  The high price that we now have to pay to get walkable places ought to be a strong public policy signal that we should be looking for ways to build more such neighborhoods. Too often, as we’ve noted, our current public policies–like zoning–effectively make it illegal to build the kind of dense, interesting, mixed-use neighborhoods that offer the walkability that is in such high demand.

More Driving, More Dying (2016 First Half Update)

More grim statistics from the National Safety Council:  The number of persons fatally injured in traffic crashes in the first half of 2016 grew by 9 percent.  That means we’re on track to see more than 38,000 persons die on the road in 2016, an increase of more than 5,000 from levels recorded just two years ago.

Motor Vehicle Fatality Estimates - 6 month trends

 

Just two weeks ago, we wrote about the traditional summer driving season as a harbinger of the connection between the amount of driving we do and the high crash and fatality rates we experience. And these data show, for the first half of the year, that things are not going well.  As alarming as these statistics are, the bigger question they pose is why crash rates are rising.  And what, if anything, can we do about it?

It’s not the economy, stupid.

There are undoubtedly many factors at work behind the rise in crashes and crash deaths. There’s clearly much more we can do to make our city streets and roadways safer for all travelers.

We have to disagree with the National Safety Council on one key point: we shouldn’t mindlessly blame the economy for our safety woes. In their press release, they attribute the increase in fatalities to  an improving economy, saying:

While many factors likely contributed to the fatality increase, a stronger economy and lower unemployment rates are at the core of the trend.

That’s an unfortunate, and probably incorrect, framing, in our view. Chalking the rise in traffic deaths up to an improving economy seems a bit fatalistic: it implies that more traffic deaths are a sad but inevitable consequence of economic growth, which might prompt some people to shrug off the increase in deaths.  That would be tragically wrong, because, at least through 2013, the nation experienced both a decrease in traffic deaths and an improving economy.

What has changed, since 2014, is not the pace of job growth or the steady decline in the unemployment rate (both of which have been proceeding nicely since the economy bottomed out in 2009), but rather a reversal of the increase in gasoline prices which started in the summer of 2014.  As we pointed out a few weeks ago, gas prices have been steadily declining, and as a direct result, Americans have begun driving more.

Now it would be fair to point out that a three percent increase in driving has been accompanied by a nine percent increase in traffic deaths. But we have good reasons to believe that the additional driving (and additional drivers and additional trips) prompted by cheaper gasoline are exactly the ones that involve some of the highest risks.  A study of gas prices and crash rates found that the relationship was indeed “non-linear”–that small changes in gas prices were associated with disproportionately larger increases in crash rates.
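The cited figures alone imply that risk per mile rose, not just total driving. A quick back-of-the-envelope check:

```python
# If driving (VMT) rose ~3% while traffic deaths rose ~9%, the
# fatality rate per mile driven must have risen as well.
vmt_growth = 1.03        # total miles driven, as a growth factor
fatality_growth = 1.09   # traffic deaths, as a growth factor

rate_change = fatality_growth / vmt_growth - 1
print(f"Implied change in deaths per mile driven: {rate_change:+.1%}")  # +5.8%
```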

Higher gas prices not only discourage driving generally, they seem to have the effect of reducing risky driving, and thus produce a safety dividend. It’s time to do more than just lament tragic statistics: if we want to make any progress toward Vision Zero, we ought to be putting in place policies that bring the price of driving closer to the costs it imposes on society. If people reduce their driving–as they did when gasoline cost more than it does today–there will be fewer crashes and fewer deaths.

The Week Observed: Aug. 12, 2016

What City Observatory did this week

1. The national party platforms on transit. In November, most Americans will be choosing between a party whose platform offers the barest details and seemingly little understanding of urban transportation and a party whose platform is “more or less openly hostile” to it.

2. Marietta’s victory over affordable housing. Last year we wrote about this Atlanta suburb’s plan to banish the residents of 1,300 intact, market-rate, lower-rent apartments by spending $65 million to acquire the buildings and demolish them. The complexes had a 25 percent poverty rate and were inhabited by 80 percent people of color; as Google Street View shows, they’re now gone. The national media have remained silent.

Marietta_before after

3. The new sucking sound: offshoring taxes from intellectual property. Manufacturing jobs may be trickling back into the United States, but domestic companies built on ideas are happily keeping their profits outside the nation that nurtures them. That’s due to our international tax regime that rewards fictions like Apple’s assigning 92 percent of its profit to offshore work.

4. Rail projects need reality-based traffic projections too. Freeway partisans may be the worst offenders in claiming that ever-rising traffic requires unending expansion, but a federal judge ruled this week that Maryland’s Purple Line light rail also needs to revisit outdated ridership trends before moving ahead with its multibillion-dollar project in suburban DC.


The week’s must reads

1. What causes displacement in growing cities? The Sightline Institute describes the housing market in cities like Seattle as “a giant game of musical chairs”: “Even if the game-makers preserve certain chairs for certain people (‘reserved for veterans’ or ‘special-needs players only’) or stem the addition of expensive new chairs to the game (‘no McMansion chairs!’) or give money to some players to help them buy chairs (‘Section 8’), as long as people outnumber homes, the game will still keep knocking players out. And it will always, always, always knock out the players with the least financial resources.”

2. Upzone, but don’t give it away for free. Hong Kong has a lesson for U.S. cities, says USC’s Richard Green: it sells newly developable land at auction and uses its proceeds for housing subsidies that help half the population afford to live in the city. Even in the face of tax limits like Prop 13, he says, cities like Los Angeles could use a similar system to sell air rights and invest the money in housing.

3. How local land use decisions are biased against non-residents. Who loses when cities block new housing? Most of all it’s people who work but don’t live there. Emily Badger uses a decision to block 4,000 new homes in Brisbane, California, just outside San Francisco, as a case for making more land-use decisions at the regional, state or national levels — as Japan has done, and as California Gov. Jerry Brown has been pushing to do. She pulls data showing this is a bigger issue in some cities than others:

population change

Image: Washington Post.

4. De Blasio’s housing plan stumbles. Months after agreeing in theory to a plan that let developments be larger as long as they included some below-market-rate units, New York City council members are turning against Mayor Bill de Blasio at the project-by-project level. “Many of us don’t want to just see rentals or just housing being built,” zoning subcommittee chair Donovan Richards tells Politico. “There is a need for parking, etc.”


New knowledge

1. Cities can survive auto congestion if they are dense enough. “On average, more jobs can be reached in a given amount of time via the congested streets of San Francisco than on the fast moving freeways and boulevards in the fringes of the region,” write Brian Taylor, Taner Osman, Trevor Thomas and Andrew Mondschein in a Caltrans-backed, peer-reviewed critique of the Texas Transportation Institute’s “cost of congestion” measurements. It’s not that congestion is good or shouldn’t be mitigated, they find, but rather that access to destinations is generally more important than speed.

San Francisco: neither immobilized nor unproductive. Photo: Michael Andersen.

2. Tech growth boosts low-wage workers, too. Many cities want a local high-tech industry. But when such efforts succeed, do residents with different skills benefit? For a new study in Annals of the American Association of Geographers, Neil Lee and Andres Rodriguez-Pose of the London School of Economics examine the spillover effects of high tech job growth on the low and moderate income population of a metropolitan area. Data on 295 metro areas for the period 2005 through 2011 (a time dominated by the Great Recession) suggests that tech jobs had a positive impact on the employment and earning prospects of low wage workers, but weren’t enough by themselves to have a major impact on poverty rates.

3. Improving communication about housing shortages. Views and perspectives of single-family homeowners are “dominant” in media coverage of Seattle housing issues, according to a 14-page “media audit” by the Sightline Institute, leading to perceptions that they are “a stand-in for ‘public’ or ‘voters’ and given special deference.” Sightline offers suggestions for housing advocates to better frame the situation — for example, by avoiding the words “supply and demand.”


The Week Observed is City Observatory’s weekly newsletter. Every Friday, we give you a quick review of the most important articles, blog posts, and scholarly research on American cities.

Our goal is to help you keep up with—and participate in—the ongoing debate about how to create prosperous, equitable, and livable cities, without having to wade through the hundreds of thousands of words produced on the subject every week by yourself.

If you have ideas for making The Week Observed better, we’d love to hear them! Let us know at jcortright@cityobservatory.org or on Twitter at @cityobs.

Back to school: Three charts that make the case for cities

It’s early September, and most of the nation’s students are (or shortly will be) back in the classroom. There may be a few key academic insights that are no longer top of mind thanks to the distractions of summer, so as good teachers know, now is a good time for a quick refresher–something that hits the highlights, and reminds us of the lesson plan.  So it is today, with City Observatory.

Pay attention class (Flickr: Jeff Warren)

There’s a growing tide of data illustrating the economic importance of vibrant urban centers. Here are three charts we’ve collected in the past year that underscore the importance of city centers, walkability and transit access—some of the critical factors behind city success.

Chart 1: Walkability drives commercial land values

Real Capital Analytics tracks commercial real estate values in cities across the United States. Like many of us, they’ve noticed the growing importance that businesses place on being located in walkable areas—because that’s where their customers and workers increasingly want to be. And the desirability of walkable areas gets directly reflected in land values. RCA constructed a price index for US commercial real estate that compares  how values are growing in highly walkable areas compared to car-dependent ones. No surprises here: over the past 15 years commercial real estate located in the most walkable areas has dramatically outperformed less walkable areas.

RCA_Walkability

RCA uses a repeat sales index to track changes in property values over time. Their data show that not only have property values in highly walkable central business district locations fully recovered since the 2008 recession, they’ve gained more than 30 percent over their previous peak. Meanwhile, commercial property values in car-dependent suburbs languish at pre-recession levels. As we’ve noted at City Observatory, the growing disparity between central and suburban property values is a kind of “Dow of Cities” that illustrates the economic importance of centrality.

Chart 2: Transit access boosts property values

In addition to walkability, another aspect of great urban spaces—transit accessibility—is also a strong predictor of property values. The Center for Neighborhood Technology looked at trends in residential real estate values in Boston, Chicago, Minneapolis-St. Paul, Phoenix, and San Francisco between 2006 and 2011, and found that property values in transit-served locations dramatically outperformed those in places with limited transit. They found strong evidence to support the view that “access to transit” is the new real estate mantra. Over this five-year period, transit-served locations outperformed the typical property in their region by about 40 percent, while property values in non-transit-served areas underperformed the regional average.

CNT_TransitShed

Chart 3: People are increasingly moving to urban centers

Luke Juday at the University of Virginia’s Demographics Research Group has done a terrific job of compiling Census data to map the relationship between population trends and centrality—how close to the central business district people live. Juday’s work can show whether specific population groups are, in the aggregate, moving closer to the urban center, or decentralizing. His interactive charts show data for the top 50 metropolitan areas, and clearly illustrate the centralizing trend that characterizes well-educated young workers—something that we’ve explored in our reports on the Young and Restless. For example, consider this chart of the location of 22 to 34 year olds by distance from the central business district in 2000 and 2012.

[Chart: 22 to 34 year olds by distance from the central business district, 50 largest metros, 2000 and 2012]

The data in this chart are a composite of the 50 largest metropolitan areas in the U.S. In 2012, the fraction of the population within a mile of the center of the central business district (the darker line) in this key young adult demographic approached 30 percent, a substantial increase from 2000 (the lighter line). Meanwhile, the share of the population in more outlying areas declined. This is powerful evidence of the growing preference of young adults for urban living.

At City Observatory, we’re data-driven. These three charts, taken together with four others we highlighted earlier, make a strong case for the growing economic importance of cities. Walkability, transit access and the movement to city centers are big economic drivers. That’s the lesson that all of us–students and urban leaders alike–need to be keeping in mind.

 

The Summer Driving Season & The High Price of Cheap Gas

Cheaper gas comes at a high price: More driving, more dying, more pollution.

We’re at the peak of the summer driving season, and millions of Americans will be on the road. While gas prices are down from the highs of just a few years ago, there’s still a significant price to be paid.

Vacation Traffic (Flickr: Lunavorax)

As the Frontier Group’s Tony Dutzik noted, earlier this month marked the 103rd consecutive week in which gasoline prices were lower than they were in the same week a year previously.  Two years ago, the price of gas averaged about $3.75 per gallon. Last week, according to the US Department of Energy, it stood at just under $2.40.

While cheaper gas has been a short run tonic for the economy–lower gas bills mean consumers have more money to spend on other things–the lower price of gas has provoked predictable behavior changes.

We’re driving more, reversing a decade-long trend in which Americans drove less. Ever since the price of gas went from less than $2 a gallon in 2002 to $4 a gallon in 2008, Americans have been driving less and less every year. Vehicle miles traveled per person per day peaked in 2004 at 26.7, and declined steadily through 2013. But in 2014, with the plunge in gas prices, driving started going back up.

 

Price matters, but driving still exhibits a relatively low price elasticity. The 39 percent decline in gas prices over the past two years has (so far) produced an increase in driving of about 4 percent.
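
The implied elasticity can be checked with the round numbers cited above (a back-of-the-envelope point elasticity, not a formal estimate):

```python
def price_elasticity(pct_change_quantity, pct_change_price):
    """Point elasticity: percent change in quantity per percent change in price."""
    return pct_change_quantity / pct_change_price

# Driving up about 4 percent after a 39 percent drop in gas prices:
e = price_elasticity(4.0, -39.0)
print(round(e, 2))  # about -0.1, i.e. quite inelastic
```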

We’re dying more

Earlier this month the National Highway Traffic Safety Administration reported that highway fatalities rose nearly 8 percent in the past year. While there’s a lot of speculation that distracted driving may contribute to many crashes–though there’s little evidence it’s associated with the uptick in fatalities–it’s very clear that more driving is the biggest risk factor in producing more crashes, and more deaths. There’s also some statistical evidence that cheaper gas actually facilitates more driving by more crash-prone drivers, which is consistent with a rise in fatalities that is greater than the increase in miles driven.

We’re using more energy and polluting more

More driving means more energy consumption, and more pollution as well. Not only are we driving more miles–which burns more fuel–but we’re also buying less efficient vehicles. According to Michael Sivak at the University of Michigan, the sales-weighted average fuel economy of new cars purchased in the US has declined over the past two years from 25.7 miles per gallon to 25.3 miles per gallon.

Each gallon of gasoline burned generates about 20 pounds of carbon emissions, so the increase in driving also means more greenhouse gas emissions.
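
Combining the 20-pound figure with the fuel economy decline above gives a rough sense of scale (the 12,000-mile annual mileage is a hypothetical, typical-driver assumption):

```python
LBS_CO2_PER_GALLON = 20  # per gallon of gasoline burned, as cited above

def annual_co2_lbs(miles, mpg):
    """Pounds of CO2 emitted per year at a given annual mileage and fuel economy."""
    return miles / mpg * LBS_CO2_PER_GALLON

# A hypothetical 12,000-mile-per-year driver at the two fleet averages above:
extra = annual_co2_lbs(12_000, 25.3) - annual_co2_lbs(12_000, 25.7)
print(round(extra), "extra pounds of CO2 per year")
```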

The bottom line is that prices matter, and many of the key attributes of driving are under-priced. Vehicles don’t pay for the pollution they emit, for their contribution to climate change, or even for the cost of wear and tear on roads. If the price of driving more accurately reflected the costs it imposes on everyone, there’d be less congestion, less pollution, fewer traffic deaths, and we’d save money. The fluctuations in gas prices over the past few years are powerful economic evidence of how this works. It’s a lesson we’ve paid for, so it would be good if we learned from it.

The Week Observed: Aug. 5, 2016

What City Observatory did this week

1. The case for more Ubers. From mobile phones to microchips, it’s clear that even mega-companies must act in consumer interest when competition forces them to. When Uber and Lyft can pull out of Austin in response to new regulations, that’s a sign that they’re not facing enough healthy competition, and cities should be focusing on encouraging their potential rivals.

Lyft at work. Photo: Raldo (Flickr)

2. How cities can diversify their ride-hailing services. Following up on Monday’s post, we offered six suggestions for encouraging more Ubers. Throughout the local regulatory process, cities should be asking: will this help or hinder future competitors in this market?

3. Segregation and the racial wealth gap. The more that white and black Americans’ homes are separated from one another within a city they share, the greater the difference between the average incomes of the people in each racial group. Our analysis echoes similar academic findings, but it’s not clear whether segregation worsens inequality or the other way around.

4. Things are definitely looking up for Detroit. We compiled various indicators about the site of the country’s most famous urban economic collapse to show that with five straight years of job growth, Detroit seems to be on a very solid rebound. Though the city is still adding jobs more slowly than the national average or than its own suburbs, Detroit’s 1.4 percent annual job growth marks a clear turnaround from before the Great Recession, when the nation was adding jobs as Detroit was losing them.


The week’s must reads

1. Is the recent drop in homeownership good? Interesting question. While owning their homes has helped many middle class families build wealth, three of the five people in this NYT exchange say promoting homeownership might be counterproductive, especially for poorer households. Harvard’s Ed Glaeser says mortgage subsidies “stack the deck against America’s cities.” Author A. Mechele Dickerson points out that low-income homeowners tend to buy in neighborhoods with poor appreciation prospects and that rental affordability is a bigger problem. Economist Dean Baker flags the average home’s $25,000 transaction cost and says the state should be working to boost incomes, not subsidizing relatively wealthy homeowners.

2. Uber: steering toward monopoly? Uber’s decision to essentially pull out of China was major business news this week. The company sold its operations there to its Chinese competitor Didi Chuxing. Bloomberg’s Justin Fox floated a theory: Uber is getting tired of competition, so it’s focusing its money and effort on dominating markets where it faces smaller rivals. As we suggested this week, cities should care a great deal about whether their local market for ride-sharing and other transportation services is monopolistic or competitive.

3. How Tokyo killed real estate inflation. Japan’s capital city has been booming just like New York and London — but unlike them, it’s kept housing prices almost totally stable. How? With national land-use laws that prevent neighbors from blocking nearby development, writes Robin Harding. The cost is a city that is often “spectacularly ugly,” but where “the ugliness is shared by rich and poor alike.” (See also this commentary, which attributes the situation to land reforms in the 1990s.)


New knowledge

1. Local institutions drive civic success. Does wealth come from institutions like fair judges and honest government, or from natural resources like coal and topsoil? It’s the urban economist’s version of the nature/nurture debate. In a new paper from Utrecht University, Tobias Ketterer and Andrés Rodríguez-Pose use economic data from 184 sub-national regions in Europe to test the contribution of each factor. They find that while resource endowments play a role, “low corruption and government accountability are crucial factors behind regional economic growth.”

2. Planning builds functional cities. Yes, we’ve all heard: most of the world’s population now lives in cities. But especially in the developing world, how we build cities (compact and efficient, or riven by underdeveloped slums) makes a huge difference to their economic success and ecological footprint. Vernon Henderson, Anthony Venables and two co-authors explore data for Nairobi, Kenya, in a paper from the London School of Economics. They find that institutional problems–lack of clear land titles, corruption, and poor urban planning–lead to the development and persistence of low-density, poor-quality housing in slums, which leads to inefficient use of valuable land at the urban center. Effective institutions are particularly important early on, because development patterns are hard to change once established.

3. Old vs. young on pension liabilities. In a growing number of cities, paying the costs of municipal pensions is a big financial challenge. Pension finance involves important intergenerational effects: current taxpayers are paying employees who provided services decades ago. A new research paper from the Philadelphia Federal Reserve presents an interesting twist: cities with fewer under-55 homeowners are less likely to fund public pensions, possibly because it’s young homeowners who most fear having to face the full costs later.


The Week Observed is City Observatory’s weekly newsletter. Every Friday, we give you a quick review of the most important articles, blog posts, and scholarly research on American cities.

Our goal is to help you keep up with—and participate in—the ongoing debate about how to create prosperous, equitable, and livable cities, without having to wade through the hundreds of thousands of words produced on the subject every week by yourself.

If you have ideas for making The Week Observed better, we’d love to hear them! Let us know at jcortright@cityobservatory.org or on Twitter at @cityobs.

Reversed Polarity: Bay Area venture capital trends

The greater San Francisco Bay area has been a hotbed of economic activity and technological change for decades, bringing us ground-breaking tech companies from Hewlett-Packard and Intel, to Apple and Google, to AirBNB and Uber. It’s a great place to spot trends that are likely to spread elsewhere. One such trend is the growing tendency of new technology startups to locate in cities. Today we explore some new data on venture capital investment that are indicative of this trend.

Credit: David Yu, Flickr

As we noted in July, there’s always been a dynamic tension between the older, established city of San Francisco in the north, and the new, upstart, tech-driven city of San Jose in the south.  From the 1950s onward, San Jose and the suburban cities of Santa Clara county grew rapidly as the tech industry expanded. Eventually the area was re-christened Silicon Valley.

One of the hallmarks of the Valley’s growth was the invention and explosion of the venture capital industry: technology-savvy, high-risk investors who would make big bets on nascent technology companies with the hopes of growing them into large and profitable enterprises. Sand Hill Road in Menlo Park became synonymous with the cluster of venture capital that financed hundreds of tech firms.

Silicon Valley’s dominance of the technology startup world was clearly illustrated each year with the publication of the dollar value of venture capital investments by the National Venture Capital Association and PriceWaterhouseCoopers MoneyTree.  Silicon Valley startups would frequently account for a third or more of all the venture capital investment in the United States.  But since the Great Recession that pattern has changed dramatically.

By 2010, according to data gathered by the NVCA, the San Francisco metro area had pulled ahead of Silicon Valley in venture capital investment.  In the past two years (2014 and 2015) venture capital investment in San Francisco has dwarfed VC investment in Silicon Valley.  In 2015, San Francisco firms received about $21 billion in venture capital investment compared to about $7 billion in Silicon Valley.

VC investment is important in its own right–because we are talking about billions of dollars, which gets spent on rent, salaries, and other purchases, initially at least in these local economies–but perhaps more importantly because venture capital investment is a leading indicator of future economic activity. While individual firms may fail, the flow of venture capital investment is indicative of the most productive locations for new technology driven businesses.

What these data signal is that it is an urban location–San Francisco–that is now pulling well ahead of Silicon Valley, which is still mostly characterized by a suburban office park model of development. Some of this may have to do with the kind of firms that are drawing investment. Much of the current round of VC investment is going to software and web-related firms, not the kinds of semiconductor-driven hardware firms that have been Silicon Valley’s superstars in the past. But unlike the 1970s and 1980s, when technology was a decidedly suburban activity, focused primarily in low density “nerdistans,” today it’s the case that new technology enterprises are disproportionately found in cities. And today, companies are increasingly choosing to locate their operations in more urban neighborhoods and more walkable suburbs.

What’s driving firms to cities is the fact that the workers they want to hire–well educated young adults in their twenties and thirties–increasingly want to live in dense, walkable urban environments like San Francisco, and not the sprawling suburbs of Silicon Valley.  Further evidence of this trend is, of course, the famous “Google buses” that  pick up workers in high-demand neighborhoods in San Francisco and ferry them, in air-conditioned, wifi-enabled comfort, to prosaic suburban office campuses 30 or 40 miles south.

The movement of workers, investment, and new startup firms to San Francisco is another indicator of the growing strength of cities in shaping economic growth. And as has been the case over the past half-century or more, trends that start in the Bay Area tend to ripple through the rest of the country. What we see here is a shift in economic polarity, from the suburban-led growth of the past, to more city-led growth. That’s one of the reasons we think the reversal of the long process of job decentralization has just begun.

Finally, some background about geography. The Bay Area has several major cities, including San Francisco, Oakland and San Jose. To the Office of Management and Budget, which draws the boundaries of the nation’s metropolitan areas after each decennial census, the three cities were all part of a single metropolitan area up through the classification created following Census 2000. In 2010, with new data, and slightly different rules for delineating metro areas, San Jose was split off into its own separate metropolitan area, consisting of Santa Clara and San Benito counties at the south end of San Francisco Bay. (If it hadn’t been hived off into its own metro area, the name of the larger metropolis would have been the “San Jose-San Francisco-Oakland” metropolitan area, inasmuch as the city of San Jose’s population had passed that of San Francisco.)

 

 

 

The triumph of the City and the twilight of nerdistans

This is a story about the triumph of the City—not “the city” that Ed Glaeser has written about in sweeping global and historic terms—but the triumph of a particular city: San Francisco.

For decades, the San Francisco Bay Area’s economy has been a microcosm and a hot house for studying the interplay between innovation, economic prosperity, urban form and social impacts. It gave us the quintessential model of technological geography, Silicon Valley. And today, it’s showing us how that geography is changing—and shifting towards cities.

As a graduate student at the University of California, Berkeley, more than three decades ago, one of the first things I learned about living in the Bay Area was that the large city between us and the Pacific Ocean was not “San Fran” nor “SF,” and especially not “Frisco.” San Francisco was simply “the City.”

In the late ’70s and early ’80s, San Francisco was the queen of her little geographic universe, the center of arts, culture, and commerce in Northern California. That was the heyday of San Francisco Chronicle columnist Herb Caen, the martyred Harvey Milk, and George Moscone in City Hall. In the wake of Prop. 13, California’s voter-adopted property tax limitation measure, there was a lot of political unrest that led to, among other things, rent control in the City.

Down Highway 101, there was Silicon Valley—or, to those in the Bay Area, simply “the Valley.” Santa Clara County, on the peninsula south of San Francisco, was long regarded as an agricultural hinterland—much as the Central Valley or Salinas are thought of today. The Stanford University campus, the South Bay’s major intellectual center, was (and still is) nicknamed “the Farm”; the area was historically famous for its fruit orchards. But all that changed. San Jose and its surrounding communities grew steadily in the 1960s, 1970s, and 1980s to become the economic hotbed of the region. The personal computer was essentially invented in Silicon Valley garages. Hewlett-Packard, Intel, and Apple all got their start in The Valley. Cities and states across the nation and the world set about trying to replicate what they perceived to be the elements of Silicon Valley’s success: research universities, science parks, technology transfer offices, entrepreneurship programs, and venture capital investment. But no matter how many emulators emerged, Silicon Valley remained the dominant epicenter of new technology firms in the U.S.

As the Valley grew, the City seemed quaint and dowdy by comparison. In the 1990s, it lost some of its corporate crown jewels, as Bank of America decamped its headquarters to—shudder—North Carolina. Sure, the City had its counter-cultural cred with the Jefferson Airplane and, later, Dead Kennedys and others, but the Valley was where the work got done.

The technology wave, particularly the personal computer and the Internet, seemed to bypass San Francisco: the big new firms, the Ciscos, the Oracles, the Googles, got their start in Silicon Valley and grew there. Measured by gross domestic product per capita, San Jose blew by San Francisco in the 1990s, and never looked back. It was, as Joel Kotkin famously argued, the victory of the suburban nerdistans. Engineers and businesspeople wanted to live in split-level houses on large lots in suburbs and drive, alone, to work each day. While Kotkin admitted that some creative types might gravitate toward Richard Florida’s boho cities, he predicted that most job growth would be in sensible suburbs:

“Today’s most rapidly expanding economic regions remain those that reflect the values and cultural preferences of the nerdish culture — as epitomized by the technology-dominated, culturally undernourished environs of Silicon Valley. In the coming decade, we are likely to see the continued migration of traditional high-tech firms to new nerdistans in places like Orange County, Calif., north Dallas, Northern Virginia, Raleigh-Durham and around Redmond, Wash., home base for Microsoft.”

But for the past decade or so, and most notably since the end of the Great Recession, a funny thing has happened. Tech has been growing faster in the City than in the Valley. Lots of new firms working on new Internet technology plays—the Ubers, the AirBnBs, the SalesForces—started up in San Francisco and grew there. At the same time, more and more young tech workers, not unlike the young workers nationally, had a growing preference for urban living. The City is a lot more urbane than the Valley. As Richard Florida has chronicled, venture capital investment, perhaps the best leading indicator of future technology growth, has shifted from the suburbs to the cities—nowhere more strikingly than in the San Francisco Bay Area.

And so, to accommodate the needs and desires of their most precious input—the human capital of their workers—Silicon Valley companies started running their own subsidized, point-to-point transit services. The “Google buses” pick up workers in high-demand neighborhoods in San Francisco and ferry them, in air-conditioned, wifi-enabled comfort, to prosaic suburban office campuses 30 or 40 miles south. These buses became the flashpoint for protests about the changing demographics and economic wave sweeping over the city, as Kim-Mai Cutler explained in her epic TechCrunch essay, “How Burrowing Owls Led to Vomiting Anarchists.” In the past 12 years, the number of workers commuting from San Francisco to jobs in Santa Clara County has increased by 50 percent, according to data from the Census Bureau’s Longitudinal Employer-Household Dynamics data series.

Those trends came to their logical culmination this week. The San Francisco Business Times reported that Facebook, now headquartered in the Valley’s Menlo Park, is exploring the construction of a major office complex in San Francisco. According to the Times’ reporting, the company’s decision is driven by the growing desire of its workers to live in urban environments. Additionally, Facebook has faced competition and poaching for talent from San Francisco-based companies, including Uber.

Facebook’s interest in a San Francisco office is just one harbinger of the northward movement of the tech industry. Apple, which has famously insisted that its employees work in its campus in Cupertino, has recently leased office space in San Francisco’s SoMa neighborhood. Google now has an estimated 2,500 employees in San Francisco, and has purchased and leased property in the city’s financial district.

The miserable commute to Silicon Valley from San Francisco means that busy tech workers find it more desirable to work closer to where they live. Paradoxically, as Kim-Mai Cutler warned, the protests and obstacles to Google and other tech buses are prompting tech companies to expand their operations in The City, which brings in even more tech workers to bid up the price of housing there. As she tweeted on July 25:

[Image: Kim-Mai Cutler’s July 25 tweet]

As we’ve chronicled at City Observatory, jobs are moving back into city centers around the country, reversing a decades-long trend of employment decentralization. Companies as diverse as McDonalds, which is relocating from suburban Oak Brook to downtown Chicago, and GE, which will move from a suburban Connecticut campus to downtown Boston, all cite the strong desire to access talented workers. Those workers are increasingly choosing to live in cities. While we view the resurgence of city center economies as a positive development, it also poses important challenges, especially concerning housing supply and affordability. For economic and equity reasons, it is critical that we tackle the nation’s growing shortage of cities.

Our apologies to Ed Glaeser for borrowing the title of his excellent book, Triumph of the City: How Our Greatest Invention Makes Us Richer, Smarter, Greener, Healthier, and Happier, for this commentary. We’re deeply indebted to Dr. Glaeser for outlining many of the forces at work in America’s cities, including agglomeration economies and the theory of the consumer city. These are chief among the explanations for the recent triumph of San Francisco over Silicon Valley.

Let a thousand Ubers bloom

Why cities should promote robust competition in ride sharing markets

We’re in the midst of an unfolding revolution in transportation technology, thanks to the advent of transportation network companies. By harnessing cheap and ubiquitous communication technology, Uber and other firms organizing what they call “ride sharing” services have not only disrupted the taxi business, but are starting to change the way we think about transportation.  While we think of disruption here as being primarily driven by new technology, the kinds of institutional arrangements–laws and regulations–that govern transportation will profoundly determine what gains are realized, and who wins or loses.

Many thousands of Irises (Flickr: Oregon Department of Agriculture)

Right now, Uber has an estimated market value (judging by what recent investors have paid for their stake in the company) of nearly $70 billion. That’s a whopping number, larger in fact than carmakers like Ford and GM. It’s an especially high valuation for a company that has neither turned a profit nor gone public, thus subjecting its financial results to more outside scrutiny. Uber’s generous valuation has to be based on the expectation that it’s going to be a very, very large and profitable firm, and that it will be as dominant in its market as other famous tech firms–like Microsoft or Google–have been.

The importance of competition

For a moment, it’s worth thinking about the critical role of competition in shaping technology adoption and maximizing consumer value. Take the rapidly changing cell phone industry, which has increasingly replaced the old wire-line telephony of the pre-digital era. Back in the day, phone service—especially local phone service—was a regulated monopoly. It barely changed for decades—the two biggest innovations were princess phones (don’t ask) and touch-tone dialing.

But when the Federal Communications Commission auctioned off wireless radio spectrum for cellular communications it did so in a way that assured that there would be multiple, competing operators in each market. Though there’s been some industry consolidation, critical antitrust decisions made in the past few years have kept four major players (AT&T, Verizon, Sprint, and T-Mobile) very much in the game. T-Mobile has acted as the wildcard, disrupting industry pricing and service practices and prompting steady declines in consumer voice and data costs. In the absence of multiple competitors, it’s unlikely that a cozy duopoly or even triopoly would have driven costs down.

Or consider the case of Intel, which because of a quirk of US Defense Department requirements was obligated to offer “second source” licenses for some of its key microprocessor technologies to rival Advanced Micro Devices (AMD). Second-sourcing required Intel to share some of its intellectual property with a rival firm so that the military would have multiple and redundant sources of essential technologies. This kept AMD in the market as a “fast follower” and prompted Intel to continuously improve the speed and capability of its microprocessors.

How much does being first count for?

Uber’s first-mover advantages and market share arguably give it a market edge; drivers want to work for Uber because it has the largest customer base, and customers prefer Uber because it has more drivers. More cars mean shorter waits for customers, which attracts more customers to Uber and therefore generates more income for its drivers. This positive feedback loop can help drive up market share for Uber at the expense of its competitors. Whether this happens depends on two things: how powerful these network effects are, and whether effective competitors emerge.
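
The feedback loop can be illustrated with a toy Pólya-urn simulation (purely hypothetical parameters, not a model of the actual ride-hailing market): each new driver joins a platform with probability proportional to its current size, so a small early lead tends to compound.

```python
import random

def simulate_shares(steps=10_000, head_start=5, seed=1):
    """Toy network-effects model: platform A starts with a small head
    start, and each new driver joins in proportion to current size."""
    rng = random.Random(seed)
    a, b = head_start, 1
    for _ in range(steps):
        if rng.random() < a / (a + b):
            a += 1
        else:
            b += 1
    return a / (a + b)

share = simulate_shares()
print(f"Platform A ends with {share:.0%} of drivers")
```

Rerunning with different seeds and head starts shows the catch: an early lead usually, but not always, locks in—which is exactly the open question about how strong these network effects really are.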

Some economists think that these network externalities tend inevitably to lead to winner-take-all markets, and that once established, dominant market positions are difficult or impossible to overcome. That’s a major factor behind Uber’s high valuation: investors think the company will continue to have a dominant position in the industry and will eventually reap high profits as a result.

Antitrust is a live issue with Uber. The company has famously disclaimed that Uber drivers are its employees—asserting, instead, that they are “independent contractors”—businesses separate from Uber. But this has led some to argue that Uber is collaborating with its drivers to fix prices, which may constitute a violation of antitrust laws. The argument is that the Uber app—which presents all customers with the same rate and gives supposedly independent drivers no opportunity to offer different prices (and no opportunity for customers to bargain)—represents technology-enabled price fixing. This may especially be a problem for “surge pricing,” when every driver effectively raises her price at the same time (something that would be impossible to accomplish absent the technology).

But others question whether the market power afforded by these network externalities extends beyond local markets. Bloomberg View’s Justin Fox argues that the scope of network effects probably doesn’t exceed metropolitan markets. Except possibly for business travelers or tourists, Uber’s market share in some faraway city is of little importance to travelers in Peoria. This may increasingly become true as these services become more widespread—it’s still the case that 85 percent of Americans have never used Uber or Lyft, and the 15 percent who have are wealthier, better educated, and probably less price sensitive than those who haven’t used these services yet.

A policy shock in Austin

The big news in the ride sharing business this year was a referendum in Austin on the city’s proposed requirement that all contract drivers be fingerprinted. Uber and Lyft went to the political mat, spending $8.4 million on a campaign to defeat the requirement (the most expensive local political campaign in Austin history by far). The centerpiece of their campaign was a threat to pull out of Austin if the requirement took effect. They lost, 56 to 44 percent, and both have followed through on their threat. But in their wake, a number of smaller companies and startups have stepped into the gap. (It’s hard to think of a place that is more entrepreneurial and tech savvy; other communities might not have seen such a response.) According to the Texas Tribune, there are now a half dozen companies offering ride sharing services, with a range of pricing, technology, and business models. The lucrative New York market has also attracted a new entrant, Juno. It aims to attract Uber and Lyft’s highest rated drivers by offering them a chance to own equity in the firm. Nothing guarantees that any of these competitors will survive. Already, Uber’s largest domestic rival, Lyft, has put itself up for sale.

In the long run, the social benefits of a new technology will depend, in large part, on whether the technology is controlled by a monopolist, or is subject to dynamic competition. New evidence suggests that the economic harm of monopolies may be much larger than previously recognized, and that a key method monopolists use to earn high profits (or “economic rents”) is to try to shape the rules of the game to their advantage. In a recent research paper published by the Federal Reserve Bank of Minneapolis, James Schmitz writes:

. . . monopolists typically increase prices by using political machinery to limit the output of competing products—usually by blocking low-cost substitutes. By limiting supply of these competing products, the monopolist drives up demand for its own.

There’s nothing foreordained about what shape the marketplace for transportation network companies or ride sharing will look like. There could be one dominant firm—Uber—or many competing firms. It’s actually very much in the interests of cities to encourage a large number of rivals. Economically, competition is likely to be good for consumers and for innovation. Having lots of different firms offer service—and also compete for drivers—is likely to drive down the share of revenue that goes to these digital intermediaries. And as the experience of Austin shows, having just one or two principal providers of ride sharing services means that they can credibly threaten to pull out of a market, and thereby shape public policy. With more competitors, such threats are less credible and effective, as pulling out would usually just mean conceding the market to those who remain.

As municipal governments (and in some cases, states) look to re-think the institutional and regulatory framework that guides transportation network companies and taxis, they should put a premium on rules and conditions that are competition-friendly, and that make it particularly easy for new entrants to emerge. An open, competitive marketplace for these services is more likely to promote experimentation, provide better deals and services for customers, and give communities an equal voice to that of companies in shaping what our future transportation systems look like.

Mystery in the Bookstore

Signs of a rebound in independent bookstores, but not in the statistics

Lately, there's been a spate of stories pointing to a minor renaissance of the independent American bookstore. After decades of glum news and closings, there are more and more instances of independent bookstores opening or expanding. The American Booksellers Association points with pride to a seven-year string of increases in its dues-paying members. Articles in the New York Times ("Indie Bookstores are back with a passion") and US News ("Indie Bookstores remain on a roll") offer firsthand accounts of successful firms.

The independent bookstore is an American icon. It's hard to picture a city–the classic Main Street–without a local bookstore. Bookstores are one of the categories of customer-facing retail and service businesses we've used at City Observatory to create our "storefront index" which measures urban walkability. Founding father Ben Franklin was famously a self-taught intellectual who ran a book shop in Philadelphia. The indie bookseller figures prominently in pop culture, from Meg Ryan's Shop Around the Corner bookstore owner in You've Got Mail, to a host of other films and television. In The Big Sleep, Humphrey Bogart's Philip Marlowe takes refuge in Dorothy Malone's Acme Bookshop while staking out a suspect.

More recently, Portlandia has featured ardently feminist booksellers Candace and Toni, the proprietors of Women and Women First Bookstore.

 

For a long time, what with the growth of on-line retailer Amazon (which built its business model selling books at a discount) and with the advent of big-box retailing, it seemed like the small independent bookseller was a doomed anachronism. But in the past few years, there's been a surprising rebound in local bookselling. It turns out that many readers still prefer the printed page, and gladly patronize a knowledgeable and attentive local business. And the surviving and thriving local booksellers have changed their business models to emphasize personal service, community, and on-site experiences that larger and virtual competitors have a hard time matching. But while some stores are flourishing, others are floundering: in Memphis, the Booksellers at Laurelwood, one of three remaining city bookstores, is closing this month. In Detroit, the city's oldest–Big Bookstore in Midtown–is closing after eight decades. In St. Louis, it's the half-century-old Webster Groves bookshop that's closing.

One final sign that a shift back to bricks and mortar bookselling is in the cards: even Amazon is opening its own physical stores.

Government data tell a different story

With such upbeat stories in the popular press, we decided to take a quick look at Census data on the number and geography of bookstores, to see if we could corroborate and quantify these trends. We looked at two key data series: the annual County Business Patterns series, tabulated by the Census Bureau using payroll tax records, and the once-every-five-years Economic Census, which surveys the nation's businesses about sales, wages, and business operations. We focus on the government definition of bookstores, NAICS 451211. This statistical category includes all kinds of bookstores, from the large national chains to small, independent businesses, as well as college bookstores, and those that are adjuncts to museums.

According to the Economic Census, the number of bookstores in the US has fallen from 12,363 in 1997 to 7,176 in 2012–a loss of more than 5,000 establishments.  That pattern is also reflected more recently in the data reported as part of the County Business Patterns series. These data show the number of bookstores declining by about 30 percent since 2008, from 9,700 to about 6,900.

 

So here’s our mystery: While there’s been a visible resurgence in bookstores in some locations, the bigger pattern of change remains downward.

We're not sure what the answer is to this mystery. There are some of the usual suspects to consider. First, it's likely that many of the bookstores that are closing are the big national chains, like Borders, Barnes and Noble and Waldenbooks. In markets where these larger national stores are closing, it may be creating more market space for independent operators to thrive and even occasionally expand. A second factor is that much of the decline in the number of establishments may be among very small bookstores in small towns and rural areas. These are the kinds of places where the threat from Amazon (lower prices, wider selection and convenience) would be greatest.

 

 

 

Growing e-commerce means less urban traffic

The takeaway:

  • Urban truck traffic is flat to declining, even as Internet commerce has exploded.
  • More e-commerce will result in greater efficiency and less urban traffic as delivery density increases.
  • We are likely overbuilt for freight infrastructure in an e-commerce era.
  • Time-series data on urban freight movements suffer from series breaks that make long-term trend comparisons unreliable.

Delivering packages and reducing urban traffic congestion! Credit: Jason Lawrence, Flickr

Over at the Brookings Institution, Adie Tomer has performed a significant public service by assembling several decades of US DOT data on vehicle miles traveled.  A significant weakness of US DOT’s website is that it mostly presents data a single year at a time, which makes it really difficult to observe and analyze trends in the data.

Tomer’s post plots the US DOT data on urban travel by passenger cars, unit-trucks, and combination trucks.  He points to the growth of e-commerce, and the recent entry of Jet.com–which aims to be a challenger to Amazon’s dominance of web-based retailing.  Tomer speculates that growing e-commerce will lead to more and more delivery trucks crowding urban streets.

He marshals several decades of data on urban truck VMT to claim that urban truck traffic is up an eye-popping 800 percent since 1966.

Another way to see trucking’s urban trajectory is to view aggregate growth since the 1960s. While urban vehicle miles traveled for both passenger cars and trucks grew steadily between 1966 and 1990—in both cases, far surpassing urban population growth—urban trucking absolutely exploded thereafter, reaching almost 800 percent growth until the Great Recession led to reduced demand. That pattern coincided almost perfectly with the rise of e-commerce and the use of digital communications to manage shipping for logistics firms like UPS and FedEx and major private shippers like Walmart.

The post concludes by warning us that we need to provide for additional infrastructure for urban freight movement.

With new companies like Jet and continued growth in stalwarts like Amazon, we should expect e-commerce and urban trucking to keep growing. Those patterns bring some significant implications at all levels of government.

On the transportation side, freight investment will need to be targeted at pinch points and bottlenecks. Those specific sites of congestion deliver disproportionate costs to shippers, which get passed along to consumers, and create supply chain uncertainty

But the problems of doing time-series analysis with DOT's VM-1 (vehicle miles traveled) data are not limited to the largely cosmetic problem of website layout. The more serious problem is the significant series breaks that underlie the published data. Over time, US DOT has had to make important changes to the way it defines urban and rural areas (as urban development has occurred) and has had to cope with changing data sources. And, to be sure, DOT has tried to improve the accuracy of its estimates over time. The cumulative result of these changes is that it is very difficult to make statistically valid statements about the change in truck traffic in cities. (We've spelled out our concerns about the series break in the freight data in a technical appendix, below.)

Urban truck travel actually peaked in 2008, and has mostly been declining, except for the past year.

Truck_VMT

In our view, we ought to heavily discount the published data, and not make comparisons that assume that the pre-2006 data are comparable to the post 2006 data.  If we look only at the post-2006 data, a very different picture emerges. For the past six years–a period for which we have apparently comparable estimates, which appear to be not significantly affected by re-definitions of urban and rural areas–there is little evidence that urban truck traffic is increasing.  If anything, the data suggest that it is flat to decreasing.

The alarmist implication of the “800% growth” statistic is that urban traffic will be significantly worsened by growing e-commerce sales.  For example, the Brookings data prompted bloggers at SSTI to write “Urban truck traffic has boomed alongside the rise in e-commerce. ” and to fret that “If the rapid growth in urban truck VMT is a result of increasing e-commerce deliveries, we are a long way from peak urban truck traffic.”

In our view, such fears are wildly overblown.  If anything they have the relationship between urban traffic patterns and e-commerce exactly backwards.  The evidence to date suggests that not only has the growth of e-commerce done nothing to fuel more urban truck trips, but on net, e-commerce coupled with package delivery is actually reducing total urban VMT, as it cuts into the number and length of shopping trips that people take in urban areas.

E-Commerce is increasing rapidly; Urban truck travel is flat

Urban_Freight

The period since 2007 coincides with the big increase in e-commerce in the U.S. From 2007 to 2013, Amazon's North American sales increased more than fivefold, from $8 billion to $44 billion. Between 2007 and 2013, total e-commerce revenues in the United States nearly doubled, from about $137 billion to about $261 billion, according to the U.S. Department of Commerce. But over this same time period, according to the US DOT data as tabulated by Brookings, truck traffic in urban areas actually declined. All the increase in e-commerce appears to have had no net effect on urban truck traffic.

Does an increase in package deliveries mean increased urban traffic?

It actually seems likely that increased deliveries will reduce urban traffic congestion, for two reasons. First, in many cases, ordering online substitutes for shopping trips. Customers who get goods delivered at home forgo personal car shopping trips. And because the typical UPS delivery truck makes 120 or so deliveries a day, each delivery truck may be responsible for dozens of fewer car-based shopping trips. At least one study suggests that the shift to e-commerce may reduce total VMT and carbon emissions. And transportation scholars have noted a significant decrease in shopping trips and time spent shopping.

But there's a second reason to welcome–and not fear–an expansion of e-commerce from a transportation perspective. The efficiency of urban trucks is driven by "delivery density"–basically, how closely spaced a truck's stops are. One of the industry's key efficiency metrics is "stops per mile." The more stops per mile, according to the Institute for Supply Management, the greater the efficiency and the lower the cost of delivery. As delivery volumes increase, delivery becomes progressively more efficient. In the last several years, thanks to increased volumes coupled with computerized routing algorithms, UPS has increased its number of stops per mile: stops increased by 3.6 percent, but miles traveled increased by only about half as much, 1.9 percent. UPS estimates that higher stops per mile saved an estimated 20 million vehicle miles of travel. Or consider the experience of the U.S. Postal Service: since 2008, it has increased the number of packages it delivers by 700 million per year (up 21 percent) while its delivery fleet has decreased by 10,000 vehicles (about 5 percent).
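The "stops per mile" logic above can be illustrated with a little arithmetic. This is a hypothetical sketch: the fleet totals are invented, and only the percentage changes (stops up 3.6 percent, miles up 1.9 percent) come from the UPS figures cited in the text.

```python
# Delivery density: the more stops a truck makes per mile, the less
# driving each individual package requires.

def stops_per_mile(stops: float, miles: float) -> float:
    """Delivery density: stops made per vehicle mile traveled."""
    return stops / miles

# Hypothetical base-year fleet totals (invented for illustration).
base_stops, base_miles = 1_000_000, 100_000

# Apply the growth rates cited in the text: stops +3.6%, miles +1.9%.
new_stops = base_stops * 1.036
new_miles = base_miles * 1.019

before = stops_per_mile(base_stops, base_miles)
after = stops_per_mile(new_stops, new_miles)

# Density rises even though total miles grew: each mile now serves
# more stops, so per-package driving falls.
print(f"stops/mile before: {before:.2f}, after: {after:.2f}")
```

The point of the sketch is that whenever stops grow faster than miles, delivery density improves, regardless of the absolute size of the fleet.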

As e-commerce and delivery volumes grow, stop density will increase and freight transport will become more efficient. Because Jet.com is a rival internet shopping site to Amazon.com, and not a trucking company, its growth means more packages and greater delivery density for UPS and FedEx, not another rival delivery service putting trucks on the street.

So, far from being a cause for worry about transportation system capacity–and inevitably, a stalking horse for highway expansion projects in urban areas–the growth of e-commerce should be seen as another force that is likely to reduce total vehicle miles of travel, both by households (as they substitute online shopping for car travel) and as greater delivery density improves the efficiency of urban freight delivery.

As David Levinson reports, data from detailed metropolitan level travel surveys and the national American Time Use Study show that time spent shopping  has declined by about a third in the past decade.    As Levinson concludes “. . . our 20th century retail infrastructure and supporting transportation system of roads and parking is overbuilt for the 21st century last-mile delivery problems in an era with growing internet shopping.”

So the next time you see one of those white or brown package delivery trucks, think about how many car-based shopping trips it's taking off the road.

Technical Appendix:  Urban Truck Data

We're strongly of the opinion that it's not appropriate to treat the pre-2006 and post-2007 truck freight data as a single series that represents the actual year-to-year growth in urban freight mileage. There are good reasons to treat this as a "series break" and look separately at the two series. The technical reasons behind this judgment are two-fold.

Series Break 1:  Urbanized area boundaries

Tony Dutzik explored this issue last year in a post for the Frontier Group.  Briefly, a number of rural roads were re-classified as urban roads (reflecting changes in development patterns over time).  This has the effect of biasing upwards later year estimates of urban VMT when compared to previous years.  Some part of the apparent increase in “urban” VMT over the past decade has been a result of reclassifying formerly “rural” traffic as “urban”–not more urban traffic.

Series Break 2:  Vehicle classifications

US DOT has used different data and different definitions to classify vehicles pre- and post-2007. Methodologically, USDOT migrated its vehicle classification system away from the now-discontinued Vehicle Inventory and Use Survey and substituted RL Polk data. As a result of this shift in methodology, the number of truck miles on urban roads jumped almost 50 percent in one year, from about 102 billion miles in 2006 to about 150 billion miles in 2007. In 2009, USDOT explained how it had changed its estimating procedures.

The data now on the website for 2000-2006 were estimated using a methodology developed in the late 1990s. FHWA recently developed a new methodology and used it for this year’s Highway Statistics. This methodology takes advantage of additional and improved information available beginning in 2007 when states were first required to report motorcycle data – before that time, the reporting was not mandatory and the data were missing for a few states. Also, the new methodology does not rely on data from the national vehicle inventory and use survey which provided critical data for the original methodology but was not collected in 2007 as planned.

In April 2011, FHWA recalculated the 2000-2008 data along with the 2009 data to estimate trends. However, after further review and consideration, the agency determined that it is more reliable to retain the original 2000-2006 estimates because the information available for those years does not fully meet the requirements of the new methodology. Thus, the original 2000-2006 estimates are now used, whereas the 2007-2009 data are still based on the new methodology.

The author gratefully acknowledges Adie Tomer’s willingness to share the Excel spreadsheets upon which his analysis was based.

Show Your Work: Getting DOT Traffic Forecasts Out of the Black Box

  • Traffic projections used to justify highway expansions are often wildly wrong
  • The recent Wisconsin court case doesn’t substitute better models, but it does require DOTs to show their data and assumptions instead of hiding them

Highway23

The road less traveled:  Wisconsin Highway 23

There’s a lot of high-fiving in the progressive transportation community about last month’s Wisconsin court decision that stopped a proposed highway widening project. The reason? The state Department of Transportation (DOT) used inadequate traffic projections to justify the project.

The plaintiffs in the case were in a celebratory mood. Steve Hiniker, Executive Director of 1000 Friends of Wisconsin, said "We have known for years that the state DOT has been using artificially high traffic forecasts to justify a number of highway expansion projects. Now a federal court has validated our claims." Over at CityLab, Eric Jaffe calls it a court-ordered vindication of the peak car argument: "How Wisconsin residents cried peak car and won."

But while the decision is hugely encouraging, it's important to understand that 1000 Friends of Wisconsin v. US DOT wasn't a conclusive win for better traffic projections — the case was actually decided on different, much narrower grounds.

The federal district court ruling is really a take down of the opaque “black box” approach most state DOTs use in transportation forecasting. The project in question was a 20-mile long widening–from two lanes to four–of a stretch of state highway 23 between Sheboygan and Fond du Lac.  The environmental group sued, charging that the Environmental Impact Statement prepared to justify the project and evaluate alternatives was based on faulty and outdated forecasts that overstated future traffic levels.

The court made it clear that it wasn't in the business of adjudicating competing claims about the reasonableness of models or modeling assumptions. And it didn't rule that 1000 Friends of Wisconsin's arguments about declining traffic or peak car or lower population projections trumped or invalidated Wisconsin DOT's modeling. What the court did do, however, was say that WisDOT failed to explain how its model worked in a way that the public (and the court) could understand. Essentially, the court ruled that Wisconsin DOT couldn't use a "black box" to generate its projections — instead it had to present its data, assumptions and methodology in such a way that the public and outsiders could see how the results were produced. Judge Lynn Adelman wrote:

“In short, a reader of the impact statement and the administrative record has no idea how WisDOT applied TAFIS and TDM to produce the actual traffic projections that appear in the impact statement.”  (page 12)

The court was unpersuaded by vague and repetitive blandishments offered in defense by the DOT about techniques and the mechanics of modeling methodology.  The court specifically found that the DOT staff failed to explain how they arrived at the projected traffic volumes that appear in the impact statement, which seem to conflict with the recent trend of declining traffic volumes. And it found that:

“. . .  the defendants repeated and elaborated on their general discussion of how TAFIS and TDM work and did not explain how those tools were applied to arrive at the specific traffic projections that appear in the impact statement.” (page 13).

It appears that the DOT’s position foundered over its inability to answer very basic questions about how a decline in population forecasts and a decline in recorded traffic levels squared with its modeling of future traffic levels.  The Wisconsin DOT didn’t explain to the court’s satisfaction why it was sticking with the same level of traffic predicted for 2035, when population growth rate forecasts–which were supposedly a key input to the model–were reduced by two-thirds.

As a legal matter, the court went out of its way to state that it wasn’t about to second guess the methodology and assumptions chosen by the state DOT.  Here the court ruled, as other courts have, that unless the methodology is “irrational,” it’s not in violation of the National Environmental Policy Act (NEPA).

While it falls short of  a legal vindication of the “peak car” argument, requiring DOTs to open up their “black box” forecasts is still likely to be a devastatingly important ruling.  Official DOT traffic forecasts are frequently presented as the product of a special kind of technical alchemy.  While model results are clothed with the illusion of precision (“this road will carry 184,200 cars in 2035”), there’s really much, much more ambiguity in the results.  To pass muster under NEPA, the process used for calculating future traffic levels will now likely be laid bare.

Those who’ve worked with traffic models know that they’re clumsier, clunkier and more malleable than the precise, hyper-technical image that traffic engineers (or politically appointed transportation agency officials) typically paint of them in the introductions to environmental impact statements.  The numerical outputs from computer simulations, for example, are often subjected to “post-processing” — the preferred euphemism for manually changing predicted traffic levels based on the judgment of the modeler (or the desires of the modeler’s client.)

And there’s lots of room for manipulation. In his book, “Toll Road Traffic and Revenue Forecasts” Rob Bain, a pre-eminent international expert on traffic forecasting, lists 21 different ways modelers can inflate traffic forecasts and concludes “it is perfectly possible to inflate the numbers for clients who want inflated numbers” (page 75).

In practice, DOTs have often used traffic forecasts as a sales tool or a rationalization for new projects. Once the traffic modeling generates a sufficiently high number to justify additional capacity, the agencies stick with it in spite of evidence to the contrary. For the proposed $3 billion Columbia River Crossing between Oregon and Washington, the two state DOTs stuck with exaggerated vintage-2005 forecasts in a final environmental impact statement issued in 2013, ignoring actual declines in traffic that had occurred in the intervening years. And as in Wisconsin, they offered no explanation as to why the modeling didn't change.

For years, we’ve known that DOT traffic forecasting models are frequently wrong and that they regularly over-estimate future traffic and congestion.  Multi-billion dollar projects are often predicated on traffic forecasts that fail repeatedly to be borne out by reality.  The Sightline Institute showed that for Washington’s SR-520 floating bridge project, the state always forecast a big increase in traffic, even though traffic levels continually declined.

trust_wsdot_proj

The political acceptance of these kinds of errors is rampant in the industry. The State Smart Transportation Institute analyzed an aggregation of state traffic forecasts prepared annually by the US DOT and showed that the 20-year projections overestimated future traffic volumes in every single year the reports could be compared against data on actual miles driven by Americans.

SSTI_Overshoot

A big part of the reason these flawed forecasts have continued to be made–and not corrected–is that the forecasting process is opaque to outsiders.  The federal district court’s ruling in 1000 Friends of Wisconsin v. U.S. DOT should make it much more difficult for highway builders to continue justifying projects based on this kind of “black box” modeling. As the old saying goes:  sunlight is often the best disinfectant. Greater transparency in the data and assumptions that underlie traffic forecasts could lead to much wiser decisions about where to invest scarce transportation resources.

 

 

Playing Apart

Our City Observatory report, Less in Common, catalogs the ways that we as a nation have been growing increasingly separated from one another.  Changes in technology, the economy and society have all coalesced to create more fragmentation and division.

As Robert Putnam described this trend in his 2000 book, we are "Bowling Alone." And while work, housing and shopping have become more stratified and dispersed, there still ought to be the opportunity for us to play together. Sports fandom is one of the few countervailing trends: within metropolitan areas, popular support for the "home team," whether in pro sports or college athletics, cuts across demographic and geographic boundaries.

But in our personal lives our recreation is becoming more isolated, chiefly through the privatization of leisure.

Consider: instead of going to public parks and playgrounds, more children play in the copious backyards of suburban homes. This trend is amplified by helicopter parents. Free-range children are an anomaly, and the combination of sprawl and insecurity adds to the chauffeuring burden of adults–which in turn means spending more time in cocooned private vehicles. And as we know, the decline in physical exercise among the nation's children has been a key factor in the explosive growth of juvenile obesity.

One of the hallmarks of the decline in the public recreational commons is swimming.  In the early part of the 20th century, swimming pools were almost exclusively in the public domain.  Prior to World War II it was estimated that there were fewer than 2,500 homes with private, in-ground swimming pools.  Today, there are more than 5 million.

That's one of the reasons we found Samsung's television commercial "A Perfect Day" so compelling. It highlighted the adventures of a group of kids, cycling around New York City, and ending up spending time at a public pool. It's encouraging that a private company can make our aspirations for living life in public a central part of its marketing message.

That’s certainly a contrast to the trend of commoditization of leisure. Increasingly, we pay to play, and play in the private realm.  The number of persons who belong to private gyms has increased from about 13 million in 1981 to more than 50 million today.  While gyms provide a great experience for those who join, they tend to draw disproportionately from wealthier and younger demographic groups–again contributing to our self-segregation by common background and interest.

Over just the past five years, the number of Americans classified as "physically inactive"–not participating in sports, recreation or exercise–has increased from 75 million to 83 million, according to the Physical Activity Council. And youth participation in the most common team sports — soccer, basketball, football and baseball — has declined 4 percent since 2008.

As we think about ways to strengthen and restore the civic commons, we will probably want to place special emphasis on parks and recreation.  Public parks are one of the places where people of different races, ethnicities and incomes can come together and share experiences.

More evidence of surging city job growth

In February, we released our latest CityReport Surging City Center Job Growth, presenting evidence showing employment growing faster in the city centers of the nation’s largest metros since 2007. Another set of analysts has, independent of our work, produced findings that point to renewed job growth in the nation’s inner city neighborhoods.

A new report issued by the Federal Reserve Bank of Cleveland, using similar data but different definitions, reaches many of the same conclusions. The analysis, prepared by Fed economist Daniel Hartley and Nikhil Kaza and T. William Lester of the University of North Carolina, is entitled Are America's Inner Cities Competitive? Evidence from the 2000s. The Fed study divides metropolitan areas into three parts: the central business district (CBD), a series of tracts that form the core of the commercial area in each metro's largest city; the inner city, tracts within a principal city but outside the CBD; and the suburbs, the remainder of the metro area.

Hartley, Kaza and Lester report that between 2002 and 2011, inner cities added 1.8 million jobs. They also echo one of our key findings: that job growth in city centers was stronger in the post-recession period than it was earlier in the decade. In the aggregate, inner cities recorded job growth over the past decade (up 6.1% between 2002 and 2011) nearly matching that of suburbs (6.9%), and since the end of the recession (i.e., 2009) have recorded faster job growth (3.6%) than either suburbs (3.0%) or central business districts (2.6%).

To get a sense of how the geography of job growth has shifted over the past decade, it's useful to divide the data roughly in half, comparing growth trends in the 2002-07 period (during the height of the housing bubble) with the growth from 2007-11 (the period representing the collapse of the bubble, the impact of the Great Recession, and the first years of recovery). These were the time periods used in our Surging City Center Job Growth report, and we've recalculated the Fed data to make it directly comparable to our analysis. The chart below shows the data from the Fed report and computes the average annual growth rate of jobs for central business districts, inner cities, and suburbs for these two time periods.

These data show that in the earlier time period, suburbs were outperforming cities; inner cities were growing about half as fast as suburbs and CBD employment was actually declining.  From 2002 to 2007, the further you were from the center, the faster you grew.  This relationship reversed in the latter 2007-11 period.  Cities outperformed suburbs–suburbs saw a net decline in employment–and job growth was actually somewhat faster in the CBD than in inner cities.  Despite the recession, CBD job growth was much stronger in the 2007-11 period (+0.3%) than it was in the earlier 2002-07 period (-0.7%).  (Note that percentage figures in the following graph represent annualized growth rates.)
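The annualization behind these comparisons is a standard compound-growth calculation. Here is a minimal sketch of that method; the employment totals below are hypothetical, chosen only so the arithmetic recovers the kind of annualized rate (e.g., -0.7 percent per year for CBDs in 2002-07) quoted above.

```python
# Convert total growth over a multi-year period into an average
# annual (compound) growth rate.

def annualized_rate(start: float, end: float, years: int) -> float:
    """Average annual growth rate implied by start and end levels."""
    return (end / start) ** (1 / years) - 1

# Hypothetical CBD employment shrinking 0.7% per year over five years
# (constructed so the function recovers that rate exactly).
jobs_2002 = 10_000_000
jobs_2007 = jobs_2002 * (1 - 0.007) ** 5

rate = annualized_rate(jobs_2002, jobs_2007, years=5)
print(f"annualized growth: {rate:+.1%}")
```

Annualizing matters here because the two periods being compared (2002-07 and 2007-11) are of different lengths, so raw percentage changes over each period aren't directly comparable.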

Hartley_Jobs

There are some key differences between the Fed study and our recent City Observatory report. Our definition of "city center" included all those businesses within three miles of the center of the central business district. Both studies are based on geographically detailed employment data from the Census Bureau's Longitudinal Employer-Household Dynamics (LEHD) program. The new Fed study reports data for 281 US metropolitan areas (our report looked at 41 of the largest metropolitan areas).

The authors conclude that while it is too soon to term this an urban renaissance, it's a noticeable change from the long-term trend of employment decentralization. Though not universal, the pattern of strong inner city growth is widespread, with two-fifths of metros (120 out of 281) recording gains in overall employment and share of employment in inner cities. The traditional decentralizing pattern of employment still holds for some metropolitan areas, like Houston and Dallas, but inner cities are flourishing in some unlikely places, like heavily suburbanized Los Angeles and San Antonio.

As we did in our report, the authors of the Federal Reserve study examine the industrial dimensions of job change.  Manufacturing jobs continue to suburbanize, and inner cities have been relatively more competitive for jobs in “eds and meds” education services and health care.  They also identify a key role for the consumer city and population-led theories of urban growth.  Within inner cities, job growth is positively associated with transit access and distance to the CBD, and seems to be driven more by population-serving businesses (like restaurants) than businesses dependent on infrastructure (manufacturing and distribution).

The full report has many more details, and identifies the metros with competitive inner cities (i.e. those places where inner city areas gained share of total metro employment between 2002 and 2011).

We’re expecting to get data for 2012 and 2013, to be able to judge whether these trends persisted as the US economy continued to recover.  If you’re keenly interested in urban economies, as we are, you’ll be eagerly awaiting the new numbers.  In the meantime, the Cleveland Fed study is a “must read.”

Hartley, Daniel A., Nikhil Kaza, and T. William Lester, 2015. “Are America’s Inner Cities Competitive? Evidence from the 2000s,” Federal Reserve Bank of Cleveland, working paper no 15-03.  https://www.clevelandfed.org/en/Newsroom%20and%20Events/Publications/Working%20Papers/2015%20Working%20Papers/WP%2015-03%20Are%20Americas-Inner-Cities-Competitive-Evidence-from-the-2000s.aspx

 

Want to close the Black/White Income Gap? Work to Reduce Segregation.

 

Nationally, the average black household has an income 42 percent lower than the average white household. But that figure masks huge differences from one metropolitan area to another. And though any number of factors may influence the size of a place’s racial income gap, just one of them – residential segregation – predicts as much as 60 percent of the variation in the income gap from city to city. Although income gaps between whites and blacks are large and persistent across the country, they are much smaller in more integrated metropolitan areas and larger in more segregated ones.  The strength of this relationship strongly suggests that reducing the income gap will require reducing racial segregation.

To get a picture of this relationship, we’ve assembled data on segregation and the black/white earnings gap for the largest U.S. metropolitan areas. The following chart shows the relationship between the black/white earnings disparity (on the vertical axis), and the degree of black/white segregation (on the horizontal axis).   Here, segregation is measured with something called the dissimilarity index, which essentially measures what percent of each group would have to move to create a completely integrated region. (Higher numbers therefore indicate more segregated places.) To measure the black-white income gap, we first calculated per capita black income as a percentage of per capita white income, and then took the difference from 100. (A metropolitan area where black income was 100% of white income would have no racial income gap, and would receive a score of zero; a metro area where black income was 90% of white income would receive a score of 10.)
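For concreteness, here is a minimal sketch of both measures as described above. The tract counts and income figures are invented for illustration; they are not the actual data behind the chart:

```python
# Dissimilarity index: 0.5 * sum over tracts of |b_i/B - w_i/W|,
# conventionally reported on a 0-100 scale.

def dissimilarity_index(black_counts, white_counts):
    B = sum(black_counts)
    W = sum(white_counts)
    return 50 * sum(abs(b / B - w / W)
                    for b, w in zip(black_counts, white_counts))

def income_gap_score(black_pci, white_pci):
    # Gap = 100 minus black per capita income as a percent of white's.
    return 100 - 100 * black_pci / white_pci

# Hypothetical metro with four census tracts
black = [800, 150, 30, 20]
white = [200, 850, 970, 980]
print(round(dissimilarity_index(black, white), 1))   # -> 73.3 (highly segregated)
print(income_gap_score(27_000, 30_000))              # -> 10.0 (black pci is 90% of white)
```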

The positive slope of the line indicates that as segregation increases, black incomes fall relative to white incomes, and the racial income gap grows. On average, each five-point decline in the dissimilarity index is associated with a three-percentage-point decline in the racial income gap (the r2 for this relationship is .59, suggesting a close relationship between relative income and segregation).
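A sketch of how such a slope and r2 are fit with ordinary least squares. The data points here are invented to loosely echo the Raleigh and Birmingham figures discussed later; they are not the actual metro values behind the chart:

```python
# Minimal OLS fit of income gap on dissimilarity.

def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    r2 = sxy ** 2 / (sxx * syy)
    return slope, r2

# (dissimilarity, income gap) pairs -- illustrative only
data = [(41, 17), (50, 22), (55, 24), (60, 27), (65, 29), (80, 40)]
slope, r2 = ols([d for d, _ in data], [g for _, g in data])
print(f"slope = {slope:.2f} gap points per dissimilarity point")
```

With a slope near 0.6, a five-point change in dissimilarity maps to roughly a three-point change in the gap.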

What’s less clear is which way the causality goes, or in what proportions. That is to say: there are good reasons to believe that high levels of segregation impair the relative economic opportunities available to black Americans. Segregation may have the effect of limiting an individual’s social networks, lowering the quality of public services, decreasing access to good schools, and increasing risk of exposure to crime, all of which may limit or reduce economic success.  This is especially true in neighborhoods of concentrated poverty, which tend to be disproportionately neighborhoods of color.

But there are also good reasons to believe that in places where black residents have relatively fewer economic opportunities, they will end up more segregated than in places where there are more opportunities. Relatively less income means less buying power when it comes to real estate, and less access to the wealthier neighborhoods that, in a metropolitan area with a large racial income gap, will be disproportionately white. A large difference between white and black earnings may also suggest related problems – like a particularly hostile white population – that would also lead to more segregation.

The data shown here are consistent with earlier and more recent research on the negative effects of segregation.  Cutler and Glaeser found that higher levels of segregation were correlated with worse economic outcomes for blacks.   Likewise, racial and income segregation was one of several factors that Raj Chetty and his colleagues found to be strongly correlated with lower levels of intergenerational economic mobility at the metropolitan level.

Implications

To get a sense of how this relationship plays out in particular places, consider the difference between two Southern metropolitan areas: Birmingham and Raleigh.  Birmingham is more segregated (dissimilarity 65) than Raleigh (dissimilarity 41), and the black-white income gap is significantly smaller in Raleigh (blacks earn 17 percent less than whites) than in Birmingham (blacks earn 29 percent less than whites).

The size and strength of this relationship point up the high stakes in continuing to make progress in reducing segregation as a means of reducing the racial income gap.   If Detroit had the same level of segregation as the typical large metro (a dissimilarity index of 60, instead of 80), you would expect its racial income gap to be 12 percentage points smaller, which translates to roughly $3,000 more in annual income for the average black resident.
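The Detroit counterfactual is straightforward arithmetic on the fitted slope. A sketch follows; note that the ~$25,000 white per capita income used in the dollar translation is simply the value implied by the $3,000 figure, not a number taken from the report:

```python
# Fitted slope: ~3 points of income gap per 5 points of dissimilarity,
# i.e. 0.6 gap points per dissimilarity point.
SLOPE = 3 / 5

def predicted_gap_change(dissim_now, dissim_counterfactual):
    return SLOPE * (dissim_now - dissim_counterfactual)

gap_change = predicted_gap_change(80, 60)   # Detroit vs. typical large metro
print(gap_change)                           # -> 12.0 percentage points

# Dollar translation, assuming a white per capita income of ~$25,000
# (hypothetical; it is the value implied by the $3,000 figure above).
print(round(25_000 * gap_change / 100))     # -> 3000
```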

The data presented here and the other research cited are a strong reminder that if we’re going to address the persistent racial gap in income, we’ll most likely need to make further progress in reducing racial segregation in the nation’s cities.

The correlations shown here don’t dispose of the question of causality:  this cross-sectional evidence doesn’t prove that segregation causes a higher black-white income gap.  It is entirely possible that the reverse is true:  that places with smaller income gaps between blacks and whites have less segregation, in part because higher relative incomes for blacks afford them greater choices in metropolitan housing markets.  It may be the case that causation runs in both directions.   In the US, there are few examples of places that have stayed segregated and managed to close the income gap, and few places that have closed the income gap without experiencing dramatically lower levels of segregation.   Increased racial integration appears to be at least a corollary, if not a cause, of reduced income disparity between blacks and whites in US metropolitan areas.

If we’re concerned about the impacts of gentrification on the well-being of the nation’s African American population, we should recognize that anything that promotes greater racial integration in metropolitan areas is likely to be associated with a reduction in the black-white income gap; and conversely, maintaining segregation is likely to be an obstacle to diminishing this gap.

Though provocative, these data don’t control for a host of other factors that we know are likely to influence individuals’ economic outcomes, including the local industrial base and educational attainment.  It would be helpful to have a regression analysis estimating the relationship between the black-white earnings gap and education: the smaller racial income gap in less segregated cities may be attributable to higher rates of black educational attainment in those cities.  Similarly, the mix of industries in Raleigh may exhibit smaller racial disparities in pay and employment than the mix in Birmingham.  But even the industry mix may be influenced by a region’s segregation patterns; firms with more equitable practices may gravitate toward, or grow more rapidly in, communities with lower levels of segregation.

Brief Background on Racial Income Gaps and Segregation

Two enduring hallmarks of race in America are racial segregation and a persistent gap between the incomes of whites and blacks.  In 2011, median household income for white, non-Hispanic households was $55,412; for black households, $32,366 (Census Bureau, Income, Poverty, and Health Insurance Coverage in the United States: 2011, Table A-1), a racial income gap of 42 percent.  Census Bureau data show that, on average, black men have per capita incomes about 64 percent those of non-Hispanic white men.  This gap has narrowed only slightly over the past four decades: in the early 1980s, the income of black men was about 59 percent that of non-Hispanic white men.

Because the advantage of whites’ higher annual incomes compounds over time, racial wealth disparities are even greater than disparities in earnings.  Lifetime earnings for African-Americans are about 25 percent less than for similarly aged non-Hispanic white Americans.   The Urban Institute estimated that the net present value of lifetime earnings for a non-Hispanic white person born in the late 1940s would be about $2 million, compared to just $1.5 million for an African-American born the same year.

In the past half century, segregation has declined significantly.  Nationally, the black/non-black dissimilarity index has fallen from an all-time high of 80 in 1970 to 55 in 2010, according to Glaeser and Vigdor.  The share of all-white census tracts has declined from one in five to one in 427. Since 1960, the share of African-Americans living in majority-non-black areas has increased from less than 30 percent to almost 60 percent.  Still, as noted in our chart, there are wide variations among metropolitan areas, many of which remain highly segregated.

Technical Notes

We measure the racial income gap by comparing the per capita income of blacks in each metropolitan area with the per capita income of whites in that same metropolitan area.  These data are from Brown University’s US 2010 project, and have been compiled from the 2005-09 American Community Survey.  The Brown researchers compiled this data separately for the metropolitan divisions that make up several large metropolitan areas (New York, Chicago, Miami, Philadelphia, San Francisco, Seattle, Dallas and others).  For these tabulations we report the segregation and racial income gaps reported for the most populous metropolitan division in each metropolitan area.

How Racial Segregation Leads to Income Inequality

Less Segregated Metro Areas Have Lower Black/White Income Disparities

Income inequality in the United States has a profoundly racial dimension.  As income inequality has increased, one feature of inequality has remained very much unchanged:  black incomes remain persistently lower than white incomes.  But while that pattern holds for the nation as a whole, it’s interesting to note that in some places the black/white income gap is much smaller. One characteristic of these more equal places is a lower level of racial segregation.


How important is proximity to jobs for the poor?

More jobs are close at hand in cities.  And on average the poor live closer to jobs than the non-poor.

One of the most enduring explanations for urban poverty is the “spatial mismatch hypothesis” promulgated by John Kain in the 1960s.  Briefly, the hypothesis holds that as jobs have increasingly suburbanized, job opportunities are moving further and further away from the inner city neighborhoods that house most of the poor. In theory, the fact that jobs are becoming more remote may make them more difficult to get, especially for the unemployed. How important is proximity to getting and keeping a job?

A new Brookings Institution report from Elizabeth Kneebone and Natalie Holmes, The Growing Distance Between People and Jobs  sheds some light on this old question.  Their data show that between 2000 and 2012, jobs generally decentralized in U.S. metropolitan areas, with the result that on average, people live further from jobs than they did a decade ago.  Put another way:  there are fewer jobs within the average commute distance of the typical metropolitan resident.

While job access has diminished for most Americans, the report notes that the declines in job access have been somewhat greater for the poor and for racial and ethnic minorities than for non-poor and white metropolitan residents.  This, in the report’s view, has exacerbated the spatial mismatch between the poor and jobs.

The Kneebone/Holmes findings emphasize the change in job access over time.  As jobs decentralized, the average American had about 7 percent fewer jobs within a typical commuting radius in 2012 than in 2000.  But it’s also illuminating to look at the level of job access.  Certain patterns emerge:

People who live in large metropolitan areas have access to many, many more jobs than do residents of smaller metropolitan areas.  The typical New Yorker has just shy of a million jobs within commuting distance; the typical Memphian, only 150,000.  This is what economists are talking about when they describe “thick” urban labor markets.

Dig deeper, and it turns out that within metropolitan areas, cities have much better job access than suburbs.  We’ve taken the Brookings data for 2012 and computed the relative job accessibility of cities compared to their suburbs for each of the nation’s 50 largest metro areas.  For example, an average city resident in Charlotte has about 320,000 jobs within typical commuting distance; the average suburban resident of the Charlotte metro has just 70,000 (metro-level data are shown in the table below).  This means a Charlotte city resident has about 4.6 times as many jobs within commuting distance of her home as her suburban counterpart.  For the typical large metro area, city residents have about 2.4 times as many jobs within commuting distance as their suburban neighbors.  This pattern of higher job accessibility in cities holds for every large metro area in the country save one:  Las Vegas.
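The city-to-suburb comparison is just a ratio of the two accessibility counts. A minimal sketch, using the Charlotte figures quoted above:

```python
# Relative job accessibility: jobs within typical commuting distance
# for the average city resident vs. the average suburban resident.

def accessibility_ratio(city_jobs, suburb_jobs):
    return city_jobs / suburb_jobs

# Charlotte figures from the text: 320,000 (city) vs. 70,000 (suburbs)
charlotte = accessibility_ratio(320_000, 70_000)
print(f"{charlotte:.1f}x")   # -> 4.6x
```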

At first this may seem counter-intuitive, but consider:  even though jobs have been decentralizing, central locations are often better able to access jobs in any part of the region.  It’s also the case that despite decentralization, job density–the number of jobs per square mile–still tends to be noticeably higher in urban centers than on the fringe.  And the difference in job accessibility between cities and suburbs (+140 percent) dwarfs the average decline in job accessibility (-7 percent) over the past decade.  While aggregate job accessibility may have decreased slightly, individuals in every metropolitan area can substantially influence their own access to jobs by choosing whether to live in the city or the suburbs.

Perhaps even more surprisingly, the poor and ethnic minorities are, on average, closer to jobs than their non-poor and white counterparts.  We can do the same computation to compare relative job accessibility within each metro area for poor and non-poor populations, and to compare job accessibility for blacks and whites.  Despite job decentralization, and the fact that poorer neighborhoods often support fewer local businesses and jobs, the poor residents of the typical large metropolitan area have about 20 percent more jobs within typical commuting distance than their non-poor counterparts.  The black residents of large U.S. metropolitan areas have on average about 50 percent more jobs within typical commuting distance than their white counterparts in the same metropolitan area.  Again, this pattern holds for virtually all large metropolitan areas.  Data showing relative job accessibility for poor and non-poor persons, and for black and white persons, by metropolitan area are shown in the two right-hand columns of the table above.

Of course, a pure distance-based measure of job accessibility may not fully reflect transportation access to particular jobs–especially for poor persons, who are disproportionately likely to lack access to automobiles for commute trips.  But the data show that city residents have strikingly better access to a large number of jobs, and other forms of transportation–transit, cycling and walking–generally work better in cities.  The density and proximity of jobs in cities, plus the availability of transit, are one reason why poor persons disproportionately concentrate in cities, according to research by Ed Glaeser and his colleagues.

The much higher level of physical job accessibility in cities, and the relative proximity of poor people and black Americans to employment opportunities, signal that physical employment mismatch is at best only a partial explanation for persistent urban poverty.  Other barriers, particularly lack of education, concentrated poverty, and continued discrimination, are also important factors.

We’re deeply appreciative of our friends at Brookings undertaking this analysis, and making their methodology and findings accessible and transparent.  The metro-by-metro data they present add a new dimension to our understanding of urban land use and evolving labor markets.  While we strongly encourage everyone to explore these data, we offer one observation. In measuring job accessibility, Kneebone and Holmes chose to use separate, locally customized estimates of commute distance.  For example, the average intra-metropolitan commute (according to data from the LEHD program) in Houston is 12.2 miles, while in New Orleans it is 6.2 miles.  This means that a big part of the difference in measured job accessibility between these two metropolitan areas reflects the fact that the typical commute shed for Houston covers a far larger area than the one for New Orleans.  While this may accurately reflect typical commuting behavior in each city, it makes direct comparisons between metropolitan areas problematic.

The Cappuccino Congestion Index

April First falls on Saturday, and that’s a good reason to revisit an old favorite, the Cappuccino Congestion Index

We’re continually told that congestion is a grievous threat to urban well-being. It’s annoying to queue up for anything, but traffic congestion has spawned a cottage industry of ginning up reports that transform our annoyance with waiting in lines into an imagined economic calamity. Using the same logic and methodology that underpin these traffic studies, it’s possible to demonstrate another insidious threat to the nation’s economic productivity: costly and growing coffee congestion.

cappuccino_line

Yes, there’s another black fluid that’s even more important than oil to the functioning of the U.S. economy: coffee. Because an estimated 100 million American workers can’t begin a productive work day without an early morning jolt of caffeine, and because one-third of these coffee drinkers regularly consume espresso drinks, lattes and cappuccinos, there is significant and growing congestion in coffee lines around the country. That’s costing us a lot of money. Consider these facts:

  • Delays waiting in line at the coffee shop for your daily latte, cappuccino or mocha cost U.S. consumers $4 billion every year in lost time;
  • The typical coffee drinker loses more time waiting in line at Starbucks than in traffic congestion;
  • Delays in getting your coffee are likely to increase because our coffee delivery infrastructure isn’t increasing as fast as coffee consumption.

Access to caffeine is provided by the nation’s growing corps of baristas and coffee bars. The largest of these, Starbucks, operates some 12,000 locations in the U.S. alone. Any delay in getting this vital beverage is going to impact a worker’s start time–and perhaps their day’s productivity. It’s true that sometimes, you can walk right up and get the triple espresso you need. Other times, however, you have to wait behind a phalanx ordering double, no-whip mochas with a pump of three different syrups, or an orange-mocha frappuccino. These delays in the coffee line are costly.

To figure out exactly how costly, we’ve applied the “travel time index” created by the Texas Transportation Institute to measure the economic impact of this delay on American coffee drinkers. For more than three decades TTI has used this index to calculate the dollar cost of traffic delays–here we use the same technique to figure the value of “coffee delays.”

The travel time index measures how much longer a rush-hour trip takes than the same trip in non-congested conditions. According to Inrix, the travel tracking firm, the travel time index for the United States in July 2014 was 7.6, meaning that peak-hour travel takes 7.6 percent longer: a commute trip that took 20 minutes in off-peak times would take an additional 91 seconds at the peak hour.
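Reading the Inrix index as "percent extra travel time at the peak," the 20-minute example works out as follows (a sketch; the 7.6 figure is the one quoted above):

```python
# Peak-period delay implied by a travel time index expressed as
# percent extra time relative to free-flow conditions.

def peak_delay_seconds(offpeak_minutes, index_percent):
    return offpeak_minutes * 60 * index_percent / 100

# A 20-minute off-peak trip with a 7.6 index
print(round(peak_delay_seconds(20, 7.6)))   # -> 91 seconds
```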

We constructed data on the relationship between customer volume and average service times for a series of Portland area coffee shops.  We used the 95th percentile time of 15 seconds as our estimate of “free flow” ordering conditions—how long it takes to enter the shop and place an order.  In our data-gathering, as the shop became more crowded, customers had to queue up. The time to place orders rose from an average of 30 to 40 seconds, to two to three minutes in “congested” conditions. The following chart shows our estimate of the relationship between customer volume and average wait times.

Coffee_Speed_Volume

Following the TTI methodology, we treat any additional time that customers have to spend waiting to place their order beyond what would be required in free flow times (i.e. more than 15 seconds) as delay attributable to coffee congestion.

Based on our observations of typical coffee shops and other data, we were able to estimate the approximate flow of customers over the course of a day. We regard a typical coffee shop as one that has about 650 transactions daily. While most transactions are for a single consumer, some are for two or more, so we use a factor of 1.2 consumers per transaction. This means the typical coffee shop provides beverages (and other items) for about 780 consumers. We estimate the distribution of customers per hour over the course of the day based on overall patterns of hourly traffic, with the busiest times in the morning and volume tapering off in the afternoon.

We then apply our speed/volume relationship (chart above) to our estimates of hourly volume to estimate the amount of delay experienced by customers in each hour.  When you scale these estimates up to reflect the millions of Americans waiting in line for their needed caffeine each day, the total value of time lost to cappuccino congestion costs consumers more than $4 billion annually. (Math below).


 

This is—of course—our April First commentary, and savvy readers will recognize it is tongue in cheek, but only partly so.  (The data are real, by the way!) The real April Fools Joke here is the application of this same tortured thinking to a description and a diagnosis of the nation’s traffic problems.

The Texas Transportation Institute’s  best estimate is that travel delays cost the average American between one and two minutes on their typical commute trip. While its possible–as we’ve done here–to apply a wage rate to that time and multiply by the total number of Americans to get an impressively large total, its not clear that the few odd minutes here and there have real value. This is why for years, we and others have debunked the TTI report. (The clumping of reported average commute times in the American Community Survey around values ending in “0” and “5” shows Americans don’t have that precise a sense of their average travel time anyhow.)

The “billions and billions” argument used by TTI to describe the cost of traffic congestion is a rhetorical device to generate alarm. The trouble is, when applied to transportation planning it leads to some misleading conclusions. Advocates argue regularly that the “costs of congestion” justify spending added billions in scarce public resources on expanding highways, supposedly to reduce time lost to congestion. There’s just no evidence this works–induced demand from new capacity causes traffic to expand and travel times to continue to lag:  Los Angeles just spent a whopping billion dollars to widen Interstate 405, with no measurable impact on congestion or traffic delays.

No one would expect to Starbucks to build enough locations—and hire enough baristas—so that everyone could enjoy the 15 second order times that you can experience when there’s a lull. Consumers are smart enough to understand that if you want a coffee the same time as everyone else, you’re probably going to have to queue up for a few minutes.

But strangely, when it comes to highways, we don’t recognize the trivially small scale of the expected time savings (a minute or two per person) and we don’t consider a kind of careful cost-benefit analysis that would tell us that very few transportation projects actually generate the kinds of sustained travel time savings that would make them economically worthwhile.

Ponder that as you wait in line for your cappuccino.  We’ll be just ahead of you ordering a double-espresso macchiato (and holding a stopwatch).


Want to know more?

Here’s the math:  We estimate that a peak times (around 10am) the typical Starbucks makes about 100 transactions, representing about 120 customers.  The average wait time is about two and one-half minutes–of which about two minutes and 15 second represents delay, compared to free flow conditions.  We make a similar computation for each hour of the day (customers are fewer and delays shorter at other hours).  Collectively customers at an typical store experience about 21 person hours of delay per day (that’s an average of a little over 90 seconds per customer).  We monetize the value of this delay at $15 per hour, and multiply it by 365 days and 12,000 Starbucks stores.  Since Starbucks represents about 35 percent of all coffee shops in the US, we scale this up to get a total value of time lost to coffee service delays of slightly more than $4 billion.

The Cappuccino Congestion Index

[Photo: customers waiting in a coffee line (cappuccino_line)]

City Observatory, April 1, 2015

We’re continually told that congestion is a grievous threat to urban well-being. It’s annoying to queue up for anything, but traffic congestion has spawned a cottage industry ginning up reports that transform our annoyance at waiting in line into an imagined economic calamity. Using the same logic and methodology that underpin these traffic studies, a new City Observatory analysis reveals another insidious threat to the nation’s economic productivity: costly and growing coffee congestion.

Yes, there’s another black fluid that’s even more important than oil to the functioning of the U.S. economy: coffee. Because an estimated 100 million American workers can’t begin a productive work day without an early morning jolt of caffeine, and because one-third of these coffee drinkers regularly consume espresso drinks, lattes and cappuccinos, there is significant and growing congestion in coffee lines around the country. That’s costing us a lot of money. Consider these facts:

  • Delays waiting in line at the coffee shop for your daily latte, cappuccino or mocha cost U.S. consumers $4 billion every year in lost time;
  • The typical coffee drinker loses more time waiting in line at Starbucks than in traffic congestion;
  • Delays in getting your coffee are likely to increase because our coffee delivery infrastructure isn’t increasing as fast as coffee consumption.

Access to caffeine is provided by the nation’s growing corps of baristas and coffee bars. The largest of these, Starbucks, operates some 12,000 locations in the U.S. alone. Any delay in getting this vital beverage is going to affect a worker’s start time–and perhaps their day’s productivity. It’s true that sometimes you can walk right up and get the triple espresso you need. Other times, however, you have to wait behind a phalanx of customers ordering double no-whip mochas with pumps of three different syrups, or an orange-mocha frappuccino. These delays in the coffee line are costly.

To figure out exactly how costly, we’ve applied the “travel time index” created by the Texas Transportation Institute to measure the economic impact of this delay on American coffee drinkers. For more than three decades TTI has used this index to calculate the dollar cost of traffic delays–here we use the same technique to figure the value of “coffee delays.”

The travel time index measures the difference in time required for a rush hour commute compared to the same trip in non-congested conditions. According to Inrix, the traffic-tracking firm, the travel time index for the United States in July 2014 (the latest month for which they’ve released this data) was 7.6, meaning that peak-hour trips take about 7.6 percent longer than in free flow: a commute that takes 20 minutes in off-peak times takes an additional 91 seconds at the peak hour.
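For readers who want to check the arithmetic, here is a minimal sketch (our own illustration; the function name is ours, not Inrix’s), treating the index as “percent longer than the same trip off-peak”:

```python
def peak_delay_seconds(off_peak_minutes: float, index: float) -> float:
    """Extra seconds a peak-hour trip takes, given a travel time index
    expressed as percent longer than the same trip in free flow."""
    return off_peak_minutes * 60 * index / 100

# A 20-minute off-peak commute at an index of 7.6 gains about 91 seconds.
print(round(peak_delay_seconds(20, 7.6)))  # 91
```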

We gathered data on the relationship between customer volume and average service times for a series of Portland-area coffee shops. We used the 95th percentile time of 15 seconds as our estimate of “free flow” ordering conditions—how long it takes to enter the shop and place an order when there is no line. In our data-gathering, as shops became more crowded, customers had to queue up, and the time to place an order rose from an average of 30 to 40 seconds to two to three minutes in “congested” conditions. The following chart shows our estimate of the relationship between customer volume and average wait times.

[Chart: average wait time vs. customer volume (Coffee_Speed_Volume)]

Following the TTI methodology, we treat any additional time that customers have to spend waiting to place their order beyond what would be required in free flow times (i.e. more than 15 seconds) as delay attributable to coffee congestion.
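In code, that definition is one line; a sketch assuming the 15-second free-flow baseline described above (the helper name is ours):

```python
FREE_FLOW_SECONDS = 15.0  # 95th-percentile "walk right up and order" time

def coffee_delay(wait_seconds: float) -> float:
    """Seconds of wait attributable to congestion: only time beyond the
    free-flow baseline counts as delay."""
    return max(0.0, wait_seconds - FREE_FLOW_SECONDS)

print(coffee_delay(10.0))   # 0.0   -- faster than free flow, no delay
print(coffee_delay(150.0))  # 135.0 -- a 2.5-minute wait is mostly delay
```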

Based on our observations of typical coffee shops and other data, we estimated the approximate flow of customers over the course of a day. We treat a typical coffee shop as one that has about 650 transactions daily. While most transactions are for a single consumer, some are for two or more, so we use a consumers-per-transaction factor of 1.2. This means the typical coffee shop serves beverages (and other items) to about 780 consumers a day. We estimate the distribution of customers per hour over the course of the day based on overall patterns of hourly traffic, with the busiest times in the morning and volume tapering off in the afternoon.
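A quick check of the volume arithmetic in that paragraph (the constants are the post’s figures; the script is our sketch):

```python
daily_transactions = 650         # transactions at a typical coffee shop
consumers_per_transaction = 1.2  # some orders cover two or more people

daily_consumers = daily_transactions * consumers_per_transaction
print(round(daily_consumers))  # 780
```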

We then apply our speed/volume relationship (chart above) to our estimates of hourly volume to estimate the amount of delay experienced by customers in each hour. When you scale these estimates up to reflect the millions of Americans waiting in line for their needed caffeine each day, the total value of time lost to cappuccino congestion costs consumers more than $4 billion annually. (Math below.)
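A minimal sketch of that hourly calculation, using a purely hypothetical wait-time curve and hypothetical hourly volumes (our actual observed curve and volumes are not reproduced here; only the 15-second free-flow baseline comes from the text above):

```python
# Illustrative sketch of the delay calculation -- NOT the actual observed data.
# Only the 15-second free-flow baseline is from the article; the curve slope
# and hourly volumes below are invented for illustration.

FREE_FLOW_SECONDS = 15

def wait_time_seconds(customers_per_hour: int) -> float:
    """Hypothetical wait-time curve: average wait grows with volume."""
    if customers_per_hour <= 20:
        return FREE_FLOW_SECONDS
    # Assume each customer above 20/hour adds ~1.3 seconds of queueing time.
    return FREE_FLOW_SECONDS + 1.3 * (customers_per_hour - 20)

# Hypothetical hourly customer counts: busiest in the morning, tapering off.
hourly_volume = [30, 60, 120, 110, 80, 60, 50, 40, 30, 20]

# Delay = time waited beyond the free-flow baseline, summed over all customers.
total_delay_hours = sum(
    v * (wait_time_seconds(v) - FREE_FLOW_SECONDS) / 3600
    for v in hourly_volume
)
print(round(total_delay_hours, 1))  # person-hours of delay for this toy day
```

The same structure scales to any store: swap in the observed wait-time curve and hourly volumes, and the sum gives the store's daily person-hours of "coffee delay."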


This is—of course—our April First commentary, and savvy readers will recognize it is tongue in cheek, but only partly so. (The data are real, by the way!) The real April Fools' joke here is the application of this same tortured thinking to a description and a diagnosis of the nation's traffic problems.

The Texas Transportation Institute's best estimate is that travel delays cost the average American between one and two minutes on their typical commute trip. While it's possible—as we've done here—to apply a wage rate to that time and multiply by the total number of Americans to get an impressively large total, it's not clear that the few odd minutes here and there have real value. This is why for years, we and others have debunked the TTI report. (The clumping of reported average commute times in the American Community Survey around values ending in "0" and "5" shows Americans don't have that precise a sense of their average travel time anyhow.)

The "billions and billions" argument used by TTI to describe the cost of traffic congestion is a rhetorical device to generate alarm. The trouble is, when applied to transportation planning it leads to some misleading conclusions. Advocates argue regularly that the "costs of congestion" justify spending added billions in scarce public resources on expanding highways, supposedly to reduce time lost to congestion. There's just no evidence this works: induced demand from new capacity causes traffic to expand, erasing any travel time savings. Los Angeles just spent a whopping billion dollars to widen Interstate 405, with no measurable impact on congestion or traffic delays.

No one would expect Starbucks to build enough locations—and hire enough baristas—so that everyone could enjoy the 15-second order times that you can experience when there's a lull. Consumers are smart enough to understand that if you want a coffee at the same time as everyone else, you're probably going to have to queue up for a few minutes.

But strangely, when it comes to highways, we don't recognize the trivially small scale of the expected time savings (a minute or two per person), and we don't undertake the kind of careful cost-benefit analysis that would show that very few transportation projects actually generate the kinds of sustained travel time savings that would make them economically worthwhile.

Ponder that as you wait in line for your cappuccino.  We’ll be just ahead of you ordering a double-espresso macchiato (and holding a stopwatch).


Want to know more?

Here's the math: We estimate that at peak times (around 10am) the typical Starbucks makes about 100 transactions, representing about 120 customers. The average wait time is about two and one-half minutes—of which about two minutes and 15 seconds represents delay, compared to free flow conditions. We make a similar computation for each hour of the day (customers are fewer and delays shorter at other hours). Collectively, customers at a typical store experience about 21 person-hours of delay per day (that's an average of a little over 90 seconds per customer). We monetize the value of this delay at $15 per hour, and multiply it by 365 days and 12,000 Starbucks stores. Since Starbucks represents about 35 percent of all coffee shops in the US, we scale this up to get a total value of time lost to coffee service delays of slightly more than $4 billion.
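The headline figure can be reproduced directly from the estimates in the paragraph above (every input below is one of those stated figures):

```python
# Reproduce the headline coffee-congestion estimate from the figures above.
delay_hours_per_store_per_day = 21   # person-hours of delay at a typical store
value_of_time = 15                   # dollars per hour
days_per_year = 365
starbucks_stores = 12_000
starbucks_market_share = 0.35        # Starbucks' share of US coffee shops

starbucks_cost = (delay_hours_per_store_per_day * value_of_time
                  * days_per_year * starbucks_stores)
total_cost = starbucks_cost / starbucks_market_share
print(f"${total_cost / 1e9:.1f} billion")  # prints "$3.9 billion"
```

With these rounded inputs the total comes to about $3.9 billion; the "about" qualifiers on each estimate evidently push the unrounded figure just over the $4 billion cited in the text.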