Intel and the Danger of Integration

Last week Brian Krzanich resigned as the CEO of Intel after violating the company’s non-fraternization policy. The details of Krzanich’s departure, though, ultimately don’t matter: his tenure was an abject failure, the extent of which is only now coming into view.

Intel’s Obsolete Opportunity

When Krzanich was appointed CEO in 2013 it was already clear that arguably the most important company in Silicon Valley’s history was in trouble: PCs, long Intel’s chief money-maker, were in decline, leaving the company ever more reliant on the sale of high-end chips to data centers; Intel had effectively zero presence in mobile, the industry’s other major growth area.

Still, I framed the situation that faced Krzanich as an opportunity, and drew a comparison to the challenges that faced the legendary Andy Grove three decades ago:

By the 1980s, though, it was the microprocessor business, fueled by the IBM PC, that was driving growth, while the DRAM business was fully commoditized and dominated by Japanese manufacturers. Yet Intel still fashioned itself a memory company. That was their identity, come hell or high water.

By 1986, said high water was rapidly threatening to drag Intel under. In fact, 1986 remains the only year in Intel’s history that they made a loss. Global overcapacity had caused DRAM prices to plummet, and Intel, rapidly becoming one of the smallest players in DRAM, felt the pain severely. It was in this climate of doom and gloom that Grove took over as CEO. And, in a highly emotional yet patently obvious decision, he once and for all got Intel out of the memory manufacturing business.

Intel was already the best microprocessor design company in the world. They just needed to accept and embrace their destiny.

Fast forward to the challenge that faced Krzanich:

It is into a climate of doom and gloom that Krzanich is taking over as CEO. And, in what will be a highly emotional yet increasingly obvious decision, he ought to commit Intel to the chip manufacturing business, i.e. manufacturing chips according to other companies’ designs.

Intel is already the best microprocessor manufacturing company in the world. They need to accept and embrace their destiny.

That article is now out of date: in a remarkable turn of events, Intel has lost its manufacturing lead. Ben Bajarin wrote last week in Intel’s Moment of Truth:

Not only has the competition caught Intel they have surpassed them. TSMC is now sampling on 7nm and AMD will ship their architecture on 7nm technology in both servers and client PCs ahead of Intel. For those who know their history, this is the first time AMD has ever beat Intel to a process node. Not only that, but AMD will likely have at least an 18 month lead on Intel with 7nm, and I view that as conservative.

As Bajarin notes, 7nm for TSMC (or Samsung or Global Foundries) isn’t necessarily better than Intel’s 10nm; chip-labeling isn’t what it used to be. The problem is that Intel’s 10nm process isn’t close to shipping at volume, and the competition’s 7nm processes are. Intel is behind, and its insistence on integration bears a large part of the blame.

Intel’s Integrated Model

Intel, like Microsoft, had its fortunes made by IBM: eager to get out the door the PC that an increasingly vocal section of its customer base was demanding, the mainframe maker outsourced much of the technology to third-party vendors, the most important pieces being an operating system from Microsoft and a processor from Intel. The impact of the former decision was the formation of an entire ecosystem centered around MS-DOS, and eventually Windows, cementing Microsoft’s dominance.

Intel was a slightly different story; while an operating system was simply bits on a disk, and thus easily duplicated for all of the PCs IBM would go on to sell, a processor was a physical device that needed to be manufactured. To that end IBM insisted on having a “second source”, that is, a second non-Intel manufacturer for Intel’s chips. Intel chose AMD, and licensed first the 8086 and 8088 designs that were in the original IBM PC, and later, again under pressure from IBM, the 80286 design; the latter was particularly important because everything that followed was designed to be upward compatible with it.

This laid the groundwork for Intel’s strategy — and immense profitability — for the next 35 years. First off, the dominance of Intel’s x86 design was assured thanks to its integration with DOS/Windows: specifically, DOS/Windows created a two-sided market of developers and PC users, and DOS/Windows ran on x86.

Microsoft and Intel were integrated in the PC value chain

However, thanks to its licensing deal with AMD, Intel wasn’t automatically entitled to all of the profits that would result from that integration; thus Intel doubled down on an integration of its own: the design and manufacture of x86 chips. That is, Intel would invest huge sums of money into creating new and faster designs (the 386, the 486, the Pentium, etc.), and also invest huge sums of money into ever smaller and more efficient manufacturing processes that would push the limits of Moore’s Law. This one-two punch would ensure that, despite AMD’s license, Intel’s chips would be the only realistic choice for PC makers, allowing the company to capture the vast majority of the profits created by the x86’s integration with DOS/Windows.

Intel was largely successful. AMD did take the performance crown around the turn of the century with the Athlon 64, but the company was unable to keep up with Intel financially when it came to fabs, and Intel illegally leveraged its dominant position with OEMs to keep them buying mostly Intel parts; then, a few years later, Intel not only took back the performance lead with its Core architecture, but settled into the “tick-tock” strategy where it alternated new designs and new manufacturing processes on a regular schedule. The integration advantage was real.

TSMC’s Modular Approach

In the meantime there was a revolution brewing in Taiwan. In 1987, Morris Chang founded Taiwan Semiconductor Manufacturing Company (TSMC) promising “Integrity, commitment, innovation, and customer trust”. Integrity and customer trust referred to Chang’s commitment that TSMC would never compete with its customers with its own designs: the company would focus on nothing but manufacturing.

This was a completely novel idea: at that time all chip manufacturing was integrated a la Intel; the few firms that were only focused on chip design had to scrap for excess capacity at Integrated Device Manufacturers (IDMs) who were liable to steal designs and cut off production in favor of their own chips if demand rose. Now TSMC offered a much more attractive alternative, even if their manufacturing capabilities were behind.

In time, though, TSMC got better, in large part because it had no choice: soon its manufacturing capabilities were only one step behind industry standards, and within a decade it had caught up (although Intel remained ahead of everyone). Meanwhile, the fact that TSMC existed created the conditions for an explosion in “fabless” chip companies that focused on nothing but design. For example, in the late 1990s there was an explosion in companies focused on dedicated graphics chips: nearly all of them were manufactured by TSMC. And, all along, the increased business let TSMC invest even more in its manufacturing capabilities.

Integrated Intel was competing with a competitive modular ecosystem

This represented a three-pronged assault on Intel’s dominance:

  • Many of those new fabless design companies were creating products that were direct alternatives to Intel chips for general purpose computing. The vast majority of these were based on the ARM architecture, but AMD too joined their ranks in 2008, spinning off its fab operations (christened GlobalFoundries) to become a fabless designer of x86 chips.
  • Specialized chips, designed by fabless design companies, were increasingly used for operations that had previously been the domain of general purpose processors. Graphics chips in particular were well-suited to machine learning, cryptocurrency mining, and other “embarrassingly parallel” operations; many of those applications have spawned specialized chips of their own. There are dedicated bitcoin chips, for example, or Google’s Tensor Processing Units: all are manufactured by TSMC.
  • Meanwhile TSMC, joined by competitors like GlobalFoundries and Samsung, was investing ever more in new manufacturing processes, fueled by the revenue from the previous two factors in a virtuous cycle.

Intel’s Straitjacket

Intel, meanwhile, was hemmed in by its integrated approach. The first major miss was mobile: instead of simply manufacturing ARM chips for the iPhone the company presumed it could win by leveraging its manufacturing to create a more-efficient x86 chip; it was a decision that evinced too much knowledge of Intel’s margins and not nearly enough reflection on the importance of the integration between DOS/Windows and x86.

Intel took the same mistaken approach to non general-purpose processors, particularly graphics: the company’s Larrabee architecture was a graphics chip based on — you guessed it — x86; it was predicated on leveraging Intel’s integration, instead of actually meeting a market need. Once the project predictably failed, Intel limped along with graphics that were barely passable for general purpose displays, and worthless for all of the new use cases that were emerging.

The latest crisis, though, is in design: AMD is genuinely innovating with its Ryzen processors (manufactured by both GlobalFoundries and TSMC), while Intel is still selling variations on Skylake, a three-year-old design. Ashraf Eassa, with assistance from a since-deleted tweet from a former Intel engineer, explains what happened:

According to a tweet from ex-Intel engineer Francois Piednoel, the company had the opportunity to bring all-new processor technology designs to its currently shipping 14nm technology, but management decided against it.

my post was actually pointing out that market stalling is more troublesome than Ryzen, It is not a good news. 2 years ago, I said that ICL should be taken to 14nm++, everybody looked at me like I was the craziest guy on the block, it was just in case … well … now, they know

— François Piednoël (@FPiednoel) April 26, 2018

The problem in recent years is that Intel has been unable to bring its major new manufacturing technology, known as 10nm, into mass production. At the same time, the issues with 10nm seemed to catch Intel off-guard. So, by the time it became clear that 10nm wouldn’t go into production as planned, it was too late for Intel to do the work to bring one of the new processor designs that was originally developed to be built on the 10nm technology to its older 14nm technology…

What Piednoel is saying in the tweet I quoted above is that when management had the opportunity to start doing the work to bring their latest processor design, known as Ice Lake (abbreviated “ICL” in the tweet), [to the 14nm process] they decided against doing so. That was likely because management truly believed two years ago that Intel’s 10nm manufacturing technology would be ready for production today. Management bet incorrectly, and Intel’s product portfolio is set to suffer as a result.

To put it another way, Intel’s management did not break out of the integration mindset: design and manufacturing were assumed to be in lockstep forever.

Integration and Disruption

It is perhaps simpler to say that Intel, like Microsoft, has been disrupted. The company’s integrated model resulted in incredible margins for years, and every time there was the possibility of a change in approach Intel’s executives chose to keep those margins. In fact, Intel has followed the script of the disrupted even more than Microsoft: while the decline of the PC finally led to The End of Windows, Intel has spent the last several years propping up its earnings by focusing more and more on the high-end, selling Xeon processors to cloud providers. That approach was certainly good for quarterly earnings, but it meant the company was only deepening the hole it was in with regard to basically everything else. And now, most distressingly of all, the company looks to be on the verge of losing its performance advantage even in high-end applications.

This is all certainly on Krzanich, and his predecessor Paul Otellini. Then again, perhaps neither had a choice: what makes disruption so devastating is the fact that, absent a crisis, it is almost impossible to avoid. Managers are paid to leverage their advantages, not destroy them; to increase margins, not obliterate them. Culture more broadly is an organization’s greatest asset right up until it becomes a curse. To demand that Intel apologize for its integrated model is satisfying in 2018, but all too dismissive of the 35 years of success and profits that preceded it.

So it goes.

AT&T, Time Warner, and the Need for Neutrality

The first thing to understand about the decision by a federal judge to approve AT&T’s acquisition of Time Warner, over the objection of the U.S. Department of Justice, is that it is very much in-line with the status quo: this is a vertical merger, and both the Department of Justice and the courts have defaulted towards approving such mergers for decades.1

Second, that there is an explosion of merger activity in and between the television production and distribution space is hardly a surprise: the Multichannel Video Programming Distributor (MVPD) business — that is, television distributed by cable, broadband, or satellite — has been shrinking for years now, and in a world where the addressable market is decreasing, the only avenues for growth are winning share from competitors, acquiring competitors, or vertically integrating.

Third, that last paragraph overstates the industry’s travails, at least in terms of television distribution, because most TV distributors are also internet service providers (ISPs), which means they are getting paid by consumers using the services disrupting MVPDs, including Netflix, Google, Facebook, and the Internet generally.

What was both unsurprising and yet odd about this case was the degree to which it was fought over point number two, with minimal acknowledgement of point number three. That is, it seems clear to me that AT&T made this acquisition with an eye on point number three, yet the government’s case was predicated on point number two; to that end, the government, in my eyes, rightly lost given the case they made. Whether they should have lost a better case is another question entirely.

Why AT&T Bought Time Warner

What is the point of a merger, instead of a contract? This is a question that always looms large in any acquisition, particularly one of this size: AT&T is paying $85 billion for Time Warner, and that’s an awfully steep price to simply hang out with movie stars.

The standard explanation for most mergers is “synergies”, the idea that there are significant cost savings from combining the operations of two companies; the reason this explanation is popular is because saving money is not an issue for antitrust, while the corresponding possibility — charging higher prices by achieving a stronger market position through consolidation — is. Such an explanation, though, is usually applied in the case of a horizontal merger, not a vertical one like AT&T and Time Warner.

To that end, AT&T was remarkably honest in its press release announcing the merger back in 2016:2

“With great content, you can build truly differentiated video services, whether it’s traditional TV, OTT or mobile. Our TV, mobile and broadband distribution and direct customer relationships provide unique insights from which we can offer addressable advertising and better tailor content,” [AT&T CEO Randall] Stephenson said. “It’s an integrated approach and we believe it’s the model that wins over time…

AT&T expects the deal to be accretive in the first year after close on both an adjusted EPS and free cash flow per share basis…Additionally, AT&T expects the deal to improve its dividend coverage and enhance its revenue and earnings growth profile.

Start with the second point: as I noted at the time, it’s not very sexy, but it matters to AT&T, a 34-year member of the Dividend Aristocrats, that is, a company in the S&P 500 that has raised its dividend for at least 25 consecutive years. It’s a core part of AT&T’s valuation, but the company’s free cash flow has been struggling to keep up with its rising dividends. Time Warner will help significantly in this regard, as did the previous acquisition of DirecTV.

It is the first point, though, that is pertinent to this analysis: how exactly might Time Warner allow AT&T to “build truly differentiated video services”?

The Government’s Case

While the AT&T press release noted that those “truly differentiated video services” could be delivered via traditional TV, OTT, or mobile, the government’s case was entirely concerned with traditional TV. The original complaint stated:

Were this merger allowed to proceed, the newly combined firm likely would — just as AT&T/DirecTV has already predicted — use its control of Time Warner’s popular programming as a weapon to harm competition. AT&T/DirecTV would hinder its rivals by forcing them to pay hundreds of millions of dollars more per year for Time Warner’s networks, and it would use its increased power to slow the industry’s transition to new and exciting video distribution models that provide greater choice for consumers. The proposed merger would result in fewer innovative offerings and higher bills for American families.

The idea is that AT&T could leverage its ownership of DirecTV to demand higher prices for Turner networks from other MVPDs, because if the MVPDs refused to pay, customers would be driven to switch to DirecTV. The problem is that, as was easily calculable, this makes no economic sense: the amount of money AT&T would lose by blacking out Turner would almost certainly outweigh whatever gains it might accrue. The judge agreed, and that was that.
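To make the blackout arithmetic concrete, here is a back-of-the-envelope sketch; every number below is an illustrative assumption on my part, not a figure from the case, but the structure shows why withholding Turner doesn’t pay:

```python
# Hypothetical blackout math; every input is an illustrative assumption,
# not a figure from the trial record.

RIVAL_SUBSCRIBERS = 20_000_000        # subscribers at the MVPD refusing to pay
TURNER_FEE_PER_SUB_MONTH = 3.00       # affiliate fees Turner forgoes per blacked-out sub
AD_REVENUE_PER_SUB_MONTH = 1.00       # ad revenue tied to those lost viewers
SWITCH_RATE = 0.02                    # share of rival subs who defect to DirecTV
DIRECTV_MARGIN_PER_SUB_MONTH = 30.00  # contribution margin per gained DirecTV sub

# What AT&T gives up: fees and ads on every blacked-out subscriber.
monthly_loss = RIVAL_SUBSCRIBERS * (TURNER_FEE_PER_SUB_MONTH + AD_REVENUE_PER_SUB_MONTH)

# What AT&T gains: margin only on the small fraction who actually switch.
monthly_gain = RIVAL_SUBSCRIBERS * SWITCH_RATE * DIRECTV_MARGIN_PER_SUB_MONTH

print(f"loss: ${monthly_loss/1e6:.0f}M/month, gain: ${monthly_gain/1e6:.0f}M/month")
```

The loss scales with every blacked-out subscriber while the gain scales only with the sliver who switch, which is why the economics fail under almost any plausible inputs.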

AT&T’s Real Goals

Remember, though, that AT&T did not limit its options to traditional TV: what is far more compelling are the possibilities Time Warner content presents for OTT and mobile. The question is not what AT&T can do to increase the revenue potential of Time Warner content (which was the government’s focus), but rather what Time Warner content can do to increase the potential of AT&T’s services, particularly mobile.

Forgive the long excerpt, but I covered this angle at length in a Daily Update when the deal was announced:

AT&T’s core wireless business is competing in a saturated market with few growth prospects. Apple’s gift to the wireless industry of customers demanding high-priced data plans has largely run its course, with AT&T perhaps the biggest winner: the company acquired significant market share even as it increased its average revenue per user for nearly a decade, primarily thanks to the iPhone. Now, though, most everyone has a smartphone and, more pertinently, a data plan…

The implication of a saturated market is that growth is increasingly zero sum, which presents both a problem and an opportunity for AT&T. The problem is primarily T-Mobile: fueled by the massive break-up fee paid by AT&T for the aforementioned failed acquisition, T-Mobile has embarked on an all-out assault against the incumbent wireless carriers, and AT&T has felt the pain the most, recording a negative net change in postpaid wireless customers for eight straight quarters. Unable or unwilling to compete with T-Mobile on price, AT&T needs a differentiator, ideally one that will not only forestall losses but actually lead to gains.

At first glance this doesn’t explain the Time Warner acquisition either: per my point above these are two very different companies with two very different strategic views of content. A distributor in a zero-sum competition for subscribers (like AT&T) has a vertical business model: ideally there should be services and content that are exclusive to the distributor, thus securing customers. Time Warner, though, is a content company, which means it has a horizontal business model: content is made once and then monetized across the broadest set of potential customers possible, taking advantage of content’s zero marginal cost. The assumption of this sort of horizontal business model underlay Time Warner’s valuation; to suddenly make Time Warner’s content exclusive to AT&T would be massively value destructive (this is a reality often missed by suggestions that Apple, for example, should acquire content companies to differentiate its hardware).

AT&T, however, may have found a loophole: zero rating. Zero rating is often conflated with net neutrality, but unlike the latter, zero rating does not entail the discriminatory treatment of data; it just means that some data is free (sure, this is a violation of the idea of net neutrality, but this is why I was critical of the narrow focus on discriminatory treatment of data by net neutrality advocates). AT&T is already using zero rating to push DirecTV.

This is almost certainly the plan for Time Warner content as well: sure, it will continue to be available on all distributors, but if you subscribe to AT&T you can watch as much as you want for free; moreover, this offering is one that is strengthened by secular trends towards cord-cutting and mobile-only video consumption. If those trends continue on their current path AT&T will not only strengthen the moat of its wireless service against T-Mobile but maybe even start to steal share.

That this point never came up in the government’s case, and, by extension, the judge’s ruling, is truly astounding.

That noted, it is very fair to wonder why exactly the Department of Justice sued to block this acquisition: President Trump was very outspoken in his opposition to this deal and even more outspoken in his antipathy towards Time Warner-owned CNN. At the same time, Makan Delrahim, the Assistant Attorney General for Antitrust who led the case, didn’t see a problem with the merger before his appointment. That the government’s complaint rested on both the most obvious angle and, from AT&T’s perspective, the least important, suggests a paucity of rigor in the prosecution of this case; it is very reasonable to wonder if the order to oppose the merger came from the top, and that the easiest case was the obvious out.

The Neutrality Solution

Thus we are in the unfortunate scenario where a bad case by the government has led to, at best, a merger that was never examined for its truly anti-competitive elements, and at worst, bad law that will open the door for similar tie-ups. To be sure, it is not at all clear that the government would have won had they focused on zero rating: there is an obvious consumer benefit to the concept — that is why T-Mobile leveraged it to such great effect! — and the burden would have been on the government to show that the harm was greater.

The bigger issue, though, is the degree to which laws surrounding such issues are woefully out-of-date. Last fall I argued that Title II was the wrong framework to enforce net neutrality, even though net neutrality is a concept I absolutely support; I came to that position in part because zero rating was barely covered by the FCC’s action.3

What is clearly needed is new legislation, not an attempt to misapply ancient regulation in a way that is trivially reversible. Moreover, AT&T has a point that online services like Google and Facebook are legitimate competitors, particularly for ad dollars; said regulation should address the entire sector. To that end I would focus on three key principles:

  • First, ISPs should not purposely slow or block data on a discriminatory basis. I am not necessarily opposed to the concept of “fast lanes”, as I believe that offers significant potential for innovative services, although I recognize the arguments against them; it should be non-negotiable, though, that ISPs cannot purposely disfavor certain types of content.
  • Second, and similarly, dominant internet platforms should not be allowed to block any legal content from their services. At the same time, services should have discretion in monetization and algorithms; that anyone should be able to put content on YouTube, for example, does not mean that one has a right to have Google monetize it on their behalf, or surface it to people not looking for it.
  • Third, ISPs should not be allowed to zero-rate their own content, and platforms should not be allowed to prioritize their own content in their algorithms. Granted, this may be a bit extreme; at a minimum there should be strict rules and transparency around transfer pricing and a guarantee that the same rates are available to competitive services and content.

The reality of the Internet, as noted by Aggregation Theory, is increased centralization; meanwhile, the impact of the Internet on traditional media is an inexorable drive towards consolidation. Our current laws and antitrust jurisprudence are woefully unprepared to deal with this reality, and a new law guaranteeing neutrality is the best solution.

  1. Whether or not the presumption that vertical mergers are not anti-competitive is warranted is a worthwhile, albeit separate, discussion.
  2. To be fair, the company also mentioned synergies, but they were hardly the point of the press release.
  3. The FCC said it would take a case-by-case approach, and did argue in the waning days of the Obama administration that zero rating one’s own services, as AT&T is clearly trying to do, was a violation, but that position was never tested in court and was quickly rolled back.

The Scooter Economy

As I understand it, the proper way to open an article about electric scooters is to first state one’s priors, explain the circumstances of how one came to try scooters, and then deliver a verdict. Unfortunately, that means mine is a bit boring: while most employing this format wanted to hate them,1 I was pretty sure scooters would be awesome — and they were!2

For me the circumstances were a trip to San Francisco; I purposely stayed at a hotel relatively far from where most of my meetings were, giving me no choice but to rely on some combination of scooters, e-bikes, and ride-sharing services. The scooters were a clear winner: fast, fun, and convenient — as long as you could find one near you. The city needs five times as many.

So, naturally, San Francisco banned them, at least temporarily: companies will be able to apply for their share of a pool of a mere 1,250 permits; that number may double in six months, but for now the scooter-riding experience will probably be more of a novelty than something you can rely on. In fact, by the end of my trip, if I were actually in a rush, I knew to use a ride-sharing service.

It’s no surprise that ride-sharing services have higher liquidity: San Francisco is a car-friendly town. The city has a population of 884,363 humans and 496,843 vehicles, mostly in the city’s 275,000 on-street parking spaces. Granted, most of the Uber and Lyft drivers come from outside the city, but there is no congestion tax to deter them.

The result is an urban area stuck on a bizarre local maximum: most households have cars, but rarely use them, particularly in the city, because traffic is bad and parking is — relative to the number of cars — sparse; the alternative is ride-sharing, which incurs the same traffic costs but at least doesn’t require parking. And yet, San Francisco, for now anyways, will only allow about 60 parking spaces’ worth of scooters onto the streets.
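For what it’s worth, that “60 parking spaces” figure is easy to reproduce; the scooters-per-space ratio below is my assumption for illustration, not an official number:

```python
# Rough conversion behind the "60 parking spaces" claim; the
# scooters-per-space figure is an assumption, not an official count.

PERMITTED_SCOOTERS = 1250
SCOOTERS_PER_CAR_SPACE = 20   # assumed: scooters that fit in one on-street car space
ON_STREET_SPACES = 275_000    # San Francisco's on-street parking spaces

spaces_equivalent = PERMITTED_SCOOTERS / SCOOTERS_PER_CAR_SPACE
share_of_curb = spaces_equivalent / ON_STREET_SPACES

print(spaces_equivalent)       # 62.5
print(f"{share_of_curb:.4%}")  # about 0.02% of all on-street parking
```

In other words, the entire permitted fleet would occupy a rounding error of the curb space the city already devotes to cars.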

Everything as a Service

This is hardly the forum to discuss the oft-head-scratching politics of tech’s de facto capital city, and I can certainly see the downside of scooters, particularly the haphazard way with which they are being deployed; in an environment built for cars scooters get in the way.

It’s worth considering, though, just how much sense dockless scooters make: the concept is one of the purest manifestations of what I referred to in 2016 as Everything as a Service:

What happens, though, if we apply the services business model to hardware? Consider an airplane: I fly thousands of miles a year, but while Stratechery is doing well, I certainly don’t own my own plane! Rather, I fly on an airplane that is owned by an airline that is paid for in part through some percentage of my ticket cost. I am, effectively, “renting” a seat on that airplane, and once that flight is gone I own nothing other than new GPS coordinates on my phone.

Now the process of buying an airplane ticket, identifying who I am, etc. is far more cumbersome than simply hopping in my car — there are significant transaction costs — but given that I can’t afford an airplane it’s worth putting up with when I have to travel long distances. What happens, though, when those transaction costs are removed? Well, then you get Uber or its competitors: simply touch a button and a car that would have otherwise been unused will pick you up and take you where you want to go, for a price that is a tiny fraction of what the car cost to buy in the first place. The same model applies to hotels — instead of buying a house in every city you visit, simply rent a room — and Airbnb has taken the concept to a new level by leveraging unused space.

The enabling factor for both Uber and Airbnb applying a services business model to physical goods is your smartphone and the Internet: it enables distribution and transaction costs to be zero, making it infinitely more convenient to simply rent the physical goods you need instead of acquiring them outright.

What is striking about dockless scooters — at least when one is parked outside your door! — is that they make ride-sharing services feel like half-measures: why even wait five minutes, when you can just scan-and-go? Steve Jobs described computers as bicycles of the mind; now that computers are smartphones and connected to the Internet they can conjure up the physical equivalent as well!

Indeed, the only thing that could make the experience better — for riders and for everyone else — would be dedicated lanes, like, for example, the 900 miles’ worth of parking spaces in San Francisco. To be sure, the city isn’t going to make the conversion overnight, or, given the degree to which San Francisco is in thrall to homeowners, probably ever, but that is a particular shame in 2018: venture capitalists are willing to fund the entire thing, and I’m not entirely sure why.

Missing Moats

Late last month came word that Sequoia Capital was leading a $150 million funding round for Bird, one of the electric scooter companies, valuing the company at $1 billion; a week later came reports that GV was leading a $250 million investment in rival Lime.

One of the interesting tidbits in Axios’s reporting on the latter was that each Lime scooter is used on average between 8 and 12 times a day; plugging that number into this very useful analysis of scooter-sharing unit costs suggests that the economics of both startups are very strong (certainly the size of the investments — and the quality of the investors — suggests the same).
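A simplified version of that unit-cost arithmetic looks like this; aside from the rides-per-day figure from the Axios report, every input is an assumption of mine, not actual Bird or Lime data:

```python
# Illustrative shared-scooter unit economics; inputs are assumptions
# (the rides-per-day midpoint aside), not company data.

RIDES_PER_DAY = 10                   # midpoint of the reported 8-12 rides/day
REVENUE_PER_RIDE = 1.00 + 0.15 * 10  # assumed $1 unlock + $0.15/min, 10-minute ride
CHARGE_COST_PER_DAY = 6.00           # assumed payment to a charger, once per day
MAINTENANCE_PER_DAY = 2.00           # assumed repairs, rebalancing, payment fees
SCOOTER_COST = 450.00                # assumed upfront hardware cost

# Daily contribution margin, then days to earn back the hardware.
daily_contribution = (RIDES_PER_DAY * REVENUE_PER_RIDE
                      - CHARGE_COST_PER_DAY - MAINTENANCE_PER_DAY)
payback_days = SCOOTER_COST / daily_contribution

print(f"${daily_contribution:.2f}/day, payback in {payback_days:.0f} days")
```

Under these assumptions a scooter pays for itself in under a month, which is why utilization — rides per day — is the number everyone fixates on, and why the economics look strong even if the hardware only survives a few months.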

The key word in that sentence, though, is “both”: what, precisely, might make Bird and Lime, or any of their competitors, unique? Or, to put it in business parlance, where is the moat? This is where the comparison to ride-sharing services is particularly instructive; I explained back in 2014 why there was more of a moat to be had in ride-sharing than most people thought:

  • There is a two-sided network between drivers and riders
  • As one service gains share, its increased utility of drivers will restrict liquidity on the other service, favoring the larger player
  • Riders will, all things being equal, use one service habitually

This leads to winner-take-all dynamics in a particular geographic area; then, when it comes time to launch in new areas, travelers and brand will give the larger service a head start.

To be sure, these interactions are complicated, and not everything is equal (see, for example, the huge amounts of share Lyft took last year thanks to Uber’s self-inflicted crises). It is that complication, though, and the fact it is exponentially more difficult to build a two-sided network (instead of, say, plopping a bunch of scooters on the street), that creates the conditions for a moat: the entire point of a moat is that it is hard to build.

Uber’s Self-Driving Mistake

This is why I have long maintained that the second-biggest mistake3 former Uber CEO Travis Kalanick made was the company’s head-first plunge into self-driving cars. On a surface level, the logic is obvious: Uber’s biggest cost is its drivers, which means getting rid of them is an easy route to profitability — or, should someone else deploy self-driving cars first, Uber could be undercut on price.

The mistake in Kalanick’s thinking is two-fold:

  • First, up-and-until the point that self-driving cars are widely available — that is, not simply invented, but built-and-deployed at scale — Uber’s drivers are its biggest competitive advantage. Kalanick’s public statements on the matter hardly evinced understanding on this point.
  • Second, bringing self-driving cars to market would entail huge amounts of capital investment. For one, this means it would be unlikely that Google, a company that rushes to reassure investors when it loses tens of basis points in margin, would do so by itself, and for another, whatever companies did make such an investment would be highly incentivized to maximize utilization of said investment as soon as possible. That means plugging into the dominant transportation-as-a-service network, which means partnering with Uber.

My contention is that Uber would have been best-served concentrating all of its resources on its driver-centric model, even as it built relationships with everyone in the self-driving space, positioning itself to be the best route to customers for whoever wins the self-driving technology battle.

Uber’s Second Chance

Interestingly, scooters and their closely-related cousin, e-bikes, may give Uber a second chance to get this right. Absent two-sided network effects, the potential moats for, well, self-riding scooters and e-bikes are relatively weak: proprietary technology is likely to provide short-lived advantages at best, and Bird and Lime have plenty of access to capital. Both are experimenting with “charging-sharing”, wherein they pay people to charge the scooters in their homes, but both augment that with their own contractors to both charge vehicles and move them to areas with high demand.

What remains under-appreciated is habit: your typical tech first-adopter may have no problem checking multiple apps to catch a quick ride, but I suspect most riders would prefer to use the same app they already have on their phone. To that end, there is certainly a strong impetus for Bird and Lime to spread to new cities, simply to get that first-app-installed advantage, but this is where Uber has the biggest advantage of all: the millions of people who already have the Uber app.

To that end, I thought Uber’s acquisition of Jump Bikes was a good idea, and scooters should be next (an acquisition of Bird or Lime may already be too pricey, but Jump has a strong technical team that should be able to get an Uber-equivalent out the door soon). The Uber app already handles multiple kinds of rides; it is a small step to handling multiple kinds of transportation — a smaller step than installing yet another app.

More Tech Surplus

More generally, in a world where everything is a service, companies may have to adapt to shallower moats than they would like. If you squint, what I am recommending for Uber looks a bit like a traditional consumer packaged goods (CPG) strategy: control distribution (shelf-space | screen-space) with a few dominant products (e.g. Tide | UberX) that provide leverage for new offerings (e.g. Swiffer | Jump Bikes). The model isn’t nearly as strong, but there may be other potential lock-ins, particularly in terms of exclusive contracts with cities and universities.

Still, that is hardly the sort of dominance that accrues to digital-only aggregators like Facebook or Google, or even Netflix; the physical world is much harder to monopolize. That everything will be available as a service means a massive increase in efficiency for society broadly — more products will be available to more people for lower overall costs — even as the difficulty in digging moats means most of that efficiency becomes consumer surplus. And, as long as venture capitalists are willing to foot the bill, cities like San Francisco should take advantage.

I wrote a follow-up to this article in this Daily Update.

  1. That article is perhaps more revealing than the author appreciated [↩︎]
  2. Note: this article is going to focus on San Francisco for simplicity’s sake, although the broader points have nothing to do with San Francisco specifically; I am aware that the transportation situation is different in different cities — I do live in a different country, after all, in a city with fantastic public transportation and a plethora of personal transportation options. [↩︎]
  3. The first was not buying Lyft [↩︎]

The Cost of Developers

Yesterday saw three developer-related announcements, two from Apple, and one from Microsoft. The former came as part of Apple’s annual Worldwide Developers Conference keynote:

  • The iOS App Store, which turns 10 next month, serves 500 million weekly visitors, and as of later this week will have earned developers over $100 billion.
  • Sometime next year developers will be able to write apps for the Mac using iOS user interface frameworks (known as UIKit).

Microsoft, meanwhile, for the second time in three years, outshone Apple’s keynote with a massive acquisition. From the company’s press release:

Microsoft Corp. on Monday announced it has reached an agreement to acquire GitHub, the world’s leading software development platform where more than 28 million developers learn, share and collaborate to create the future. Together, the two companies will empower developers to achieve more at every stage of the development lifecycle, accelerate enterprise use of GitHub, and bring Microsoft’s developer tools and services to new audiences.

“Microsoft is a developer-first company, and by joining forces with GitHub we strengthen our commitment to developer freedom, openness and innovation,” said Satya Nadella, CEO, Microsoft. “We recognize the community responsibility we take on with this agreement and will do our best work to empower every developer to build, innovate and solve the world’s most pressing challenges.”

Under the terms of the agreement, Microsoft will acquire GitHub for $7.5 billion in Microsoft stock.

Developers can be quite expensive indeed!

Platform-Developer Symbiosis

Over the last few weeks, particularly in The Bill Gates Line, I have been exploring the differences between aggregators and platforms; while aggregators generally harvest already-produced content or goods, developers leverage the platform to create something entirely new.

Platforms facilitate while aggregators intermediate

This results in a symbiosis between developers and platforms: from a technical perspective, platforms provide the fundamental building blocks (i.e. application program interfaces, or APIs) necessary for developers to build new experiences, and from a marketing perspective, those new experiences give customers a reason to buy the platform in the first place, or to upgrade.

The degree to which applications drive adoption of the underlying platform can, of course, vary; unsurprisingly the monetization potential of the platform relative to developers varies in a correlated way. Traditional Windows, for example, provided very little end user functionality; what made it so valuable were all of the applications built on top of its open platform.

Windows was an open platform

Here “open” means two things: first, the Windows API was available to anyone to build on, and second, developers built relationships directly with end users, including payment. This led to many huge software companies and, in 2003, to the creation of a platform on top of Windows: Valve’s Steam.

What Valve realized is that playing a game is only one part of the overall customer experience; the experience of discovering and buying the game matters as well, as does the installation and upgrade process. Moreover, these customer pain points were developer pain points as well; the original impetus to develop Steam, for example, was the difficulty in getting players to upgrade en masse, something that was essential for games in which players competed online. And, while Valve is a private company and has never announced Steam’s revenue numbers, reports suggest the platform generates billions of dollars a year.

Even that, though, pales in comparison to the iOS App Store: Apple took Steam’s app store idea and integrated it with the platform, such that iOS users and developers had no choice but to use Apple’s owned-and-operated distribution channel, with all of the various limitations and costs — 30%, to be precise — that that entailed.

The iPhone platform with an intermediation layer

Apple was able to accomplish this first and foremost because the underlying products — the iPhone and iPad — inspired demand in their own right, independent of applications. Apple had the users that developers needed to make money.

Second, the App Store, like Steam before it, really was a better experience that drove more downloads and purchases by end users. This meant that developing for iOS wasn’t simply attractive because of the number of users, but also because those users were willing to buy more than they would have on another platform.

Third — and this applies to Steam as well — the App Store dramatically lowered the barriers to entry for developers; this led to more apps, which attracted more users, which led to more apps, both locking in apps as a competitive advantage and also ensuring that no one app had outsized power (leaving Apple free to restrict Steam-like competitors by fiat).

Apple’s Platform Announcements

This frames the two Apple announcements I noted above. Start with the news of $100 billion for iOS developers: that means that Apple has collected around $40 billion, and at a very high margin to boot.
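That $40 billion figure falls out of the App Store's standard 70/30 revenue split: the $100 billion paid to developers is their 70% share of gross billings. A quick sketch of the arithmetic (ignoring the reduced 15% subscription tier, which lowers Apple's effective cut somewhat):

```python
# Implied Apple take from $100B in developer payouts, assuming the
# standard 70/30 App Store split applied to all gross billings.

developer_payout = 100e9                  # developers' 70% share, per Apple
gross_billings = developer_payout / 0.70  # implied total customer spend
apple_cut = gross_billings * 0.30         # Apple's 30% share

print(f"Implied gross App Store billings: ${gross_billings / 1e9:.0f}B")
print(f"Implied cumulative Apple cut:     ${apple_cut / 1e9:.0f}B")
```

The implied total is roughly $143 billion of customer spending, with Apple keeping about $43 billion — hence "around $40 billion", earned on distribution infrastructure with very high margins.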

Moreover, the vast majority of Apple’s announcements were, if anything, about competing with those developers: the first new app announced, Measure, should immediately wipe out the only obviously useful Augmented Reality apps in the store. Apple also announced a new Podcasts app for Watch and updated News, Stocks, and Voice Memos apps, and the only third-party demos were about how one of the largest software companies there is — Adobe — would be supporting Apple’s preferred 3D-image format. And why not! The implication of owning all of those high-value users is that, on iOS anyways, developers are cheap.

The Mac, though, is a different story: the platform is far smaller than the iPhone; that there remain a number of high quality independent software vendors supporting the Mac is a testament to how valuable it is for developers to be able to build direct relationships with customers that can span years and multiple transactions. Still, there seems little question that the number of Mac apps is, if not trending in the wrong direction, certainly not growing in any meaningful way; there simply aren’t enough users to entice developers.

That means Apple’s approach has to be very different from iOS: instead of dictating terms to developers, Apple announced that it is in the middle of a multi-year project to make it easier to port iOS apps to the Mac. This is, in a fashion, Apple paying for Mac apps; no, the money isn’t going to developers, but Apple is voluntarily taking on a much greater portion of the porting workload. Developers are much more expensive when you don’t have nearly as many users.

The Cost of GitHub

Still, whatever it is costing Apple to build this porting framework, it surely is a lot less than $7.5 billion, the price Microsoft is paying for GitHub. Then again, at first glance, it may not be clear what the point of comparison is.

Go back to Windows: Microsoft had to do very little to convince developers to build on the platform. Indeed, even at the height of Microsoft’s antitrust troubles, developers continued to favor the platform by an overwhelming margin, for an obvious reason: that was where all the users were. In other words, for Windows, developers were cheap.

That is no longer the case today: Windows remains an important platform in the enterprise and for gaming (although Steam, much to Microsoft’s chagrin, takes a good amount of the platform profit there), but the company has no platform presence in mobile, and is in second place in the cloud. Moreover, that second place is largely predicated on shepherding existing corporate customers to cloud computing; it is not clear why any new company — or developer — would choose Microsoft.

This is the context for thinking about the acquisition of GitHub: lacking a platform with sufficient users to attract developers, Microsoft has to “acquire” developers directly through superior tooling and now, with GitHub, a superior cloud offering with a meaningful amount of network effects. The problem is that acquiring developers in this way, without the leverage of users, is extraordinarily expensive; it is very hard to imagine GitHub ever generating the sort of revenue that justifies this purchase price.

Again, though, GitHub revenue is not the point; Microsoft has plenty of revenue. What it also has is a potentially fatal weakness: no platform with user-based leverage. Instead Microsoft is betting that a future of open-source, cloud-based applications that exist independent of platforms will be a large-and-increasing share of the future, and that there is room in that future for a company to win by offering a superior user experience for developers directly, not simply exerting leverage on them.

This, by the way, is precisely why Microsoft is the best possible acquirer for GitHub, a company that, having raised $350 million in venture capital, was possibly not going to make it as an independent entity. Any company with a platform with a meaningful amount of users would find it very hard to resist the temptation to use GitHub as leverage; on the other side of the spectrum, purely enterprise-focused companies like IBM or Oracle would be tempted to wring every possible bit of profit out of the company.

What Microsoft wants is much fuzzier: it wants to be developers’ friend, in large part because it has no other option. In the long run, particularly as Windows continues to fade, the company will be ever more invested in a world with no gatekeepers, where developer tools and clouds win by being better on the merits, not by being able to leverage users.

That, though, is exactly why Microsoft had to pay so much: buying in directly is a whole lot more expensive than using leverage, which can produce equivalent — or better! — returns for much less investment.

The Bill Gates Line

Two of the more famous military sayings are “Generals are always preparing to fight the last war”, and “Never interrupt your enemy while he is making a mistake.” I thought of the latter at the conclusion of last Sunday’s 60 Minutes report on Google:

Google declined our request for an interview with one of its executives for this story, but in a written response to our questions, the company denied it was a monopoly in search or search advertising, citing many competitors including Amazon and Facebook. It says it does not make changes to its algorithm to disadvantage competitors and that, “our responsibility is to deliver the best results possible to our users, not specific placements for sites within our results. We understand that those sites whose ranking falls will be unhappy and may complain publicly.”

The 60 Minutes report was not exactly fair-and-balanced; it featured an anti-tech-monopoly crusader1, an anti-tech-monopoly activist, an anti-tech-monopoly regulator, and Yelp CEO Jeremy Stoppelman. And, in what seems highly unlikely to have been a coincidence, Yelp this week filed a new antitrust complaint in the EU against Google. To be sure, just because a report was biased does not mean it was wrong; while I am a bit skeptical of the EU’s antitrust case against Google Shopping, the open case about Android seems pretty clear-cut. Neither, though, is Yelp’s direct concern.

Yelp’s Case Against Google

This is from a blog post about the 60 Minutes feature:

Yelp did participate in the piece because Google is doing the opposite of “delivering the best results possible,” and instead is giving its own content an unlawful advantage. We’ve made a video to explain exactly how Google puts its own interests ahead of consumers in local search, which you can watch here:

Yelp’s position, at least in this video, appears to be that Google’s answer box is anticompetitive because it only includes reviews and ratings from Google; presumably the situation could be resolved were Google to use sources like Yelp. There are three problems with this argument, though:

  • First, the answer box originally included content scraped from sources like Yelp and other vertical search sites; under pressure from the FTC, driven in part by complaints from Yelp and other vertical search engines, Google agreed to stop doing so in 2013.2
  • Second, in a telling testament to the power of being on top of search results, Google’s ratings and reviews have improved considerably in the two years since that video was posted; this isn’t a static market (to be sure, this is an argument that could be used on both sides).
  • Third — and this is the point of this article — what Yelp seems to want will only serve to make Google stronger.

No wonder Google declined the request for an interview.

The Bill Gates Line

Over the last few weeks I have been exploring what differences there are between platforms and aggregators, and was reminded of this anecdote from Chamath Palihapitiya in an interview with Semil Shah:

Semil Shah: Do you see any similarities from your time at Facebook with Facebook platform and connect, and how Uber may supercharge their platform?

Chamath: Neither of them are platforms. They’re both kind of like these comical endeavors that do you as an Nth priority. I was in charge of Facebook Platform. We trumpeted it out like it was some hot shit big deal. And I remember when we raised money from Bill Gates, 3 or 4 months after — like our funding history was $5M, $83M, $500M, and then $15B. When that 15B happened a few months after Facebook Platform and Gates said something along the lines of, “That’s a crock of shit. This isn’t a platform. A platform is when the economic value of everybody that uses it, exceeds the value of the company that creates it. Then it’s a platform.”

By this measure Windows was indeed the ultimate platform — the company used to brag about only capturing a minority of the total value of the Windows ecosystem — and the operating system’s clear successors are Amazon Web Services and Microsoft’s own Azure Cloud Services. In all three cases there are strong and durable businesses to be built on top.

A drawing of Platform Businesses Attract Customers by Third Parties
From Tech’s Two Philosophies

Once a platform dips under the Bill Gates Line, though, the long-term potential of a business built on a “platform” starts to decline. Apple’s App Store, for example, has all of the trappings of a platform, but Apple quite clearly captures the vast majority of the overall ecosystem, both because of the profitability of the iPhone and also because of its control of App Store economics; the paucity of strong and durable businesses on the App Store is a natural outgrowth of that.

The App Store intermediates 3rd parties and users

Note that Apple’s ability to control the economics of its developers comes from intermediating the relationship of those developers with customers.

Aggregators, Not Platforms

Facebook and Google take this intermediation to the extreme, leveraging their ability to drive discovery of the sheer abundance of information on their network and the Internet broadly:

A drawing of Aggregators Own Customer Relationships and Suppliers Follow
In the aggregator business model the aggregator owns customers and suppliers follow

It follows that Facebook and Google’s “platforms” not only don’t meet the Bill Gates Line, they don’t even register on the graph: they are the purest expression of aggregators. From my original formulation:

The fundamental disruption of the Internet has been to turn this dynamic on its head. First, the Internet has made distribution (of digital goods) free, neutralizing the advantage that pre-Internet distributors leveraged to integrate with suppliers. Secondly, the Internet has made transaction costs zero, making it viable for a distributor to integrate forward with end users/consumers at scale.

This has fundamentally changed the plane of competition: no longer do distributors compete based upon exclusive supplier relationships, with consumers/users an afterthought. Instead, suppliers can be aggregated at scale leaving consumers/users as a first order priority. By extension, this means that the most important factor determining success is the user experience: the best distributors/aggregators/market-makers win by providing the best experience, which earns them the most consumers/users, which attracts the most suppliers, which enhances the user experience in a virtuous cycle.

The result is the shift in value predicted by the Conservation of Attractive Profits. Previous incumbents, such as newspapers, book publishers, networks, taxi companies, and hoteliers, all of whom integrated backwards, lose value in favor of aggregators who aggregate modularized suppliers — which they often don’t pay for — to consumers/users with whom they have an exclusive relationship at scale.

This is ultimately the most important distinction between platforms and aggregators: platforms are powerful because they facilitate a relationship between 3rd-party suppliers and end users; aggregators, on the other hand, intermediate and control it.

Moreover, at least in the case of Facebook and Google, the point of integration in their respective value chains is the network effect. This is what I was trying to get at last week in The Moat Map with my discussion of the internalization of network effects:

  • Google has had the luxury of operating in an environment — the world wide web — that was by default completely open. That let the best technology win, and that win was augmented by the data that comes from serving an ever-increasing portion of the market. The end result was the integration of end users and the data feedback cycle that made Google search better and better the more it was used.
  • Facebook’s differentiator, meanwhile, is the relationships between friends and family; the company has subsequently integrated that network effect with consumer attention, forcing all of the content providers to jostle for space in the Newsfeed as pure commodities.

It’s worth noting, by the way, why it was that Facebook could come to be a rival to Google in the first place; specifically, Facebook had exclusive data — those relationships and all of the behavior on Facebook’s site that resulted — that Google couldn’t get to. In other words, Facebook succeeded not by being a part of Google, but by being completely separate.

Succeeding in a World of Aggregators

This gets at why I find Yelp’s complaints a bit beside the point: the company seems to be expending an awful lot of energy to regain the right to give Google the content Yelp worked hard to acquire. There is revenue there, of course, just as there is in the production of commodities generally, but without a sustainable cost advantage it’s not the best route to building a strong and durable business.

Of course that is the bigger problem: I noted above that Google’s library of ratings and reviews has grown substantially over the past few years; users generating content are the ultimate low-cost supplier, and losing that supply to Google is arguably a bigger problem for Yelp than whatever advertising revenue it can wring out of people who would click through on a hypothetical Google Answer Box that used 3rd-party sources. And, it should be noted, Yelp’s entire business is user-generated reviews: it and similar vertical sites are likely to do a far better job of generating, organizing, and curating such data.

Still, I can’t help but wonder whether Yelp’s problem is not that Google is using its own content in the Answer Box, but rather the Answer Box itself; which of these sets of results would be better for Yelp’s business, even in a hypothetical world where Answer Box content comes from Yelp?

Yelp would get more visitors without the answer box

Presuming that the answer is the image on the right — driving users to Yelp is both better for the bottom line and better for content generation, which mostly happens on the desktop — it becomes clear that Yelp’s biggest problem is that the more useful Google is — even if it only ever uses Yelp’s data! — the less viable Yelp’s business becomes. This is exactly what you would expect in an aggregator-dominated value chain: aggregators completely disintermediate suppliers and reduce them to commodities.

To that end, this is why the best strategies entail business models that avoid Google and Facebook completely: look no further than Amazon, which last month stopped buying Google Shopping ads, something the company can afford to do given that half of shoppers start their product searches on Amazon. To be sure, Amazon is plenty powerful in its own right, but it is a hard-to-ignore example of Google’s favorite argument that “competition is only a click away.”

Yelp Versus Google

Still, I have sympathy for Yelp’s position; Stoppelman told 60 Minutes:

If I were starting out today, I would have no shot of building Yelp. That opportunity has been closed off by Google and their approach…because if you provide great content in one of these categories that is lucrative to Google, and seen as potentially threatening, they will snuff you out.

Stoppelman is right, but the reason is perhaps less nefarious than it seems; the 60 Minutes report explained why in the voiceover:

Yelp and countless other sites depend on Google to bring them web traffic — eyeballs for their advertisers.

Yelp, like many other review sites, has deep roots in SEO — search-engine optimization. Their entire business was long predicated on Google doing their customer acquisition for them. To the company’s credit it has become a well-known brand in its own right, and now gets around 70% of its visits via its mobile app. Those visits are very much in the Amazon model I highlighted above: users are going straight to Yelp and bypassing Google directly.

That, though, isn’t great for Google! It seems a bit rich that Yelp should be free to leverage its app to avoid Google completely, and yet demand that Google continue to feature Yelp prominently in its search results, particularly on mobile, where the Answer Box has particular utility. I get that Yelp feels like Google has changed the terms of the deal from when Yelp was founded in 2004, but the reality is that the change that truly mattered was mobile.

What I do find compelling is a new video that Yelp put out yesterday; while it makes many of the same points as the one above, instead of being focused on regulators it is targeting Google itself, arguing that Google isn’t living up to its own standards by not featuring the best results, and not driving traffic back to sites that make the content Google needs (by, for example, not including prominent links to the content filling its answer boxes; Yelp isn’t asking that they go away, just that they drive traffic to 3rd parties). Google may be an aggregator, but it still needs supply, which means it needs a sustainable open web. The company should listen.

Facebook and Data Portability

Facebook, unfortunately for its suppliers, faces no such constraints: the content that is truly differentiated is made by Facebook’s users, and it is wholly owned by Facebook. Facebook is even further from the Bill Gates Line than Google is: the latter at least needs commoditized suppliers; the former can take or leave them on a whim, and does.

That is why I’ve come to realize that a popular prescription for Facebook’s dominance, data portability, put forward this week by a coalition of progressive organizations under the umbrella Freedom From Facebook, is so mistaken.3 The problem with data portability is that it goes both ways: if you can take your data out of Facebook to other applications, you can do the same thing in the other direction. The question, then, is which entity is likely to have the greater center of gravity with regards to data: Facebook, with its social network, or practically anything else?

Facebook at the center of data exchange
From The Facebook Brand

Remember the conditions that led to Facebook’s rise in the first place: the company was able to circumvent Google, go directly to users, and build a walled garden of data that the search company couldn’t touch. Partnering or interoperating with companies below the Bill Gates Line, particularly aggregators, is simply an invitation to be intermediated. To demand that governments enforce exactly that would be a mistake that only helps Facebook.4

The broader takeaway is that distinguishing between platforms and aggregators isn’t simply an academic exercise: it should affect how companies think about their competitive environment vis-à-vis the biggest companies in tech, and, just as importantly, it should weigh heavily on regulators. The Microsoft antitrust battles of the 2000s were in many respects about enforcing interoperability as a way of breaking into the Microsoft platform; today antitrust should be far more concerned about aggregators capturing everything they touch by virtue of their control of end users.

That’s the thing about the “Generals fight the last war” saying; it’s usually applied to the losing side that made mistake after mistake while the victors leveraged the new world order.

I wrote a follow-up to this article in this Daily Update.

  1. I’ve discussed why I disagree with Gary Reback’s views on monopoly and innovation in this Daily Update [↩︎]
  2. With regard to that FTC decision, yes, as the Wall Street Journal reported, some FTC staff members recommended suing Google; what is not true is that the recommendation was unanimous, or that the FTC commissioners ultimately deciding to go in another direction was unusual. In fact, other staff groups recommended against the suit, and the decision of the FTC commissioners was unanimous. Again, that is not to say it was the right decision, but that the popular conception — including what was reported in that 60 Minutes piece — is a bit off [↩︎]
  3. To be fair, I’ve made the same argument previously, but I’ve changed my mind [↩︎]
  4. The group’s demand that Facebook be forced to divest Instagram, WhatsApp, and Messenger makes much more sense in terms of this framework (with the exception of Messenger, which has always been a part of Facebook). I strongly believe that the single best antitrust remedy for aggregators is limiting acquisitions [↩︎]

The Moat Map

A subtext to last week’s article, Tech’s Two Philosophies, was the idea that there is a difference between Aggregators and Platforms; this was the key section:

It is no accident that Apple and Microsoft, the two “bicycle of the mind” companies, were founded only a year apart, and for decades had broadly similar business models: sure, Microsoft licensed software, while Apple sold software-differentiated hardware, but both were and are at their core personal computer companies and, by extension, platforms…

Google and Facebook, on the other hand, are products of the Internet, and the Internet leads not to platforms but to aggregators. While platforms need 3rd parties to make them useful and build their moat through the creation of ecosystems, aggregators attract end users by virtue of their inherent usefulness and, over time, leave suppliers no choice but to follow the aggregators’ dictates if they wish to reach end users.

The distinction wasn’t entirely satisfying; first and foremost, the power of both aggregators and platforms, however defined, ultimately rests on the size and strength of their userbase. Moreover, Google and Facebook have platform-type aspects to their business, and Apple has aggregator characteristics when it comes to its control of the App Store (that Microsoft does not is a symbol of the company’s mobile failure).

And what of companies like Amazon, or Netflix? In a follow-up Daily Update I classified the former as a platform and the latter as an aggregator, but clearly both have very different businesses — and supplier relationships — than either Google and Facebook on one side or Apple and Microsoft on the other, even as they both derive their power from owning the customer relationship.

Make no mistake, that bit about owning the customer relationship remains critical: it is the core insight of Aggregation Theory. How that ownership of the customer translates into an enduring moat, though, depends on the interaction of two distinct attributes: supplier differentiation and network effects.

The Supplier Differentiation Spectrum

Consider the six companies I mentioned above: Facebook, Google, Amazon, Netflix, Apple, and Microsoft.1

The degree of differentiation of tech company suppliers varies

These companies exist on a spectrum in terms of supplier differentiation (and, by extension, supplier power):

  • Facebook has commoditized suppliers more than anyone: an article from the New York Times is treated no differently from a BuzzFeed quiz or the latest picture of your niece or an advertisement.
  • Google gives slightly more deference to established content providers, but not much; search results are presented the same regardless of their source (although Google increasingly presents results differently depending on the type of content).
  • Amazon is a little harder to classify — that’s kind of entailed in the name The Everything Store — but generally brands are much less important than they are in a world of limited shelf space, and few people even realize they are buying from the 3rd party merchants that make up over half of Amazon’s sales.
  • Differentiation matters more for Netflix, particularly when it comes to acquiring new users; still, users are transacting with Netflix, and the longer they stick with the streaming service, the more likely they are to first open Netflix and then look for something to watch, as opposed to the other way around.
  • Apple first and foremost attracts and retains users through its integrated experience, but that experience would quickly be abandoned were there not third party apps.
  • Microsoft traditionally succeeded entirely because of its ecosystem, not just applications but also the entire universe of value-added resellers, systems integrators, etc.

The extremes make the point: Facebook could lose all of its third party content providers overnight and still be a compelling service; Microsoft without third parties would be, well, we already saw that with Windows Phone.

The Network Effect Spectrum

Another way to consider this spectrum is in terms of user-related network effects. The idea of a network effect is that an additional user increases the value of a good or service, and indeed all of these companies depend on network effects. However, the type of network effect differs considerably, as well as the extent to which the network effect directly improves a company’s core product (what I am calling an “internalized” versus “externalized” network effect):

The internalization of network effects varies by tech company

Again there is a spectrum:

  • For Facebook the network effect that matters is users — a social network’s most important feature is whether your friends and family are using it. This network — given it is the product! — is completely internal to Facebook.
  • Google has network effects of its own, but they are less about users and more about data: more people searching makes for better search results, because of the system Google has built to relentlessly harvest, analyze, and iterate on data. Like Facebook, Google’s network effect is largely internal to Google.
  • Amazon’s network effect is more subtle: there is an aspect where your shopping on Amazon improves my experience through things like rankings, reviews, and data feedback loops. Just as important, though, are two additional effects: first, the more people that shop on Amazon, the more likely suppliers are to come onto Amazon’s platform, increasing selection and price competition for everyone. In other words, Amazon, particularly as it transitions to being more of a commerce platform and less of a retailer, is a two-sided network. There is one more factor though: Amazon’s incredible service rests on hundreds of billions of dollars in investments; that fixed cost investment has to be borne by customers at some point, which means the more customers there are the less any one customer is responsible for those fixed costs (this manifests indirectly through lower prices and better service).
  • Netflix is a hybrid much like Amazon: there are certainly data network effects when it comes to what shows are made, what are cancelled, recommendations, ratings, etc. An essential part of Netflix’s competitive advantage going forward, though, rests on its differentiated ability to invest in new shows; this investment capability is driven by the company’s huge and still-growing user base, which is the biggest way that additional users benefit users already on the service.
  • Apple certainly benefits from a large user base over which to spread the significant fixed costs of its products, but on this end of the spectrum it is the two-sided network of developers and users that is most important. The more users that are on a platform, the more developers there will be, which increases the value of the platform for everyone.
  • Microsoft, befitting the point I made above about the expansiveness of its ecosystem, has the most “externalized” network effect of all: there is very little about Windows, for example, that produces a network effect (Office is another story), but the ecosystem on top of Windows produced one of the greatest network effects ever.

At this point, you may have noticed that these two spectrums run in roughly the same order: I don’t think that is a coincidence.

The Moat Map

Here are these two spectrums laid out on two orthogonal axes:

The Moat Map represents the relationship between supplier differentiation and network externalization

This relationship between the differentiation of the supplier base and the degree of externalization of the network effect forms a map of effective moats; to again take these six companies in order:

  • Facebook has completely internalized its network and commoditized its content supplier base, and has no motivation to, for example, share its advertising proceeds. Google similarly has internalized its network effects and commoditized its supplier base; however, given that its supply is from 3rd parties, the company does have more of a motivation to sustain those third parties (this helps explain, for example, why Google’s off-site advertising products have always been far superior to Facebook’s).
  • Netflix and Amazon’s network effects are partially internalized and partially externalized, and similarly, both have differentiated suppliers that remain very much subordinate to the Amazon and Netflix customer relationship.
  • Apple and Microsoft, meanwhile, have the most differentiated suppliers on their platforms, which makes sense given that both depend on largely externalized network effects. “Must-have” apps ultimately accrue to the platform’s benefit.

It is just as useful to think about what happens when companies find themselves outside of the Moat Map.

Missing Moats

Start with Apple and apps: in August 1997, Steve Jobs, having just returned to Apple, took the stage at Macworld Boston and proceeded to humble himself: first, he talked about how much Apple needed Adobe, and then he announced a settlement with Microsoft that entailed Microsoft investing in Apple and developing Office for Mac for at least five years. That was followed by Bill Gates’ grinning visage appearing via satellite over Jobs’ head:

An image from MacWorld Boston when Microsoft invested in Apple

I wrote in 2013 that I believe this experience resulted in Apple making poor strategic choices with the iPhone and iPad: the company never again wanted to have its suppliers become too powerful. The way this played out, though, is that Apple for years neglected the business model needs of developers building robust productivity apps that could have meaningfully differentiated iOS devices from Android.

To be sure, the company has been more than fine: its developer ecosystem is plenty strong enough to allow the company’s product chops to come to the fore. I continue to believe, though, that Apple’s moat could be even deeper had the company considered the above Moat Map: the network effects of a platform like iOS are mostly externalized,2 which means that highly differentiated suppliers are the best means to deepen the moat; unfortunately Apple for too long didn’t allow for suitable business models.

Some companies and models outside of the Moat Map

Another example is Uber: on the one hand, Uber’s suppliers are completely commoditized. This might seem like a good thing! The problem, though, is that Uber’s network effects are completely externalized: drivers come onto the platform to serve riders, which in turn makes the network more attractive to riders. This leaves Uber outside the Moat Map. The result is that Uber’s position is very difficult to defend; it is easier to imagine a successful company that has internalized large parts of its network (by owning its own fleet, for example), or done more to differentiate its suppliers. The company may very well succeed thanks to the power that comes from owning the customer relationship, but it will be a slog.

On the opposite side of the map are phone carriers in a post-iPhone world: carriers have strong network effects, both in terms of service as well as in the allocation of fixed costs. Their profit potential, though, was severely curtailed by the emergence of the iPhone as a highly differentiated supplier. Suddenly, for the first time, customers chose their carrier on the basis of whether or not their preferred phone worked there; today, every carrier has the iPhone, but the process of reaching that point meant the complete destruction of carrier dreams of value-added services, and a lot more competition on capital-intensive factors like coverage and price.

Direction or Context?

It’s worth noting that maps can take two forms: some give direction, and others provide context for what has already happened; I’m not entirely sure which best describes the Moat Map. In the case of Apple and apps, for example, I absolutely believe the company could have made different strategic choices had it fully appreciated the interaction between supplier differentiation and network effects.

On the other hand, one could make a very strong case that the degree of supplier differentiation possible flows from the network effect involved: perhaps it was inevitable that Facebook and Google commoditized suppliers, for example, or that Amazon and Netflix would have to simultaneously pursue differentiated suppliers even as they sought to suppress them. What is always certain, though, is that there is no one perfect strategy: as always, it depends.

Thanks to James Allworth, my co-host on the Exponent podcast, for helping me conceptualize this framework

I wrote a follow-up to this article in this Daily Update.

  1. In this article when I refer to “Amazon” I am primarily referring to the e-commerce company; Microsoft the PC company. I will cover AWS and Azure in a follow-up in the Daily Update. [↩︎]
  2. iMessage being an instructive exception [↩︎]