The DL - An inside view into Pacific Northwest Tech

Welcome to The DL, a weekly newsletter about tech, startups, and investing in the Pacific Northwest.
 
October 14 · Issue #17

This week’s issue has a deep dive on the 737 MAX and what it means for the future of AI, how companies are disrupting themselves, and the latest trend to come out of YC - dopamine fasting.

👋 Referred by a friend? Sign up here.

Boeing's 737 MAX and what it means for AI
Warning: this is a long article! Feel free to skip to the bottom!
A couple of weeks ago, the NY Times published an incredibly thorough investigation into the Boeing 737 MAX. It’s written by a pilot-turned-journalist, and he goes against prevailing opinion to argue that poorly-trained pilots were the cause of the crashes, not the plane’s systems.

Interestingly, on the same day, the New Republic also released a report with the opposite point of view - blaming Boeing and exonerating the pilots.

The 737 MAX investigation raises a lot of interesting questions about AI, so here is my two-minute summary of the situation and what it means for the future of automation:

✈️ Boeing vs. Airbus
  • Boeing’s basic philosophy is that the pilot always has “ultimate authority of control,” and pilots can override or turn off any system
  • Airbus, on the other hand, was founded 50 years after Boeing on the idea of a “robotic airplane” that required minimal piloting skills using digital flight controls and pilot-proof protections

🖥️ The MCAS
  • The Maneuvering Characteristics Augmentation System (MCAS) is the center of the 737 MAX investigation
  • The MCAS is a software fix to an aerodynamic problem; it creates synthetic forces to mimic the aerodynamic performance of earlier 737s, which allowed it to avoid designation as a new model
  • If the MCAS detected a certain set of inputs, it would trigger the system to turn the nose of the plane down
  • If the MCAS detected a false positive, it would present as a runaway trim, which is a problem that “any pilot would know how to handle”
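The trigger-and-override behavior described above can be sketched as a toy model. To be clear, this is purely illustrative and not Boeing’s actual logic — the sensor name, threshold, and function are all hypothetical — but it captures the key point: the system cannot distinguish a faulty sensor reading from a real high angle of attack, and the pilot’s override path is cutting the electric trim.

```python
def mcas_command(aoa_sensor_deg: float, electric_trim_engaged: bool,
                 aoa_threshold_deg: float = 15.0) -> str:
    """Return the trim command a simplified MCAS-like system would issue.

    A sensor reading above the threshold triggers nose-down trim whether
    the reading is genuine or faulty -- the system has no way to tell a
    false positive apart from a real one. Disengaging the electric trim
    disables the system entirely (the pilots' override path).
    """
    if not electric_trim_engaged:
        return "inactive"    # electric trim cut -> system disabled
    if aoa_sensor_deg > aoa_threshold_deg:
        return "nose_down"   # triggered, real or not
    return "neutral"

# A sensor stuck at a falsely high reading keeps commanding nose-down...
print(mcas_command(25.0, electric_trim_engaged=True))    # nose_down
# ...until the pilots disengage the electric trim, which disables it.
print(mcas_command(25.0, electric_trim_engaged=False))   # inactive
```

The asymmetry this toy model makes obvious is the one at the heart of the two investigations: the machine’s behavior is identical in the real and false-positive cases, so the safety of the whole system hinges on the human knowing to take the override path.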

👨‍✈️ Pilot training
  • As flying becomes cheaper and the industry needs more pilots, training has become productionized, and schools don’t care about “airmanship”
  • For many students, flight training is about rote memorization of the steps for each flight or simulation (at one pilot training school in Indonesia, the completion rate is 95%)
  • Runaway trim, for example, is always part of Simulation No. 3, and no one ever has issues with it because it always occurs at the same time in the same way in the simulation

🚨 The accidents and the response
  • Oct 28, 2018 - A Lion Air plane sees errors from a faulty sensor. The MCAS kicks in, and the plane pitches down. The pilot disengages the electric trim, disabling the MCAS, and everything is fine
  • Oct 29, 2018 - The same plane is flown again without repairs, and it crashes 12 minutes after take-off. The faulty sensor activates the MCAS, it continually forces the plane’s nose down, and the pilots do not disengage the electric trim
  • Mar 10, 2019 - Ethiopian Airlines 302 crashes for the same reasons – faulty sensor, MCAS engaged, and electric trim was not disengaged
  • Mar 16, 2019 - Within a week of the second crash, countries around the world ground Boeing 737 MAX flights

This is a very complex investigation, but I think the crux of the issue is automation. As we use more tools to automate and augment human work, who’s responsible if something goes wrong – the user? their manager? the trainer? the toolmaker? the regulators? all of the above?

Here are some of the questions I had reading through these articles:
What should we expect from humans? The way pilots describe the MCAS makes it sound like disengaging electric trim is as straightforward as guiding a car with lane-keeping back into a lane if it begins to drift. But who decides what to expect from users? Should humans always have ‘ultimate authority’ over machines, or should there be 'user-proof’ protections?

Should training be focused on tools or theory? As automation moves up the stack, the debate between teaching theory and teaching tools will become more important. The question of “why do I need to learn math if I always have a calculator” is going to become “why memorize names of bones and muscles when I could spend that time in a surgical simulator?”

Are software problems different from hardware problems? The MCAS was developed because the MAX did not have the same aerodynamic properties as previous 737 models, so Boeing fixed it with software. That feels weird because testing is meant to identify problems, but the processes for testing software are different from those for testing hardware (and testing “AI” is different from testing traditional software).

Where will these types of issues pop up next? These situations will happen in healthcare, transportation, finance, manufacturing, and every industry that is attempting to automate or augment human work. The way the 737 MAX case is handled will set some important precedents for how to assign responsibility, how to thoughtfully design holistic systems, and what companies should expect from their partners.

Disrupt yo self! 💥
It’s really interesting to see corporate strategy play out in this era of technology because incumbents are willing to take gigantic bets and disrupt themselves.

For example, four of the major online brokerages recently announced they would be cutting online trading commissions to $0 (to compete with Robinhood). That is huge - to put it in perspective, the brokerage industry is probably going to lose $2B in annual revenue from commissions.

Another local example is Zillow. Rich Barton rejoined the company as CEO this year to rally it around “instant buying” (and compete with Opendoor). Last week, he said the reason they are investing in instant buying is that it’s “an existential threat because if it works and we don’t do it, we get displaced as the marketplace.”

I’m sure at some point, someone at Robinhood or Opendoor has answered a defensibility question by saying, “our legacy competitors can’t offer this because it conflicts with their core business model.” But in today’s world, expect to see more and more companies stay competitive by disrupting themselves.

Other stuff Dan's talking about
🧠 The three stages of hearing about dopamine fasting - 1. ??? 2. lol 3. yeah I’d probably try this too tbh. 🤣 Here’s the link to learn more
💣 r/WallStreetBets - Speaking of Robinhood… this is my favorite subreddit. It’s dedicated to people making YOLO stock bets on Robinhood. Here’s one guy who started with $50K, made $600K on two trades, and then lost it all over the course of a week
⏲️ Seattle time-lapses - Time lapses of the Microsoft campus construction and the teardown of the Viaduct
🌊 Digital Gold - Best book on the history of crypto. Really exciting to put yourself in the shoes of the early adopters at the forefront of a new technology wave

Please hit reply! (Or subscribe or forward!)
About me: I work as an investor at Madrona Venture Group, a Seattle-based venture capital firm that has been early partners with companies like Amazon, Smartsheet, Apptio, and Redfin.

If you have thoughts, questions, or comments, hit reply!

👋 Referred by a friend? Sign up here.
Powered by Revue
Seattle, WA