A new research initiative, Flapping Airplanes, launched on Wednesday with $180 million in seed funding from prominent investors including Google Ventures, Sequoia, and Index. The ambitious project aims to explore methods for training large AI models with far less data than today's systems require.
Flapping Airplanes stands out in the tech landscape, as Sequoia partner David Cahn notes: it is among the first labs to break from the industry's prevalent data-scaling model. The conventional scaling approach pours massive resources into enhancing current large language models (LLMs) on the premise that artificial general intelligence (AGI) could be just a few years away. Flapping Airplanes instead advocates investing in long-term research initiatives that could take five to ten years to yield results.
Cahn elaborates that a compute-centric strategy tends to prioritize immediate wins, favoring short-term achievements over more substantial but elusive advances. A research-driven approach, by contrast, embraces a patient, exploratory mindset: it takes calculated risks on projects with lower individual odds of success that collectively broaden the horizons of what's possible in AI.
This divergence in strategy highlights the importance of diverse approaches in AI development, making Flapping Airplanes an intriguing player in the quest to reshape the field.
