Word count: ~5,200 characters; estimated reading time: 15 minutes
To scale agentic AI, Notion tore down its tech stack and started fresh
Notion, a popular productivity platform, has completely rebuilt its technology stack to support agentic AI at enterprise scale. Traditional AI workflows are based on explicit, step-by-step instructions, whereas Notion’s new architecture centers on advanced reasoning models that can autonomously select, orchestrate, and execute tools across connected environments. This shift makes the agents more independent, allowing them to make multiple decisions within a single agentic workflow.
Notion’s head of AI modeling, Sarah Sachs, explained that the company has replaced rigid prompt-based flows with a unified orchestration model supported by modular sub-agents. These sub-agents are capable of searching Notion and the web, querying and adding to databases, and editing content. Each agent uses tools contextually, deciding whether to search within Notion or other platforms like Slack.
The company’s goal is to ensure that anything users can do, their Notion agents can also do. Sachs emphasized the importance of understanding latency in different contexts, noting that users are willing to wait longer for more exhaustive reasoning tasks. Notion’s philosophy of “better, faster, cheaper” drives a continuous iteration cycle that balances latency and accuracy through fine-tuned vector embeddings and elastic search optimization.
Notion is its own biggest user, with an active feedback loop from its own employees and trusted design partners. This internal testing lets the company measure improvements and ensure models don’t regress. Sachs advises other tech leaders to be willing to make hard decisions and sit at the frontier of development to build the best product for their customers.
Here’s what Jony Ive and Sam Altman revealed about their secretive AI hardware project at OpenAI’s Dev Day
At OpenAI’s Dev Day, CEO Sam Altman and legendary designer Jony Ive discussed their secretive collaboration on a new family of AI-powered devices. The partnership, solidified by OpenAI’s $6.5 billion acquisition of Ive’s hardware startup Io, is focused on fixing humanity’s broken relationship with technology. Ive emphasized the need to rethink devices and create something entirely new, yet natural, to improve emotional well-being.
The vision is to create an AI companion that is “accessible but not intrusive,” avoiding the pitfalls of a “weird AI girlfriend.” The project involves a family of devices, likely departing from screen-centric design. The goal is to create tools that make users happy, fulfilled, and less anxious, rather than just being productive.
Reports suggest a palm-sized device without a screen that relies on cameras and microphones to perceive its environment. The challenge is to focus amidst the rapid pace of AI’s progress, with Ive admitting that the sheer number of compelling ideas makes it difficult to narrow down the choices. The central narrative is clear: Jony Ive is betting on a screenless future powered by AI to make us all a little less anxious and a little more human.
The first fully autonomous AI scientist is here: Westlake University's latest work beats the human SOTA baseline by 183.7%
Westlake University's latest research presents the first fully autonomous AI scientist, whose performance exceeds the human-scientist SOTA (state-of-the-art) baseline by 183.7%. The AI scientist completed in two weeks work that would take human scientists three years. The result marks a major breakthrough for AI in scientific research and demonstrates AI's enormous potential for data processing and analysis.
Samsung AI researcher's new, open reasoning model TRM outperforms models 10,000X larger — on specific problems
Alexia Jolicoeur-Martineau, a senior AI researcher at Samsung, introduced the Tiny Recursion Model (TRM) — a neural network with just 7 million parameters that competes with or surpasses large language models 10,000 times larger on specific reasoning benchmarks. The model is designed to handle structured, visual, grid-based problems like Sudoku, mazes, and puzzles. TRM uses a single two-layer model that recursively refines its predictions, achieving high accuracy on tasks such as Sudoku and Maze-Hard puzzles.
The simplicity of TRM’s architecture and its recursive reasoning process make it more efficient and accessible. Jolicoeur-Martineau emphasizes that recursive reasoning, not scale, may be the key to handling abstract and combinatorial reasoning problems. The model is now available on GitHub under an MIT License, enabling researchers and companies to modify and deploy it for various purposes.
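The recursive-refinement loop described above can be sketched roughly as follows. This is a minimal illustrative sketch only: the dimensions, loop counts, and untrained random weights are assumptions for demonstration, not the published TRM configuration, and a toy MLP stands in for the real 7M-parameter network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions and loop counts -- illustrative assumptions, not TRM's real config.
D = 16       # embedding size
N_SUP = 3    # outer improvement steps
N_REC = 6    # recursive latent updates per improvement step

# A tiny two-layer MLP with random (untrained) weights stands in for the shared network.
W1 = rng.normal(0, 0.1, (3 * D, D))
W2 = rng.normal(0, 0.1, (D, D))

def net(x, y, z):
    """One pass of the shared tiny network over (input, current answer, latent state)."""
    h = np.tanh(np.concatenate([x, y, z]) @ W1)
    return np.tanh(h @ W2)

def trm_step(x, y, z):
    # Inner loop: recursively refine the latent reasoning state z.
    for _ in range(N_REC):
        z = net(x, y, z)
    # Then refine the current answer y using the updated latent.
    y = net(x, y, z)
    return y, z

x = rng.normal(size=D)   # encoded puzzle input (e.g., a Sudoku grid embedding)
y = np.zeros(D)          # initial answer embedding
z = np.zeros(D)          # initial latent state
for _ in range(N_SUP):   # outer loop: repeated answer improvement
    y, z = trm_step(x, y, z)
```

The key structural idea is that one small network is reused many times, so depth of reasoning comes from recursion rather than parameter count.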
Dell Doubles Long-Term Revenue Growth Outlook on Strong AI Demand
Dell has doubled its long-term revenue growth outlook, attributing the increase to strong demand for AI solutions. The company now expects annual revenue from fiscal year 2027 to 2030 to expand between 7% and 9%. This growth forecast reflects the increasing importance of AI in enterprise computing and the demand for AI infrastructure, indicating a strong market trend for AI-related hardware and services.
Livestream preview: 光轮智能 × NVIDIA present a key Sim2Real breakthrough
A joint livestream from 光轮智能 and NVIDIA will showcase a key breakthrough in Sim2Real technology. The stream takes place tonight at 24:00 and is expected to cover how simulation-to-reality (Sim2Real) techniques can accelerate the training and optimization of AI models. Advances here could speed development in fields such as autonomous driving and robotics.
Globalization Not An Option But A Must for Today's Entrepreneurs, Say Industry Veterans at NEX-T Summit
Industry veterans at the NEX-T Summit highlighted the necessity of globalization for today’s entrepreneurs. Tristan Dai, who operates businesses across China and the U.S., emphasized the impact of geopolitics and AI regulation on industries like virtual production and humanoid robotics. The summit discussed the challenges and opportunities in an increasingly interconnected global market, stressing the importance of international collaboration and compliance with diverse regulations.
AI21’s Jamba reasoning 3B redefines what 'small' means in LLMs — 250K context on a laptop
AI21 Labs has introduced Jamba Reasoning 3B, a small, open-source model that can perform extended reasoning, generate code, and ground its answers in source material. The model handles a context of more than 250,000 tokens and can run inference on edge devices such as laptops and mobile phones. Jamba Reasoning 3B combines the Mamba architecture with Transformers, allowing it to run a 250K-token window on device while achieving 2-4x faster inference speeds.
The company tested the model on a standard MacBook Pro, finding that it can process 35 tokens per second. Ori Goshen, co-CEO of AI21, stated that the company sees more enterprise use cases for small models, mainly because moving most inference to devices frees up data centers. This shift in computing resources can significantly reduce costs and improve efficiency in AI applications.
Trilogy Metals Stock Pops 211% after Trump Admin. Confirms Stake amid Investment Spree
The Trump administration agreed to invest $35.6 million in Trilogy Metals, giving the Pentagon a 10% stake in the company. The investment is part of a broader strategy to secure critical materials needed for advanced technologies, including AI hardware. The confirmation of the stake has led to a significant surge in Trilogy Metals' stock price, reflecting the market's optimism about the company's future prospects.
Summary
Today's AI news centers on how companies are advancing AI capabilities through technical innovation and architectural redesign. Notion rebuilt its technology stack from the ground up to support agentic AI, a notable step for enterprise-grade applications. The collaboration between Jony Ive and Sam Altman points to AI's potential to improve our relationship with technology. Meanwhile, Samsung and AI21 Labs showed how small models can excel at specific tasks, underscoring the importance of recursive reasoning. The Dell and Trilogy Metals stories reflect market demand for, and investment in, AI. Together, these items illustrate AI's development and application prospects across many domains.
Author: Qwen/Qwen2.5-32B-Instruct
Sources: 钛媒体, 量子位, VentureBeat
Editor: 小康