🚗 Lyft is heading to Interrupt. Over the last year, Lyft built 8 AI agents capable of fully resolving 35% of all customer issues. Nick Ung, Head of Data Science, Safety and Customer Care, will take us through the evals used internally, how they scale evals with LangSmith, and the lessons learned along the way. Catch Nick’s talk along with all the others at Interrupt, the Agent Conference by LangChain, on May 13-14 in San Francisco. Get tickets here 👉 interrupt.langchain.com
About us
At LangChain, our mission is to make intelligent agents ubiquitous. We build the foundation for agent engineering in the real world, helping developers move from prototypes to production-ready AI agents that teams can rely on. What began as widely adopted open-source tools has grown into a platform for building, evaluating, deploying, and operating agents at scale. LangChain provides the agent engineering platform and open-source frameworks developers need to ship reliable agents fast. LangSmith offers observability, evaluation, and deployment for rapid iteration. Our open-source frameworks, LangGraph, LangChain, and Deep Agents, help developers build agents with speed and granular control. LangSmith is trusted by leading AI teams at Zip, Vanta, Klarna, Workday, LinkedIn, Cloudflare, and more.
- Website
- langchain.com
- Industry
- Technology, Information and Internet
- Company size
- 51-200 employees
- Type
- Privately Held
Updates
-
LangChain just received a 2026 Google Cloud Partner of the Year Award in the Marketplace: Agent Platform category. Over the last two years, we have built a close partnership with Google Cloud and expanded LangSmith’s capabilities and its reach within the ecosystem. Today, enterprise teams at companies including LinkedIn, Cloudflare, GitLab, L'Oréal, Workday, and Klarna are running LangSmith on Google Cloud infrastructure. We are proud that our work to make LangSmith a natural fit for Google Cloud environments has been recognized, and we look forward to continuing our partnership and enabling joint customer success. At Google Cloud Next? Stop by Booth #5006 this week to see how LangSmith can help your company observe, evaluate, and deploy agents. #GoogleCloudPartnerAwards
-
LangChain reposted this
Interrupt is right around the corner, and tickets are selling fast! https://lnkd.in/gW-96wSp Our second agent conference is your chance to hear from the AI leaders deploying in production (Cisco, Clay, Toyota Motor Corporation, Lyft, Chime, Box, + more), see what’s next on our roadmap, attend workshops run by the LangChain team, and connect with the community. The last Interrupt sold out, and this one will too. Come see what’s next for AI agents!
-
LangChain reposted this
Different voices. Same answer: open models. NVIDIA Founder and CEO Jensen Huang sat down with the leaders from Mistral AI, Black Forest Labs, Cursor, LangChain, Perplexity, Reflection AI, Thinking Machines Lab, Ai2 & AMP PBC to discuss the rapid rise of open frontier models. Get the top takeaways from the frontier of AI: https://nvda.ws/4bNmQ7Z
-
Google Cloud Next starts this week. If you’re attending, come find us at Booth #5006. We’ll be running demos and having technical conversations around building, evaluating, and scaling AI agents. Check out everything we’re up to at Next: https://lnkd.in/gGs5aUrp
-
Credit Genie uses LangSmith Insights Agent to understand user behavior and expand the capabilities of AskGenie, their AI financial assistant. https://lnkd.in/eQyYFvUf AskGenie was built on LangGraph and helps users gain deeper insight into their finances. With Insights Agent, the team can detect gaps in functionality from the corresponding traces and speed up product development. David Li, Jeffrey Ngai, Gregorio Lozano Palacio, and Charles Yuan from the Credit Genie team shared the most interesting findings Insights Agent has surfaced and how it has influenced new features and functionality.
-
LangChain reposted this
Developing an agent is a harness problem. Deploying an agent is a runtime problem. The harness is everything you build around the model: prompts, tools, skills, the reasoning loop. The runtime is everything underneath: durable execution, memory, human-in-the-loop, multi-tenancy support, and observability. We wrote a guide that walks through the production requirements that surface when deploying long-running agents, the runtime primitives that meet them, and how `deepagents deploy` packages those capabilities into something you can ship. https://lnkd.in/eNtSgqZr
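The harness/runtime split above can be illustrated with a toy durable-execution loop. This is a conceptual sketch only, not the deepagents runtime or any real LangChain API; every name in it (`run_agent`, `CHECKPOINT`, the `"NEEDS_HUMAN"` sentinel) is hypothetical:

```python
import json
from pathlib import Path

# Toy runtime sketch: checkpoint agent state after every step so a
# crashed or interrupted run resumes where it left off instead of
# restarting. All names here are illustrative, not the deepagents API.
CHECKPOINT = Path("agent_checkpoint.json")

def run_agent(steps, state=None):
    """Run `steps` (a list of functions on a shared dict) durably:
    persist state after each step and skip steps already completed."""
    if state is None:
        if CHECKPOINT.exists():
            state = json.loads(CHECKPOINT.read_text())
        else:
            state = {"done": 0, "data": {}}
    for i, step in enumerate(steps):
        if i < state["done"]:
            continue  # completed in a previous run; skip on resume
        result = step(state["data"])
        if result == "NEEDS_HUMAN":
            # Human-in-the-loop: pause here; a later invocation resumes
            # from this exact step once a human has updated the state.
            return state
        state["data"].update(result)
        state["done"] = i + 1
        CHECKPOINT.write_text(json.dumps(state))  # durable checkpoint
    return state
```

A production runtime would persist checkpoints to a database keyed by run or thread ID, and multi-tenancy and observability would sit around this same loop; the point is only that resumability comes from the runtime, not the harness.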
-
The latest from the LangSmith Signal: We looked at API call volume and developer adoption trends across LLM providers using LangChain open source telemetry since December 2025 ⤵️ Anthropic had the speediest ascent, with a 73% increase in users and 39% increase in overall model run/API call share. While OpenAI growth was flat, they continued to dominate in the volume category with 80% of LLM traces. Google Gemini has seen a steady climb since the launch of Gemini 3, with an 18% increase in users and 43% increase in overall share. When it comes to workloads, token economics are a major factor. Anthropic’s average number of input tokens per trace is 3 times OpenAI’s and 2 times Google’s, making it the go-to choice for heavy lifting. Meanwhile, OpenAI has a significant lead for smaller, high-frequency tasks. 📊We analyzed this information from LangSmith Observability data across billions of agent runs, and we’re just getting started. Stay tuned for more LangSmith Signals as we share how devs are building agents, by the numbers.
-
Things don’t always go to plan when bringing agents into production. `deepagents deploy` is purpose-built around the challenges teams face when deploying. https://lnkd.in/efc8NzqR ✅ Every infrastructure consideration is mapped to a purpose-built runtime capability. No need to rebuild components from scratch. ✅ Open models, open harness, open memory. Your team stays in control. Here’s our conceptual guide by Sydney Runkle + Vivek Trivedy on the biggest issues faced in production, the runtime solutions we built around them, and how you can go from a TOML file to a LangSmith Deployment in minutes.
-
LangChain reposted this
It’s a LangSmith world, and we’re just living in it!