Microsoft just wrapped its annual Build conference for 2025. The focus this year: AI agents. We got a look at the open agentic web and new Copilot updates, WSL is now open-source, and the company also confirmed it's expanding its Azure AI Foundry models list to include Elon Musk's Grok 3 and Grok 3 mini from xAI. Plus, we hear from OpenAI's Sam Altman and Nvidia's Jensen Huang. Here's everything you missed.
00:00Good morning and welcome to Build.
00:03We have a bunch of new updates we are rolling out at Build,
00:06starting with Visual Studio.
00:08It is the most powerful IDE for .NET and C++,
00:12and we're making it even better.
00:14.NET 10 support,
00:16a live preview at design time,
00:18improvements to Git tooling,
00:20a new debugger for cross-platform apps,
00:23and much, much more.
00:25And we're moving, by the way,
00:27to a new cadence for stable releases as well.
00:30And when it comes to VS Code,
00:32just a couple of weeks ago,
00:34we shipped our hundredth release in the open.
00:39It included improved multi-window support
00:42and made it easier to view staged changes directly
00:46from within the editor.
00:48And GitHub continues to be the home for developers.
00:52GitHub Enterprise has tremendous momentum
00:55in the enterprise.
00:58And we're doubling down for developers
01:00building any applications.
01:02Trust, security, compliance, auditability,
01:07data residency are even more critical today.
01:11Now, this extends from the tools you use
01:14to the infrastructure you deploy on
01:17to reach the users and the markets you want.
01:21You know, talking about trust,
01:24open source is at the core of GitHub.
01:29And we're taking this next big step.
01:33As GitHub Copilot has evolved inside VS Code,
01:37AI has become so central to how we code.
01:41And that's why we're open sourcing Copilot in VS Code.
01:45We're really excited about this.
01:47Starting today, we will integrate these AI-powered capabilities
01:54directly into the core of VS Code,
01:57bringing them into the same open source repo
02:00that powers the world's most loved dev tool.
02:05And the next thing we're introducing
02:07is an autonomous agent for Site Reliability Engineering,
02:11or SRE.
02:12And the next big step forward
02:14is a full coding agent built right into GitHub,
02:20taking Copilot from being a pair programmer
02:24to a peer programmer.
02:25You can assign issues to Copilot, bug fixes,
02:30new features, code maintenance,
02:33and it will complete these tasks autonomously.
02:36And today, I'm super excited that it's now available to all of you.
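The assignment flow described above can be sketched against GitHub's real add-assignees REST endpoint. This is a conceptual sketch only: the request is built but never sent, the repo and owner are made up, and the assignee login "copilot" is an assumption (in practice the coding agent is typically assigned from the GitHub UI).

```python
# Sketch: constructing (not sending) the request that would assign an
# issue to an assignee via GitHub's add-assignees endpoint.
# "contoso"/"community-site"/"copilot" are illustrative placeholders.
import json

def build_assign_request(owner: str, repo: str,
                         issue_number: int, assignee: str) -> dict:
    """Build the POST payload for /repos/{owner}/{repo}/issues/{n}/assignees."""
    return {
        "method": "POST",
        "url": f"https://api.github.com/repos/{owner}/{repo}"
               f"/issues/{issue_number}/assignees",
        "headers": {"Accept": "application/vnd.github+json"},
        "body": json.dumps({"assignees": [assignee]}),
    }

req = build_assign_request("contoso", "community-site", 42, "copilot")
print(req["url"])
```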
02:42You know, here I am with all the bugs that I have
02:46or issues that I have to deal with in GitHub issues.
02:50The first one is adding a filter for user group size
02:56on the community page.
02:57Let's go take a look at this issue.
02:59It's nice.
03:00They say, like, I've got to go put some new filter up here.
03:04It also shows me where.
03:06Let's do the thing that I can do,
03:08which is assign it to my new buddy, Copilot.
03:11So I'm going to assign it.
03:13And there you go.
03:16Let's go and see.
03:18Let me scroll down.
03:20Ah! It's picked it up.
03:24It sees me.
03:25It creates a PR.
03:27And, you know, you see that nice eye emoji.
03:31It sort of knows that I'm here,
03:32and it's sort of going on to work.
03:34And we'll come back and check it out later.
03:36Today, we're introducing a new class of enterprise-grade agents
03:40you can build using models fine-tuned
03:44on your company's data, workflows, and style.
03:48We call it Copilot Tuning.
03:51This is about really not just using Copilot,
03:54but it's about tuning Copilot for every customer,
03:58every enterprise, every firm.
04:00You know, Copilot can now learn your company's unique tone
04:04and language, and soon it'll even go further understanding
04:08all of the company's specific expertise and knowledge.
04:11All you need to do is seed the training environment
04:15with a small set of references and kick off a training run.
03:18The customized model inherits the permissions
03:22of all the source content.
04:24And once integrated into the agent,
04:27it can deploy to authorized users, right?
04:30So you can sort of go to groups that you've set up
04:33and distribute it across Copilot.
04:35Think of Foundry like a production line for intelligence.
04:40You know, we already support 1,900 models,
04:44whether they're response models, reasoning models,
04:47task-specific, multi-modal, you name it.
04:50They're all there in Foundry.
04:52But, you know, still, picking a model can be a bit of a chore,
04:56and you need to be able to route your queries
04:58to the right one fast.
05:00And so we are making that easier, too.
05:02Our new model router will automatically choose
05:05the best OpenAI model for the job.
05:07No more sort of those, you know, manual model selections.
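The model-router idea above can be illustrated with a toy heuristic. This is not Azure's actual router, which is a hosted service; the model names and the complexity signals below are made up for the sketch.

```python
# Toy illustration of query routing: send "hard" queries to a heavier
# reasoning model, everything else to a fast model. Both model names
# are hypothetical.
def route(query: str) -> str:
    """Pick a model tier from rough query-complexity signals."""
    reasoning_markers = ("prove", "step by step", "debug", "why")
    if any(m in query.lower() for m in reasoning_markers) or len(query) > 400:
        return "reasoning-model"  # hypothetical heavyweight model
    return "fast-model"           # hypothetical lightweight model

print(route("What's the capital of France?"))         # fast-model
print(route("Walk through this proof step by step"))  # reasoning-model
```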
05:11That's why today we are thrilled to announce
05:15Grok from xAI is coming to Azure.
05:22So, basically, the focus of Grok 3.5 is
05:26sort of reasoning from first principles of physics
05:30and applying physics tools across all lines of reasoning.
05:34And to aspire to truth with minimal error.
05:39Like, there's always going to be some mistakes that are made,
05:42but we aim to get to truth with acknowledged error,
05:48but minimize that error over time.
05:50And I think that's actually extremely important for AI safety.
05:55So I've thought a lot for a long time about AI safety,
05:58and my ultimate conclusion is the old maxim that honesty is the best policy.
06:03Yeah.
06:04It really is for safety.
06:06Thank you so much, Elon, for briefly joining us today.
06:09And we're really excited about working with you
06:11and getting this into the developers' hands.
06:13Thank you very much.
06:14And I can't emphasize enough that we're looking for feedback
06:17from you, the developer audience.
06:19Tell us what you want, and we'll make it happen.
06:23I was at LlamaCon recently with Mark, and you know, as he likes to say,
06:26he's bringing the full Llama herd to Azure.
06:31We're excited about all the advances that they're making in open source,
06:34Black Forest Labs, and many more, right?
06:37So all of them.
06:38And we also have expanded our partnership with Hugging Face.
06:41So you'll have access to over 11,000 frontier
06:45and open source models in Foundry.
06:47But, you know, models are just part of the equation.
06:50You really need, like, a database or a knowledge engine,
06:55a real query engine that's custom-built for agents.
07:00And the Foundry agent service lets you build declarative agents,
07:05in fact, just with a few lines of code just in the portal.
07:09For complex workflows, it supports multi-agent orchestration.
07:13And I'm excited to share that now the agent service
07:16is generally available.
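"Declarative" here means the agent is data rather than code: a definition that the service materializes. A minimal sketch of what such a definition might look like follows; every field name, model id, and connection name below is an illustrative assumption, not the Foundry agent service's actual schema.

```python
# Hypothetical declarative agent definition. The structure (name, model,
# instructions, tools) is an assumed shape for illustration only.
agent_definition = {
    "name": "travel-agent",
    "model": "gpt-4o",  # assumed model id
    "instructions": "Help users plan trips; ground answers in connected data.",
    "tools": [
        {"type": "knowledge", "source": "tripadvisor-connection"},  # assumed
        {"type": "openapi", "spec_url": "https://example.com/flights.json"},
    ],
}

print(agent_definition["name"])
```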
07:18With Entra ID, agents now get their own identity,
07:24permissions, policies, and access controls.
07:26The agents you build in Foundry and Copilot Studio show up automatically
07:31in an agent directory in Entra.
07:33This is Vibe Travel, a chat-based app that uses Azure AI Foundry
07:38to provide an AI travel agent to help people plan their trips.
07:42I'm planning a trip to New Zealand in about eight months,
07:45so I asked Vibe for places to ski.
07:48Hmm, this math isn't mathing.
07:51I'm pretty sure eight months from now is January 2026.
07:55We definitely need to make our AI agents smarter
07:58and give more grounded responses.
08:01We can do this by giving it files with reference data
08:04or connections to other services like Microsoft Fabric or TripAdvisor.
08:09In addition to giving it access to knowledge,
08:12we also give it access to our flight reservation API.
08:15Now, let's go back to our app and try our query again.
08:19In about eight months, it's January 2026,
08:23which is the summer season.
08:25No skiing possible.
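The grounding fix in the demo is ultimately date arithmetic: eight months after a May 2025 keynote lands in January 2026, which is summer in New Zealand. A minimal version of that check (the keynote date below is assumed; only year and month matter):

```python
# Compute "eight months from now" and the corresponding NZ season.
import datetime

def add_months(d: datetime.date, months: int) -> datetime.date:
    """Advance a date by whole months, clamping to the 1st of the month."""
    total = d.month - 1 + months
    return d.replace(year=d.year + total // 12, month=total % 12 + 1, day=1)

def nz_season(month: int) -> str:
    # Southern-hemisphere seasons by meteorological convention
    return {12: "summer", 1: "summer", 2: "summer",
            3: "autumn", 4: "autumn", 5: "autumn",
            6: "winter", 7: "winter", 8: "winter"}.get(month, "spring")

trip = add_months(datetime.date(2025, 5, 19), 8)  # assumed keynote date
print(trip, nz_season(trip.month))  # 2026-01-01 summer
```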
08:27Now, when I click this reset button in the app,
08:31it immediately clears our chat session without a warning.
08:34That's not ideal, and I can ask GitHub Copilot
08:38to fetch the details for issue number one
08:41and help me implement the changes.
08:44I also have this beautiful design.
08:47It's pretty sweet, wouldn't you say?
08:50Let me add this to the context window and send the prompt.
08:54You can see the GitHub MCP server fetch the details,
08:57and with Agent Mode's new vision capabilities,
09:00it means that Copilot can even understand the sketches of what I want.
09:05Copilot Agent Mode was able to examine the proposed changes,
09:08look for relevant files to change, make the changes as appropriate,
09:13and it even stuck to the styles and coding standards that I wanted it to.
09:17When I go back to the app,
09:19it was able to implement the UI changes that my boss wanted.
09:23Let me see if it's running.
09:28Let's hit keep.
09:30As you can see, folks, this is live.
09:33I don't have time to debug,
09:39but once I click this issue, I'm pretty sure it's implemented.
09:43Copilot was able to add the modal in our app,
09:46and that's two issues that we worked on in a handful of minutes.
09:50Thank you, GitHub Copilot and Azure AI Foundry.
09:54Let's take a look at the issue that he assigned earlier.
09:57Do you remember it?
09:58To add that group size filter to the site,
10:00and Copilot was able to open up a pull request
10:03to start implementing these changes.
10:05So let's take a look at the deployment.
10:07Fingers crossed.
10:09And you can see the group size filter was added.
10:16We're excited to announce the Windows AI Foundry.
10:20And now we're extending this platform to support the full dev life cycle,
10:25right?
10:26Not just on Copilot+ PCs, but across CPUs, GPUs, NPUs,
10:30and in the cloud, right?
10:32So you can build your application and have them run across all of that silicon.
10:37We're announcing native support for MCP in Windows.
10:44Windows will now include several built-in MCP servers,
10:48like file system, settings, app actions, as well as windowing.
10:52We first announced Bash on Ubuntu on Windows nearly 10 years ago.
10:59It subsequently became what we all know today as WSL.
11:05Today we are making WSL fully open source.
11:09The super exciting thing that we really want to talk about here at this build
11:16is our real serious commitment to MCP.
11:21We can get into all sorts of arguments,
11:24and I know you all are perfectly capable of this,
11:27as am I, as engineers, like where you've got really sharp opinions
11:31about pieces of technology and like, you know,
11:34this thing's a little bit better than that thing.
11:36But what's better than all of that is like just getting something standard
11:40that we can all use and build on top of,
11:42and like we hope that thing is MCP.
11:44So we're announcing today, and you all should go check out the code
11:47in the GitHub repo, NLWeb.
11:49And the idea behind NLWeb is it is a way for anyone who has a website
11:55or an API already to very easily make their website or their API
12:02an agentic application.
12:04And because every NLWeb endpoint is by default an MCP server,
12:09it means that those things that people are offering up via NLWeb
12:15will be accessible to any agent that speaks MCP.
12:18So you really can think about it a little bit like HTML for the agentic web.
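Since MCP is JSON-RPC 2.0 over a transport, the "every NLWeb endpoint is an MCP server" idea can be sketched as a tiny message handler. The tool name and schema below are illustrative, not taken from a real NLWeb site, and a real server would also implement initialization and tools/call.

```python
# Minimal sketch of an MCP-style JSON-RPC handler answering "tools/list".
# The "ask" tool is a hypothetical natural-language query tool.
import json

TOOLS = [{
    "name": "ask",
    "description": "Natural-language query against this site's content",
    "inputSchema": {"type": "object",
                    "properties": {"question": {"type": "string"}},
                    "required": ["question"]},
}]

def handle(message: str) -> str:
    """Dispatch one JSON-RPC request and return the JSON response."""
    req = json.loads(message)
    if req.get("method") == "tools/list":
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "result": {"tools": TOOLS}})
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                       "error": {"code": -32601, "message": "Method not found"}})

resp = json.loads(handle('{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'))
print(resp["result"]["tools"][0]["name"])  # ask
```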
12:23We're integrating Cosmos DB directly into Foundry.
12:27So that means any agent can store and retrieve things like conversational history,
12:33and soon they'll be able to also use Cosmos for all their RAG application needs.
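The conversation-history pattern mentioned above can be modeled in a few lines, with a plain dict standing in for a Cosmos DB container (session id as the partition key, one item per message). A real agent would use the azure-cosmos SDK; this sketch only shows the data shape, and the field names are assumptions.

```python
# In-memory stand-in for a Cosmos container partitioned by session id.
from collections import defaultdict

store: dict[str, list[dict]] = defaultdict(list)  # partition key -> items

def append_message(session_id: str, role: str, content: str) -> None:
    """Append one chat turn as an item keyed to its session partition."""
    store[session_id].append({
        "id": f"{session_id}-{len(store[session_id])}",
        "sessionId": session_id,  # assumed partition-key field name
        "role": role,
        "content": content,
    })

append_message("s1", "user", "Where should I ski in New Zealand?")
append_message("s1", "assistant", "In January it's summer there, so no skiing.")
print(len(store["s1"]))  # 2
```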
12:41Last fall, we put SQL into Fabric,
12:44and today we're taking the next big step.
12:47We're bringing Cosmos DB to Fabric too, right?
12:50Because AI apps need more than just structured data.
12:54They need semi-structured data, whether it's text, images, audio.
12:58And with Cosmos and Fabric and your data instantly available,
13:02alongside SQL, you can now unify your entire data estate
13:07and make it ready for AI.
13:09Now, while our cloud offers this comprehensive coverage,
13:13there are still many critical use cases that need extreme low latency
13:17and explicit control where and how your apps and data are stored.
13:22And that means the ability to run things disconnected,
13:25and that's why we also offer Azure Local.
13:30But there's one more domain that I want to talk about to close out,
13:34which is science.
13:36We'll have real breakthroughs in the scientific process itself,
13:40which will really accelerate our ability to create, you know,
13:45a new material, a new compound, a new molecule.
13:48That's our ambition with Microsoft Discovery, which we are announcing today.
13:54Discovery is built on Foundry, bringing advanced agents highly specialized in R&D,
14:01not just for reasoning, but for conducting research itself.
14:05These are the set of candidates that Microsoft Discovery has identified for PFAS-free immersion coolants.
14:12But I know what you're wondering.
14:14Did we actually make a Discovery?
14:16Well, this is not just a demo.
14:19We really did this.
14:21We took one of the most promising candidates and synthesized it.
14:25They didn't let me bring a new material unknown to humans onto this stage,
14:30but I've got this video from the lab.
14:33So we can see there my coolant, and we dropped a standard PC in it,
14:37and it's running Forza Motorsport.
14:40And it is keeping the temperature stable with no fans.
14:43It's literally very cool.
14:46So that was, you know, a quick, comprehensive, whatever you want to call it,
14:51walk through the full stack and how we're creating new opportunity for you across the agentic web.
14:59Thank you all very, very much.
15:01Enjoy the rest of Build.