Welcome to Microsoft Ignite 2023! The past year has been one of true transformation. Companies are seeing real benefits today and are eager to explore what's next, including how they can do more with their data investments, build intelligent applications, and discover what AI can do for their business.
We recently commissioned a study through IDC and uncovered insights into how AI is driving business outcomes and economic impact for organizations worldwide. More than 2,000 business leaders surveyed confirmed they're already using AI for employee experiences, customer engagement, and to bend the curve on innovation.
The study illustrates the business value of AI, but it really comes to life through the stories of how our customers and partners are innovating today. Customers like Heineken, Thread, Moveworks, the National Basketball Association (NBA), and many more are putting AI technologies to work for their businesses and their own customers and employees.
From modern data solutions uniquely suited to the era of AI to beloved developer tools and application services, we're building Microsoft Azure as the AI supercomputer for customers, no matter their starting point.
This week at Ignite, the pace of innovation isn't slowing down. We'll share more stories about how organizations are turning to new solutions to drive their business forward. We're also announcing many new capabilities and updates that make it easier than ever to use your favorite tools, maximize existing investments, save time, and innovate on Azure as a trusted platform.
Modern data solutions to power AI transformation
Every intelligent app starts with data, and your AI is only as good as your data, so a modern data and analytics platform is increasingly important. The integration of data and AI services and solutions can be a unique competitive advantage because every organization's data is unique.
Last year, we launched the Microsoft Intelligent Data Platform as an integrated platform to bring together operational databases, analytics, and governance, and to enable you to integrate all your data assets seamlessly in a way that works for your business.
At Ignite this week, we're announcing the general availability of Microsoft Fabric, our most integrated data and AI solution yet, within the Intelligent Data Platform. Microsoft Fabric can empower you in ways that weren't possible before with a unified data platform. This means you can bring AI directly to your data, no matter where it lives. This helps foster an AI-centered culture to scale the power of your data value creation so you can spend more time innovating and less time integrating.
EDP is a global energy company that aims to transform the world through renewable energy sources. They're using Microsoft Fabric and OneLake to simplify data access across data storage, processing, visualization, and AI workflows. This enables them to fully embrace a data-driven culture where they have access to high-value insights and decisions are made with a comprehensive view of the data environment.
We're also announcing Fabric as an open and extensible platform. We will showcase integrations with many of our partners like LSEG, Esri, Informatica, Teradata, and SAS, who have been demonstrating the possibilities of bringing their product experiences as workloads into Fabric, widening their reach and breadth of capabilities.
Every organization is eager to save time and money as they transform. We're announcing several new features and updates for Azure SQL that make Azure the best and most cost-effective place for your data. Updates include lower pricing for Azure SQL Database Hyperscale compute, an Azure SQL Managed Instance free trial offer, and a wave of other new features.
Lufthansa Technik AG has been running Azure SQL to support its application platform and data estate, leveraging fully managed capabilities to empower teams across functions. They're joining us on stage during a breakout session on cloud-scale databases, so you can learn more about their experience directly.
Easily build, scale, and deploy multimodal generative AI experiences responsibly with Azure
The AI opportunity for businesses is centered on the incredible power of generative AI. We're inspired by customers who are now nimbly infusing content generation capabilities to transform all kinds of apps into intuitive, contextual experiences that impress and captivate their own customers and employees.
Siemens Digital Industries is one company using Azure AI to enhance its manufacturing processes by enabling seamless communication on the shop floor. Their latest solution helps field engineers report issues in their native language, promoting inclusivity, efficient problem resolution, and faster response times.
Today organizations need more comprehensive, unified tools to build for this next wave of generative AI-based applications. This is why we're announcing new updates that push the boundaries of AI innovation and make it easier for customers to responsibly deploy AI at scale across their business.
Everything you need to build, test, and deploy AI innovations in one convenient location
At Ignite, we're thrilled to introduce the public preview of Azure AI Studio, a groundbreaking platform for AI developers by Microsoft. Everything organizations need to tackle generative AI is now in one place: cutting-edge models, data integration for retrieval augmented generation (RAG), intelligent search capabilities, full-lifecycle model management, and content safety.
We continue to expand choice and flexibility in generative AI models beyond Azure OpenAI Service. We announced the model catalog at Build, and at Ignite we're announcing Models as a Service through managed API endpoints, coming soon within the model catalog. This will enable professional developers to easily integrate new foundation models like Meta's Llama 2, G42's Jais, Cohere's Command, and Mistral's premium models into their applications as an API endpoint and fine-tune models with custom training data, without having to manage the underlying GPU infrastructure. This functionality will help eliminate the complexity for our customers and partners of provisioning resources and managing hosting.
Large language model (LLM) orchestration and grounding through RAG are top of mind as momentum for LLM-based AI applications grows. Prompt flow, a tool for managing prompt orchestration and LLMOps, is now in preview in Azure AI Studio and generally available in Azure Machine Learning. Prompt flow provides a comprehensive solution that simplifies the process of prototyping, experimenting, iterating, and deploying your AI applications.
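To make the orchestration idea concrete, here is a minimal, self-contained sketch of the pattern a tool like prompt flow manages: a flow of steps (build a grounded prompt, call a model, trace the result) that you can run and evaluate repeatedly. This is not the prompt flow API; the `mock_llm` function below is a stand-in for a real model endpoint.

```python
# Conceptual sketch of a prompt "flow": build prompt -> call model -> trace.
# NOT the prompt flow API; mock_llm stands in for a real model call.

def build_prompt(question: str, context: str) -> str:
    # Grounding step: stitch retrieved context into the prompt (RAG).
    return f"Answer using only this context:\n{context}\nQ: {question}\nA:"

def mock_llm(prompt: str) -> str:
    # Stand-in for a model endpoint; simply echoes the grounded context.
    return prompt.split("context:\n")[1].split("\nQ:")[0]

def run_flow(question: str, context: str) -> dict:
    prompt = build_prompt(question, context)
    answer = mock_llm(prompt)
    # Tracing inputs/outputs per step is what makes iteration easy.
    return {"prompt": prompt, "answer": answer}

result = run_flow("What is announced?", "Prompt flow is in preview.")
print(result["answer"])  # -> Prompt flow is in preview.
```

In a real flow the mock call would be a deployed model endpoint, and each step's inputs and outputs would be logged so variants of the prompt can be compared side by side.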
We're also announcing at Ignite that Azure AI Search, formerly Azure Cognitive Search, is now available in Azure AI Studio so everything stays in one convenient location for developers to save time and improve productivity.
Azure AI Content Safety is also available in Azure AI Studio so developers can easily evaluate model responses all in one unified development platform. We're also announcing the preview of new features within Azure AI Studio, powered by Azure AI Content Safety, to address harms and security risks introduced by large language models. The new features help identify and prevent attempted unauthorized modifications, and identify when large language models generate material that leverages third-party intellectual property and content.
With Azure AI Content Safety, developers can monitor human and AI-generated content across languages and modalities and streamline workflows with customizable severity levels and built-in blocklists.
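To illustrate how severity thresholds and blocklists combine, here is a minimal pure-Python sketch of the moderation pattern. This is not the Azure AI Content Safety SDK; the category names, scores, and blocklist terms are hypothetical, and in practice the per-category severity scores would come from the service's classifiers.

```python
# Conceptual sketch of severity-threshold plus blocklist moderation.
# NOT the Azure AI Content Safety SDK; names and scores are illustrative.

BLOCKLIST = {"bannedterm1", "bannedterm2"}  # hypothetical custom blocklist

def moderate(text: str, severity_scores: dict, max_severity: int = 2) -> dict:
    """Reject text if a blocklisted term appears or any category's
    severity exceeds the threshold; otherwise allow it."""
    words = {w.lower().strip(".,!?") for w in text.split()}
    if words & BLOCKLIST:
        return {"allowed": False, "reason": "blocklist"}
    flagged = {c: s for c, s in severity_scores.items() if s > max_severity}
    if flagged:
        return {"allowed": False, "reason": "severity", "categories": flagged}
    return {"allowed": True, "reason": None}

# In practice the scores per category come from a trained classifier.
print(moderate("a harmless sentence", {"hate": 0, "violence": 1}))
# -> {'allowed': True, 'reason': None}
```

Raising or lowering `max_severity` per category is the knob that customizable severity levels expose: stricter for a children's product, looser for a gaming forum.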
It's great to see customers already leveraging this to build their AI solutions. In just six months, Perplexity brought Perplexity Ask, a conversational answer engine, to market with Azure AI Studio. They were able to streamline and expedite AI development, get to market faster, scale quickly to support millions of users, and cost-effectively deliver security and reliability.
Whether you're creating a custom copilot, improving search, enhancing call centers, developing bots, or a combination of all of these, Azure AI Studio offers everything you need. You can check out Eric Boyd's blog to learn more about Azure AI Studio.
Generative AI is now multimodal
We're excited to enable a new chapter in the generative AI journey for our customers with GPT-4 Turbo with Vision, in preview, coming soon to Azure OpenAI Service and Azure AI Studio. With GPT-4 Turbo with Vision, developers can deliver multimodal capabilities in their applications.
We're adding several new updates to Azure AI Vision. GPT-4 Turbo with Vision, combined with our Azure AI Vision service, can see, understand, and make inferences like video analysis or video Q&A from visual inputs and associated text-based prompt instructions.
In addition to GPT-4 Turbo with Vision, we're happy to share other new innovations in Azure OpenAI Service, including GPT-4 Turbo in preview, general availability of GPT-3.5 Turbo 16K 1106 coming at the end of November, and the image model DALL-E 3, in preview now.
Search in the era of AI
Effective retrieval techniques, like those powered by search, can vastly improve the quality of responses and reduce response latency. A common practice for knowledge retrieval (the retrieval step in RAG) is to use vector search. This is essential for generative AI apps, which must be grounded in content from data or websites to augment the responses generated by LLMs.
Azure AI Search is a robust information retrieval and search platform that enables organizations to use their own data to deliver hyper-personalized experiences in generative AI applications. We're announcing the general availability of vector search for fast, highly relevant results from data.
Vector search is a method of searching for information within various data types, including images, audio, text, video, and more. It's one of the most critical elements of AI-powered, intelligent apps, and the addition of this capability is our latest AI-ready functionality to come to our Azure databases portfolio.
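The core idea behind vector search can be sketched in a few lines: embed each item as a vector, then rank items by similarity to the query vector. The toy three-dimensional vectors below stand in for real model embeddings, and production systems like Azure AI Search use approximate nearest-neighbor indexes rather than this brute-force scan.

```python
# Minimal brute-force vector search: rank documents by cosine similarity
# to a query vector. Toy 3-D vectors stand in for model embeddings.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

documents = {
    "doc1": [0.9, 0.1, 0.0],
    "doc2": [0.1, 0.9, 0.1],
    "doc3": [0.8, 0.2, 0.1],
}

def vector_search(query_vec, docs, top_k=2):
    ranked = sorted(docs, key=lambda d: cosine_similarity(query_vec, docs[d]),
                    reverse=True)
    return ranked[:top_k]

print(vector_search([1.0, 0.0, 0.0], documents))  # -> ['doc1', 'doc3']
```

Because similarity is computed in embedding space, the same mechanism works for any data type that can be embedded, which is why the technique applies equally to text, images, and audio.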
Semantic ranker, formerly known as semantic search, is also generally available and provides access to the same machine learning-powered search re-ranking technology used to power Bing. Your generative AI applications can deliver the highest quality responses to every user Q&A with a feature-rich vector database integrated with state-of-the-art relevance technology.
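The pattern behind a semantic ranker is retrieve-then-rerank: a fast first pass returns candidates, then a more expensive scoring step reorders them by relevance. The keyword-overlap "reranker" below is a deliberately simple stand-in for the machine-learned re-ranking models used in production; only the two-stage structure is the point.

```python
# Sketch of retrieve-then-rerank. The scoring functions are toy stand-ins
# for a real retrieval index and a learned re-ranking model.

def first_pass(query, corpus, top_k=3):
    # Cheap retrieval: rank by number of words shared with the query.
    q = set(query.lower().split())
    return sorted(corpus, key=lambda d: len(q & set(d.lower().split())),
                  reverse=True)[:top_k]

def rerank(query, candidates):
    # Stand-in "semantic" score: query-term coverage normalized by length
    # (a real reranker would use a learned relevance model).
    q = set(query.lower().split())
    def score(d):
        words = set(d.lower().split())
        return len(q & words) / (1 + len(words))
    return sorted(candidates, key=score, reverse=True)

corpus = [
    "azure ai search supports vector search",
    "semantic ranker reorders search results",
    "cosmos db stores operational data",
]
print(rerank("semantic ranker", first_pass("semantic ranker", corpus)))
```

Splitting the work this way keeps latency low: the expensive scorer only ever sees the handful of candidates the cheap first pass surfaces.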
Accelerate your AI journey responsibly and with confidence
At Microsoft, we're committed to safe and responsible AI. It goes beyond ethical values and foundational principles, which are critically important. We're integrating this into the products, services, and tools we release so organizations can build on a foundation of security, risk management, and trust.
We're pleased to announce new updates at Ignite to help customers pursue AI responsibly and with confidence.
Setting the standard for responsible AI innovation: expanding our Copilot Copyright Commitment
Microsoft has set the standard with services and tools like Azure AI Content Safety, the Responsible AI Dashboard, model monitoring, and our industry-leading commitment to defend and indemnify commercial customers from lawsuits for copyright infringement.
Today, we're announcing the expansion of the Copilot Copyright Commitment, now called the Customer Copyright Commitment (CCC), to customers using Azure OpenAI Service. As more customers build with generative AI inside their organizations, they are inspired by the potential of this technology and are eager to commercialize it externally.
By extending the CCC to Azure OpenAI Service, Microsoft is broadening our commitment to defend our commercial customers and pay for any adverse judgments if they are sued for copyright infringement for using the outputs generated by Azure OpenAI Service. This benefit will be available starting December 1, 2023.
As part of this expansion, we've published new documentation to help Azure OpenAI Service customers implement technical measures and other best practices to mitigate the risk of infringing content. Customers will need to comply with the documentation to take advantage of the benefit. Azure OpenAI Service is a developer service and comes with a shared commitment to build responsibly. We look forward to customers leveraging it as they build their own copilots.
Announcing the Azure AI Advantage offer
We want to be your trusted partner as you deliver next-gen, transformative experiences with pioneering AI technology, a deeply integrated platform, and leading cloud security.
Azure offers a full, integrated stack purpose-built for cloud-native, AI-powered applications, accelerating your time to market and giving you a competitive edge and superior performance. To help on that journey, we're happy to introduce a new offer to help new and existing Azure AI and GitHub Copilot customers realize the value of Azure AI and Azure Cosmos DB together and get on the fast track to developing AI-powered applications. You can learn more about the Azure AI Advantage offer and register here.
Azure Cosmos DB and Azure AI combined deliver many benefits, including enhanced reliability of generative AI applications through the speed of Azure Cosmos DB, a world-class infrastructure and security platform to grow your business while safeguarding your data, and provisioned throughput to scale seamlessly as your application grows.
Azure AI services and GitHub Copilot customers deploying their AI apps to Azure Kubernetes Service may be eligible for additional discounts. Speak to your Microsoft representative to learn more.
Empowering all developers with AI-powered tools
There's a lot in store this week at Ignite to improve the developer experience, save time, and increase productivity for developers building intelligent applications. Let's dive into what's new.
Updates for Azure Cosmos DB: the database for the era of AI
To help developers deliver apps more efficiently and with reduced production costs, at Ignite we're sharing new features in Azure Cosmos DB.
Now in preview, dynamic scaling gives developers new flexibility to scale databases up or down and brings cost savings to customers, especially those with operations around the globe. We're also bringing AI deeper into the developer experience and increasing productivity with the preview of Microsoft Copilot for Azure, enabling natural language queries in Azure Cosmos DB.
Bond Brand Loyalty turned to Azure Cosmos DB to scale to more than two petabytes of transaction data while maintaining security and privacy for their own customers. On Azure, Bond built a modern offering to support extensive security configurations, reducing onboarding time for new clients by 20 percent.
We're announcing two exciting updates to enable developers to build intelligent apps: general availability of both Azure Cosmos DB for MongoDB vCore and vector search in Azure Cosmos DB for MongoDB vCore.
Azure Cosmos DB for MongoDB vCore enables developers to build intelligent applications with full support for MongoDB data stored in Azure Cosmos DB, which unlocks opportunities for app development thanks to deep integration with other Azure services. That means developers can enjoy the benefits of native Azure integrations, a low total cost of ownership (TCO), and a familiar vCore architecture when migrating existing applications or building new ones.
Vector search in Azure Cosmos DB for MongoDB vCore allows developers to seamlessly integrate data stored in Azure Cosmos DB into AI-powered applications, including those using Azure OpenAI Service embeddings. Built-in vector search enables you to efficiently store, index, and query high-dimensional vector data, and eliminates the need to transfer the data outside of your Azure Cosmos DB database.
PostgreSQL developers have been able to use built-in vector search in Azure Database for PostgreSQL and Azure Cosmos DB for PostgreSQL since this summer. Now, they can take advantage of the public preview of the Azure AI extension in Azure Database for PostgreSQL to build rich generative AI solutions with LLMs.
KPMG Australia used the vector search capability when they turned to Azure OpenAI Service and Azure Cosmos DB to build their own copilot application. The KymChat app has helped employees speed up productivity and streamline operations. The solution is also being made available to KPMG customers through an accelerator that combines KymChat's use cases, features, and lessons learned, helping customers accelerate their AI journey.
Building cloud-native and intelligent applications
Intelligent applications combine the power of AI and cloud-scale data with cloud-native app development to create highly differentiated digital experiences. The synergy between cloud-native technologies and AI is a tangible opportunity for evolving traditional applications, making them intelligent, and delivering more value to end users. We're dedicated to continuously enhancing Azure Kubernetes Service to meet the evolving demands of AI for customers who are just getting started as well as those who are more advanced.
Customers can now run specialized machine learning workloads like LLMs on Azure Kubernetes Service more cost-effectively and with less manual configuration. The Kubernetes AI toolchain operator automates LLM deployment on AKS across available CPU and GPU resources by selecting optimally sized infrastructure for the model. It makes it possible to easily split inferencing across multiple lower-GPU-count virtual machines (VMs), increasing the number of Azure regions where workloads can run, eliminating wait times for higher-GPU-count VMs, and lowering overall cost. Customers can also run preset models from the open source community hosted on AKS, significantly reducing costs and overall inference service setup time while eliminating the need for teams to be experts on available infrastructure.
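To illustrate the kind of sizing decision the operator automates, here is a toy sketch: given a model's GPU memory requirement, pick the cheapest combination of VMs that covers it. The SKU names, memory figures, and prices below are entirely hypothetical, and a real placement decision also weighs interconnect, region availability, and shard-size constraints that this sketch ignores.

```python
# Illustrative cost-based VM sizing for a sharded inference workload.
# SKU names, GPU memory, and relative prices are hypothetical.

VM_SKUS = {            # sku: (GPU memory in GB per VM, relative hourly cost)
    "small-gpu-vm":  (16, 1.0),
    "medium-gpu-vm": (32, 1.9),
    "large-gpu-vm":  (80, 5.0),
}

def cheapest_fit(required_gpu_gb: int):
    """Return (sku, vm_count) minimizing total cost while providing at
    least required_gpu_gb of aggregate GPU memory."""
    best = None
    for sku, (mem, cost) in VM_SKUS.items():
        count = -(-required_gpu_gb // mem)  # ceiling division
        total = count * cost
        if best is None or total < best[2]:
            best = (sku, count, total)
    return best[0], best[1]

# A 40 GB model lands on three 16 GB VMs here (relative cost 3.0),
# cheaper than one 80 GB VM (5.0) in this made-up price table.
print(cheapest_fit(40))  # -> ('small-gpu-vm', 3)
```

The payoff described above falls out of exactly this trade: smaller-GPU SKUs are available in more regions and with shorter wait times, so splitting inference across them widens placement options while lowering cost.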
Azure Kubernetes Fleet Manager is now generally available and enables multi-cluster and at-scale scenarios for Azure Kubernetes Service clusters. Fleet Manager provides global scale for admins to manage workload distribution across clusters and facilitate platform and application updates, so developers can rest assured they are running on the latest and most secure software.
We've also been sharing learnings about how to help engineering organizations enable their own developers to get started and be productive quickly, while still ensuring systems are secure, compliant, and cost-controlled. Microsoft is providing a core set of technology building blocks and learning modules to help organizations get started on their journey to establish a platform engineering practice.
New Microsoft Dev Box capabilities to improve the developer experience
Maintaining a developer workstation that can build, run, and debug your application is critical to keeping up with the pace of modern development teams. Microsoft Dev Box provides developers with secure, ready-to-code developer workstations for hybrid teams of any size.
We're introducing new preview capabilities that give development teams more granular control over their images, the ability to connect to Hosted Networks to simplify connecting to your resources securely, and templates to make it easier to get up and running. Paired with new capabilities coming to Azure Deployment Environments, it's easier than ever to deploy those projects to Azure.
Build upon a reliable and scalable foundation with .NET 8
.NET 8 is a big leap forward toward making .NET one of the best platforms for building intelligent cloud-native applications, with the first preview of .NET Aspire, an opinionated, cloud-ready stack for building observable, production-ready, distributed cloud-native applications. It includes curated components for cloud-native fundamentals, including telemetry, resilience, configuration, and health checks. The stack makes it easier to discover, acquire, and configure essential dependencies for cloud-native applications on day 1 and day 100.
.NET 8 is also the fastest version of .NET ever, with developer productivity enhancements across the stack, whether you are building for the cloud, a full-stack web app, a desktop or mobile app using .NET MAUI, or integrating AI to build the next copilot for your app. These are all available in Visual Studio, which also releases today.
Azure Functions and Azure App Service have full support for .NET 8, both on Linux and Windows, and both Azure Kubernetes Service and Azure Container Apps also support .NET 8 today.
There are no limits to your innovation potential with Azure
There's so much rolling out this week across data, AI, and digital applications, so I hope you'll tune into the virtual Ignite experience to hear about the full slate of announcements and learn more about how you can put Azure to work for your business.
This week's announcements are proof of our commitment to helping customers take that next step of innovation and stay future-ready. I can't wait to see how your creativity and new innovations unfold for your business.
You can check out these resources to learn more about everything shared today. We hope you have a great Ignite week!