In this jam-packed, one-day, lecture- and demo-focused workshop, Brian, Mickey, and Ken will take you on a journey through building intelligent AI-powered applications using C#, GitHub Copilot, Azure AI services, and modern DevOps practices. You'll explore essential AI concepts, practical architectural patterns, and best practices for integrating powerful AI models into your applications. The team will demonstrate state-of-the-art techniques, including Retrieval-Augmented Generation (RAG), semantic search, advanced prompt engineering, and responsible AI practices. You'll also see how DevOps, MLOps, and Platform Engineering come together seamlessly, enabling secure, reliable, and continuous deployment of intelligent applications.
We'll start with an overview and AI Fundamentals, introducing you to the transformative power of Large Language Models (LLMs) and the essential AI tools available to developers. You'll see live demonstrations of GitHub Copilot and other tools that enhance coding productivity directly within Visual Studio, Visual Studio Code, and github.com. We'll explore how LLMs are reshaping software engineering and cover key AI tools including GitHub Copilot, Azure OpenAI Service, and Semantic Kernel. We'll also cover model selection strategies for choosing among models from OpenAI, Anthropic, Google, and others, as well as local models. Plus, we'll provide an overview of self-hosted options like Ollama and LM Studio, along with enterprise deployment patterns.
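To give a feel for the self-hosted path, here's a minimal sketch (not the workshop's demo code) that calls a locally running Ollama server through its OpenAI-compatible chat endpoint; the `llama3` model name is an assumption, and Ollama's default port is used.

```csharp
// Minimal sketch: call a locally hosted model (Ollama) via its OpenAI-compatible API.
// Assumes Ollama is running on its default port with the "llama3" model already pulled.
using System.Net.Http.Json;
using System.Text.Json;

using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

var request = new
{
    model = "llama3", // any locally pulled model
    messages = new[]
    {
        new { role = "system", content = "You are a concise C# assistant." },
        new { role = "user",   content = "Explain dependency injection in one sentence." }
    }
};

// POST to the OpenAI-compatible chat completions endpoint exposed by Ollama.
var response = await http.PostAsJsonAsync("/v1/chat/completions", request);
response.EnsureSuccessStatusCode();

using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
var reply = doc.RootElement
    .GetProperty("choices")[0]
    .GetProperty("message")
    .GetProperty("content")
    .GetString();

Console.WriteLine(reply);
```

Because the endpoint speaks the same wire format as hosted OpenAI-style services, swapping between a cloud model and a local one is largely a matter of changing the base address and model name.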
Next, we'll explore the AI Development Environment Landscape to help you understand the rapidly evolving world of AI-powered development tools and how to choose the right environment for your team. We'll compare AI-powered IDEs and assistants, including GitHub Copilot, Cursor, JetBrains AI Assistant, and Google Gemini Code Assist. You'll see demonstrations of command-line AI agents like Claude Code and terminal-based development workflows, along with integration strategies for hybrid development environments.
We'll then cover architectural patterns for AI-enhanced apps, exploring the practical patterns crucial for integrating AI capabilities into modern applications. You'll learn about key components like LLM services, embedding models, vector databases, and semantic search, along with an introduction to the powerful Retrieval-Augmented Generation (RAG) technique. We'll also dive into advanced RAG techniques, including hybrid search, reranking, and multi-modal RAG for text and images, plus memory and state management for conversation history.
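To make the RAG flow concrete, here's a minimal sketch of the retrieve-augment-generate loop; the IEmbeddingService and IChatService interfaces and the in-memory document store are hypothetical stand-ins for whatever embedding model, LLM service, and vector database you actually use.

```csharp
// Minimal RAG sketch: embed the question, retrieve the closest chunks by
// cosine similarity, then ask the LLM to answer using only that context.
public interface IEmbeddingService
{
    Task<float[]> EmbedAsync(string text);
}

public interface IChatService
{
    Task<string> CompleteAsync(string systemPrompt, string userPrompt);
}

public record Document(string Id, string Text, float[] Embedding);

public class SimpleRagPipeline(IEmbeddingService embeddings, IChatService chat, IReadOnlyList<Document> store)
{
    public async Task<string> AskAsync(string question, int topK = 3)
    {
        // 1. Retrieval: embed the question and rank stored chunks by similarity.
        var queryVector = await embeddings.EmbedAsync(question);
        var context = store
            .OrderByDescending(d => CosineSimilarity(queryVector, d.Embedding))
            .Take(topK)
            .Select(d => d.Text);

        // 2. Augmentation: ground the prompt in the retrieved chunks.
        var systemPrompt =
            "Answer using only the provided context. If the answer is not in the context, say you don't know.\n\n" +
            "Context:\n" + string.Join("\n---\n", context);

        // 3. Generation: let the LLM produce the grounded answer.
        return await chat.CompleteAsync(systemPrompt, question);
    }

    private static double CosineSimilarity(float[] a, float[] b)
    {
        double dot = 0, magA = 0, magB = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot  += a[i] * b[i];
            magA += a[i] * a[i];
            magB += b[i] * b[i];
        }
        return dot / (Math.Sqrt(magA) * Math.Sqrt(magB));
    }
}
```

In a production app the brute-force similarity scan is replaced by a vector database query, but the shape of the pipeline stays the same.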
From there, we'll dive into AI agents and autonomous workflows, covering one of the fastest-moving areas of AI development: autonomous agents that can handle complex, multi-step tasks. You'll discover how chained reasoning, state management, and tool integration create powerful AI workflows. We'll demonstrate GitHub Copilot's Agent Mode evolution from pair programmer to autonomous teammate, explore multi-agent workflows and collaboration patterns, and cover tool integration, planners, and state management in AI agents. We'll include examples of agentic workflows built with Model Context Protocol (MCP) servers.
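As a rough illustration of how tool integration and state management fit together, here's a hypothetical agent loop; the ILlmClient and ITool abstractions and the JSON "action" contract are illustrative assumptions, not the Copilot Agent Mode or MCP machinery demonstrated in the workshop.

```csharp
// Hypothetical agent loop sketch: the model either requests a tool call or
// returns a final answer; tool results are fed back into the conversation
// state until the task completes. Real frameworks (Semantic Kernel, MCP
// servers, Copilot Agent Mode) formalize this same loop.
using System.Text.Json;

public interface ITool
{
    string Name { get; }
    Task<string> InvokeAsync(string argumentsJson);
}

public interface ILlmClient
{
    // Returns the assistant's next message given the conversation so far.
    Task<string> CompleteAsync(IReadOnlyList<(string Role, string Content)> messages);
}

public class MiniAgent(ILlmClient llm, IReadOnlyDictionary<string, ITool> tools)
{
    public async Task<string> RunAsync(string goal, int maxSteps = 5)
    {
        // The conversation history is the agent's working state.
        var messages = new List<(string Role, string Content)>
        {
            ("system", "Reply with JSON: {\"tool\": name, \"args\": {...}} to call a tool, " +
                       "or {\"answer\": text} when done. Available tools: " + string.Join(", ", tools.Keys)),
            ("user", goal)
        };

        for (int step = 0; step < maxSteps; step++)
        {
            var reply = await llm.CompleteAsync(messages);
            messages.Add(("assistant", reply));

            using var doc = JsonDocument.Parse(reply);
            if (doc.RootElement.TryGetProperty("answer", out var answer))
                return answer.GetString() ?? "";

            // Dispatch the requested tool and feed the observation back in.
            var toolName = doc.RootElement.GetProperty("tool").GetString()!;
            var args = doc.RootElement.GetProperty("args").GetRawText();
            var result = await tools[toolName].InvokeAsync(args);
            messages.Add(("user", $"Tool {toolName} returned: {result}"));
        }

        return "Stopped: step limit reached.";
    }
}
```

The step limit is the simplest of the guardrails you'll want around an autonomous loop; planners, richer state stores, and MCP tool discovery build on this basic pattern.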
Our focus will then shift to practical prompt engineering, covering the principles that ensure accurate and reliable results from AI models. You'll learn techniques like crafting clear system and user prompts, employing few-shot examples, and leveraging function-calling capabilities. We'll demonstrate crafting effective prompts, explain the difference between system and user prompts, and walk through few-shot prompting and function calling. We'll also cover cost optimization strategies, including token management and caching techniques.
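The sketch below shows the shape of a well-structured request: a system prompt that sets the contract, one few-shot example, and the real user message. The ChatMessage record is a stand-in for whatever message type your SDK (OpenAI, Azure OpenAI, Semantic Kernel) uses, and the option names are illustrative.

```csharp
// Illustrative prompt construction: a clear system prompt, one few-shot
// example, then the real request.
var messages = new List<ChatMessage>
{
    // System prompt: role, constraints, and output format in one place.
    new("system",
        "You are a release-notes assistant for a C# team. " +
        "Summarize commit messages into a single bullet list. " +
        "Respond only with the bullet list, no preamble."),

    // Few-shot example: show the model exactly what good output looks like.
    new("user",      "Commits: fix null ref in OrderService; bump Polly to 8.2"),
    new("assistant", "- Fixed a NullReferenceException in OrderService\n- Upgraded Polly to 8.2"),

    // The real request.
    new("user",      "Commits: add retry to payment client; remove dead feature flag code")
};

// Lower temperature favors consistent formatting over creativity, and a token
// cap bounds cost per call. (Exact parameter names vary by SDK.)
var options = new { Temperature = 0.2, MaxTokens = 200 };

// Stand-in for whatever chat message type your SDK provides.
public record ChatMessage(string Role, string Content);
```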
We'll address testing and evaluating AI applications, looking at essential practices for ensuring AI application quality and reliability through systematic testing and evaluation. We'll cover evaluating AI outputs and testing strategies for non-deterministic systems, A/B testing of prompts, automated testing frameworks, and performance monitoring and quality assurance for AI applications.
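As one example of testing a non-deterministic feature, here's a sketch of an xUnit test that asserts on properties of the output rather than exact strings; the ISummarizer interface and the factory method are hypothetical placeholders for your own LLM-backed service.

```csharp
// Sketch: test a non-deterministic AI feature by asserting on properties of
// the output (shape, required facts, length) rather than exact wording.
using Xunit;

public interface ISummarizer
{
    Task<string> SummarizeAsync(string text);
}

public class SummarizerEvaluationTests
{
    [Fact]
    public async Task Summary_keeps_key_facts_and_stays_short()
    {
        // Build the summarizer however your app does (for example, a cheap model at low temperature).
        ISummarizer summarizer = CreateSummarizerUnderTest();

        var source = "Contoso's Q3 revenue grew 12% to $4.2M, driven by the new Copilot integration.";
        var summary = await summarizer.SummarizeAsync(source);

        // Assertions that should hold across runs, even though exact wording differs.
        Assert.False(string.IsNullOrWhiteSpace(summary));
        Assert.True(summary.Length < source.Length, "Summary should be shorter than the source.");
        Assert.Contains("12%", summary);             // the key figure must survive
        Assert.DoesNotContain("As an AI", summary);  // no meta-chatter in user-facing text
    }

    // Hypothetical factory: wire this to your real, LLM-backed summarizer.
    private static ISummarizer CreateSummarizerUnderTest() =>
        throw new NotImplementedException("Connect to your summarization service here.");
}
```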
Moving into operations, we'll explore how DevOps meets MLOps, showing how modern DevOps practices integrate with Machine Learning Operations (MLOps) to enhance AI-driven development. The team will showcase continuous integration and automated workflows tailored to deploying intelligent applications, with examples using GitHub Actions. This includes integrating AI and ML into modern DevOps pipelines, plus continuous integration and continuous deployment for AI-driven projects.
Security is critical, so we'll cover securing AI applications with GitHub Advanced Security, highlighting security considerations unique to AI-driven applications and emphasizing how GitHub Advanced Security (GHAS) helps maintain secure codebases. The team will show how GHAS helps you produce more secure code and how it works alongside Copilot to build better solutions. We'll cover security challenges unique to AI-driven codebases and demonstrate how to leverage GHAS for automated vulnerability detection and remediation.
We'll address responsible AI, governance, and data privacy, covering the practices, ethical considerations, and essential privacy protections necessary for AI applications. We'll demonstrate practical methods for implementing content moderation and data privacy controls. This covers content moderation and filtering in AI-driven apps, best practices for ethical AI usage and privacy preservation, enterprise AI governance, audit trails, and compliance considerations, plus Azure content-filtering options.
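To show where moderation sits in the request path, here's a hypothetical sketch that screens both the user's input and the model's output; IContentModerator stands in for a real service such as Azure AI Content Safety, and the severity threshold is an illustrative assumption.

```csharp
// Hypothetical moderation gate: screen user input before it reaches the model,
// and screen the model's output before it reaches the user.
public record ModerationResult(bool Flagged, string? Category, int Severity);

public interface IContentModerator
{
    Task<ModerationResult> AnalyzeAsync(string text);
}

public interface IChatBackend
{
    Task<string> CompleteAsync(string prompt);
}

public class ModeratedChatService(IContentModerator moderator, IChatBackend chat)
{
    private const int MaxAllowedSeverity = 2; // illustrative threshold; tune per policy

    public async Task<string> AskAsync(string userPrompt)
    {
        // Pre-check the user's input.
        var inputCheck = await moderator.AnalyzeAsync(userPrompt);
        if (inputCheck.Flagged && inputCheck.Severity > MaxAllowedSeverity)
            return "Sorry, I can't help with that request.";

        var answer = await chat.CompleteAsync(userPrompt);

        // Post-check the model's output before returning it (and keep an audit trail).
        var outputCheck = await moderator.AnalyzeAsync(answer);
        return outputCheck.Flagged && outputCheck.Severity > MaxAllowedSeverity
            ? "The generated response was withheld by the content filter."
            : answer;
    }
}
```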
We'll conclude by deploying AI-powered applications to Azure, exploring deployment strategies for AI applications and weighing the pros and cons of Azure-managed versus self-hosted options. Practical demonstrations will show you how to deploy and monitor a live, production-grade AI-powered application on Azure. This includes choosing between Azure and self-hosted deployments; practical deployment options using Azure App Service, containers, and Azure Functions; hybrid solutions combining cloud and edge computing; and cost monitoring and optimization for production AI applications.
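As a small example of the cost-monitoring angle, here's a sketch that records token usage per request and converts it into an estimated spend; the per-token prices and the TokenUsage shape are placeholders, not actual Azure pricing or a specific SDK's types.

```csharp
// Sketch: track per-request token usage and estimate spend. TokenUsage mirrors
// the usage data most chat-completions APIs return alongside each response.
using Microsoft.Extensions.Logging;

public record TokenUsage(int PromptTokens, int CompletionTokens);

public class CostTracker(ILogger<CostTracker> logger)
{
    // Illustrative per-1K-token prices; load real rates from configuration.
    private const decimal PromptPricePer1K = 0.005m;
    private const decimal CompletionPricePer1K = 0.015m;

    private long _promptTokens;
    private long _completionTokens;

    public void Record(string operation, TokenUsage usage)
    {
        Interlocked.Add(ref _promptTokens, usage.PromptTokens);
        Interlocked.Add(ref _completionTokens, usage.CompletionTokens);

        var cost = usage.PromptTokens / 1000m * PromptPricePer1K
                 + usage.CompletionTokens / 1000m * CompletionPricePer1K;

        logger.LogInformation("AI call {Operation}: {Prompt}+{Completion} tokens, ~{Cost:C4}",
            operation, usage.PromptTokens, usage.CompletionTokens, cost);
    }

    public decimal EstimatedTotalCost =>
        _promptTokens / 1000m * PromptPricePer1K +
        _completionTokens / 1000m * CompletionPricePer1K;
}
```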
Key Takeaways
- Clearly understand essential AI development concepts and architectural patterns
- Learn practical techniques to integrate AI into your development workflows effectively
- Grasp modern DevOps and MLOps practices tailored for secure, AI-driven application delivery
- See real-world, production-grade AI applications deployed to Azure and secured through GitHub Advanced Security
- Understand the AI development tool landscape and how to choose the right environment for your team