AWS announces M8azn instances with 5 GHz AMD EPYC processors, expands Amazon Bedrock with six new open weights models, and introduces several service improvements including enhanced logging for EKS Auto Mode and backup configuration for RDS restores.
AWS continues its rapid pace of innovation with a wave of new services and capabilities announced this week. From the launch of high-frequency M8azn instances to expanded model support in Amazon Bedrock, these updates demonstrate AWS's commitment to pushing performance boundaries and enhancing developer productivity across its cloud ecosystem.
Amazon EC2 M8azn Instances: Pushing Cloud Performance to 5 GHz
The general availability of Amazon EC2 M8azn instances marks a significant milestone in cloud computing performance. These general-purpose instances, powered by fifth-generation AMD EPYC processors, achieve the highest maximum CPU frequency in the cloud at 5 GHz, setting a new standard for high-performance workloads.
Built on the AWS Nitro System with sixth-generation Nitro Cards, M8azn instances deliver substantial performance improvements over their predecessors. Compared to M5zn instances, they offer up to 2x compute performance, 4.3x higher memory bandwidth, and a 10x larger L3 cache. The networking throughput has doubled, and Amazon EBS throughput has increased by 3x, making these instances ideal for demanding applications.
These instances are specifically designed for workloads that require both high frequency and high network performance. Real-time financial analytics, high-frequency trading platforms, CI/CD pipelines, gaming applications, and simulation modeling across industries like automotive, aerospace, energy, and telecommunications will benefit from the enhanced capabilities.
The M8azn family offers nine sizes ranging from 2 to 96 vCPUs with up to 384 GiB of memory, maintaining 4 GiB of memory per vCPU. Two bare metal variants are also available for customers who need direct hardware access. This flexibility allows organizations to right-size their infrastructure for specific workload requirements while taking advantage of the 5 GHz performance ceiling.
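For teams that want to evaluate the new family, launching an instance follows the usual EC2 workflow; only the instance type changes. The minimal boto3 sketch below assumes an m8azn.large size name and uses a placeholder AMI ID, since the exact size names and images available vary by Region.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a single M8azn instance. The size name "m8azn.large" and the AMI ID
# are placeholders -- check the instance types and AMIs available in your
# Region before running this.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: use a current Amazon Linux AMI
    InstanceType="m8azn.large",        # assumed size name from the M8azn family
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "m8azn-eval"}],
    }],
)

print(response["Instances"][0]["InstanceId"])
```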
Amazon Bedrock Expands with Six New Open Weights Models
Amazon Bedrock has significantly expanded its model offerings with support for six new fully managed open weights models, broadening the platform's capabilities for AI and machine learning workloads. These additions include DeepSeek V3.2, MiniMax M2.1, GLM 4.7, GLM 4.7 Flash, Kimi K2.5, and Qwen3 Coder Next.
The new models cover a wide range of use cases, from frontier reasoning to agentic coding workloads. DeepSeek V3.2 and Kimi K2.5 focus on reasoning and agentic intelligence, while GLM 4.7 and MiniMax M2.1 support autonomous coding with large output windows. For production deployments requiring cost efficiency, Qwen3 Coder Next and GLM 4.7 Flash provide optimized alternatives.
All these models are powered by Project Mantle, AWS's distributed inference engine for large-scale machine learning model serving on Amazon Bedrock. Project Mantle provides serverless inference with quality of service controls, higher default customer quotas with automated capacity management, and out-of-the-box compatibility with OpenAI API specifications. This compatibility ensures that developers can migrate existing applications with minimal code changes.
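As a minimal sketch of what calling one of these models looks like, the example below uses the existing Bedrock Runtime Converse API in boto3. The model identifier shown is an assumption for illustration; look up the actual IDs for the new open weights models in the Bedrock console for your Region.

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Placeholder model ID -- substitute the real identifier for DeepSeek V3.2
# (or any of the other new open weights models) from the Bedrock console.
MODEL_ID = "deepseek.v3-2"  # assumed identifier, for illustration only

response = bedrock.converse(
    modelId=MODEL_ID,
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize the trade-offs of serverless inference."}],
    }],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```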
The integration extends beyond Bedrock itself. Developers can now use three of the new open weights models (DeepSeek V3.2, MiniMax M2.1, and Qwen3 Coder Next) in Kiro, a spec-driven AI development tool. This integration streamlines the development workflow for AI applications and reduces the time to market for new features.
Enhanced Security and Performance Across AWS Services
Several other AWS services received significant updates this week, focusing on security, performance, and operational efficiency.
Amazon Bedrock now supports AWS PrivateLink for the bedrock-mantle endpoint, in addition to existing support for the bedrock-runtime endpoint. This enhancement allows customers to access Bedrock models through private network connections, improving security and reducing data exposure to the public internet. The PrivateLink support is available in 14 AWS Regions and provides the same OpenAI API compatibility that developers expect from Bedrock.
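Wiring up the private connection is a standard interface VPC endpoint creation. In the sketch below, the bedrock-mantle service name is an assumption that follows the usual com.amazonaws.<region>.<service> PrivateLink pattern; confirm the exact name with describe_vpc_endpoint_services and substitute your own VPC, subnet, and security group IDs.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# The service name below follows the usual PrivateLink naming convention;
# verify it with describe_vpc_endpoint_services before relying on it.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",                          # placeholder
    ServiceName="com.amazonaws.us-east-1.bedrock-mantle",   # assumed service name
    SubnetIds=["subnet-0123456789abcdef0"],                 # placeholder
    SecurityGroupIds=["sg-0123456789abcdef0"],              # placeholder
    PrivateDnsEnabled=True,
)

print(response["VpcEndpoint"]["VpcEndpointId"])
```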
Amazon EKS Auto Mode introduces enhanced logging capabilities using Amazon CloudWatch Vended Logs. This feature allows customers to collect logs from Auto Mode's managed Kubernetes capabilities, including compute autoscaling, block storage, load balancing, and pod networking. Each capability can be configured as a CloudWatch Vended Logs delivery source with built-in AWS authentication and authorization. The reduced pricing compared to standard CloudWatch Logs makes this an attractive option for organizations looking to optimize their logging costs while maintaining comprehensive visibility into their Kubernetes environments.
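Vended Logs deliveries are set up with the CloudWatch Logs delivery APIs: register the cluster as a delivery source, register a destination (here, a log group), and connect the two. In the sketch below, the log type string for an Auto Mode capability is an assumption; the supported values are documented by EKS.

```python
import boto3

logs = boto3.client("logs", region_name="us-east-1")

CLUSTER_ARN = "arn:aws:eks:us-east-1:111122223333:cluster/my-auto-mode-cluster"   # placeholder
LOG_GROUP_ARN = "arn:aws:logs:us-east-1:111122223333:log-group:/eks/auto-mode"    # placeholder

# Register the EKS cluster as a Vended Logs delivery source. The log type
# value is assumed here -- use one of the Auto Mode capability log types
# documented by EKS (for example, for compute autoscaling).
logs.put_delivery_source(
    name="eks-auto-mode-compute",
    resourceArn=CLUSTER_ARN,
    logType="COMPUTE_AUTOSCALING",  # assumed value, check the EKS documentation
)

# Register a CloudWatch Logs log group as the delivery destination.
dest = logs.put_delivery_destination(
    name="eks-auto-mode-log-group",
    deliveryDestinationConfiguration={"destinationResourceArn": LOG_GROUP_ARN},
)

# Connect the source to the destination to start delivering logs.
delivery = logs.create_delivery(
    deliverySourceName="eks-auto-mode-compute",
    deliveryDestinationArn=dest["deliveryDestination"]["arn"],
)
print(delivery["delivery"]["id"])
```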
Amazon OpenSearch Serverless now supports Collection Groups, a feature that enables sharing OpenSearch Compute Units (OCUs) across collections with different AWS Key Management Service (AWS KMS) keys. This shared compute model reduces overall OCU costs while maintaining collection-level security and access controls. Collection Groups also introduce the ability to specify minimum OCU allocations alongside maximum OCU limits, providing guaranteed baseline capacity at startup for latency-sensitive applications. This feature is particularly valuable for organizations running multiple search workloads with varying performance requirements.
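The announcement does not spell out the Collection Groups API surface, so the sketch below only shows the existing account-level OCU capacity controls in boto3 as a reference point; per-group minimum and maximum allocations would presumably be configured in a similar way once the feature's operations appear in the SDK.

```python
import boto3

aoss = boto3.client("opensearchserverless", region_name="us-east-1")

# Today's account-level capacity settings cap the OCUs a serverless account
# can consume. Collection Groups, per the announcement, add group-scoped
# minimum and maximum OCU settings; that group-level API is not shown here
# because its shape is not part of this announcement.
aoss.update_account_settings(
    capacityLimits={
        "maxIndexingCapacityInOCU": 10,
        "maxSearchCapacityInOCU": 10,
    }
)

print(aoss.get_account_settings()["accountSettingsDetail"]["capacityLimits"])
```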
Amazon RDS has introduced backup configuration support when restoring snapshots, addressing a long-standing operational pain point. Customers can now set the backup retention period and preferred backup window as part of a snapshot restore operation. Previously, restored database instances inherited these values from snapshot metadata and could only change them after the restore completed. This enhancement is available for all Amazon RDS database engines (MySQL, PostgreSQL, MariaDB, Oracle, SQL Server, and Db2) and Amazon Aurora (MySQL-Compatible and PostgreSQL-Compatible editions) in all AWS commercial Regions and AWS GovCloud (US) Regions at no additional cost.
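In the SDK, this amounts to passing the backup settings on the restore call itself instead of in a follow-up modify operation. The sketch below reuses the parameter names from create_db_instance and modify_db_instance; their availability on the restore call is an assumption based on the announcement, so check the current boto3 documentation for the exact shape.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Restore from a snapshot and set the backup configuration in the same call.
# BackupRetentionPeriod and PreferredBackupWindow are the names used by
# create_db_instance / modify_db_instance; passing them on the restore call
# is assumed from the announcement -- verify against the current SDK docs.
response = rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="restored-orders-db",               # placeholder
    DBSnapshotIdentifier="orders-db-snapshot-2026-02-10",    # placeholder
    DBInstanceClass="db.m6g.large",                          # placeholder
    BackupRetentionPeriod=14,             # days of automated backups to keep
    PreferredBackupWindow="03:00-04:00",  # daily backup window (UTC)
)

print(response["DBInstance"]["DBInstanceStatus"])
```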
Looking Ahead: AWS Events and Community Engagement
The AWS ecosystem continues to thrive through community engagement and knowledge sharing. Several upcoming events provide opportunities for developers, architects, and business leaders to connect and learn:
AWS Summits in 2026 will take place in major cities worldwide, including Paris (April 1), London (April 22), and Bengaluru (April 23–24). These free in-person events offer hands-on experiences with emerging cloud and AI technologies, best practices sessions, and networking opportunities with industry peers and experts.
The AWS AI and Data Conference 2026, scheduled for March 12 at the Lyrath Convention Centre in Ireland, focuses on designing, training, and deploying agents with Amazon Bedrock, Amazon SageMaker, and QuickSight. The conference covers integration with AWS data services and governance practices for operating AI systems at scale, featuring strategic guidance and hands-on labs for architects, developers, and business leaders.
AWS Community Days, led by community members rather than AWS employees, provide technical discussions, workshops, and hands-on labs. Upcoming events include Ahmedabad (February 28), Slovakia (March 11), and Pune (March 21). These community-driven conferences offer unique perspectives and practical insights from practitioners working with AWS technologies daily.
The Pace of Innovation Continues
With over 1,160 Amazon EC2 instance types now available and new capabilities rolling out across the AWS portfolio, the pace of innovation shows no signs of slowing. The M8azn instances demonstrate AWS's commitment to pushing hardware performance boundaries, while the expanded Bedrock model support reflects the growing importance of AI and machine learning in enterprise workloads.
These announcements collectively represent AWS's strategy of providing comprehensive, high-performance cloud services that address diverse customer needs. From the raw computational power of 5 GHz instances to the sophisticated AI capabilities of managed models, AWS continues to deliver the building blocks that enable organizations to innovate and scale in the cloud.
As the cloud computing landscape evolves, AWS's focus on performance, security, and developer experience positions it to meet the challenges of tomorrow's workloads. Whether you're running real-time financial systems, building AI-powered applications, or managing complex Kubernetes environments, the latest AWS innovations provide the tools needed to succeed in an increasingly competitive digital economy.

