Marvell Blogs

Marvell Newsroom

Latest Marvell Blog Articles

  • November 20, 2025

    Video Series: The Future of Optical Technology

    By Vienna Alexander, Marketing Content Professional, Marvell

    Optical connectivity is the backbone of AI servers and an expanding opportunity where Marvell shines, given its comprehensive optical connectivity portfolio.

    Marvell showcased its notable developments at ECOC, the European Conference on Optical Communication, alongside various companies contributing to the hardware needed for this AI era.

    Learn more about these impactful optical innovations that are enabling AI infrastructure, along with broader market trends.


  • November 20, 2025

    The Next Step for AI Storage: GPU-initiated and CPU-initiated Storage

    By Chander Chadha, Director of Marketing, Flash Storage Products, Marvell

    AI is all about dichotomies. Distinct computing architectures and processors have been developed for training and inference workloads. In the past two years, scale-up and scale-out networks have emerged.

    Soon, the same will happen in storage.

    AI infrastructure demand is prompting storage companies to develop SSDs, controllers, NAND and other technologies fine-tuned to support GPUs, with an emphasis on higher IOPS (input/output operations per second) for AI inference. These will be fundamentally different from CPU-connected drives, where latency and capacity are the bigger focus points. This drive bifurcation also likely won't be the last; expect to see drives further optimized for training or for inference.

    As in other technology markets, the changes are driven by the rapid growth of AI and the equally rapid growth in the need to boost the performance, efficiency and TCO of AI infrastructure. The total amount of SSD capacity inside data centers is expected to double to approximately 2 zettabytes by 2028, with the growth primarily fueled by AI.1 By that year, SSDs will account for 41% of the installed base of data center drives, up from 25% in 2023.1

    Greater storage capacity, however, also potentially means more storage network complexity, latency, and storage management overhead. It also potentially means more power. In 2023, SSDs accounted for 4 terawatt hours (TWh) of data center power, or around 25% of the 16 TWh consumed by storage. By 2028, SSDs are slated to account for 11 TWh, or 50%, of storage's expected total for the year.1

    While storage represents less than five percent of total data center power consumption, the total remains large and provides an incentive for savings. Reducing storage power by even 1 TWh, or less than 10%, would save enough electricity to power 90,000 US homes for a year.2 Finding the precise balance between capacity, speed, power and cost will be critical for both AI data center operators and customers. Creating distinct categories of technologies is the first step toward optimizing products in a scalable way.
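    The power figures above can be sanity-checked with a few lines of arithmetic. This is a back-of-the-envelope sketch; the per-home consumption is derived from the article's own numbers, not an official statistic:

```python
# Back-of-the-envelope check of the storage power figures cited above.

ssd_twh_2023 = 4       # SSD share of data center storage power, 2023 (TWh)
storage_twh_2023 = 16  # total data center storage power, 2023 (TWh)
ssd_twh_2028 = 11      # projected SSD power, 2028 (TWh)

# SSD share of storage power in 2023, and the 2028 total implied by
# SSDs being 50% of it.
share_2023 = ssd_twh_2023 / storage_twh_2023   # 0.25 -> 25%
storage_twh_2028 = ssd_twh_2028 / 0.50         # 22 TWh implied total

# Annual consumption per US home implied by "1 TWh powers 90,000 homes".
kwh_per_home = 1e9 / 90_000   # 1 TWh = 1e9 kWh

print(f"2023 SSD share of storage power: {share_2023:.0%}")
print(f"Implied 2028 storage total: {storage_twh_2028:.0f} TWh")
print(f"Implied annual use per home: {kwh_per_home:,.0f} kWh")
```

    The roughly 11,100 kWh per home implied here is close to the published US residential average, so the article's figures are internally consistent.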

  • November 06, 2025

    Marvell Wins LEAP Award for 1.6T LPO Optical Chipset

    By Vienna Alexander, Marketing Content Professional, Marvell

    Marvell was announced as the top Connectivity winner in the 2025 LEAP Awards for its 1.6 Tbps LPO Optical Chipset. The judges' remarks noted that “the value case writes itself—less power, reduced complexity but substantial bandwidth increase.” Marvell earned the gold spot, reaffirming the industry-leading connectivity portfolio it is continually building.

    The LEAP (Leadership in Engineering Achievement Program) Awards recognize best-in-class product and component designs across 11 categories with the feedback of an independent judging panel of experts. These awards are published by Design World, the trade magazine that covers design engineering topics in detail.

    This chipset, combining a 200G/lane TIA (transimpedance amplifier) and laser drivers, enables 800G and 1.6T linear-drive pluggable optics (LPO) modules. LPO modules offer longer reach than passive copper, at low power and low latency, and are designed for scale-up compute-fabric applications.

  • November 03, 2025

    Co-packaged Optics: Powering the Next Wave of AI Data Center Innovation

    By Chris McCormick, Product Management Director, Cloud Platform Group, Marvell

    Co-packaged optics (CPO) will play a fundamental role in improving the performance, efficiency, and capabilities of networks, especially the scale-up fabrics for AI systems.

    Realizing these benefits will also require a fundamental transformation in the way computing and switching assets are designed and deployed in data centers. Marvell is partnering with equipment manufacturers, cable specialists, interconnect companies and others to ensure the infrastructure for delivering CPO will be ready when customers are ready to adopt it.

    The Trends Driving CPO

    AI’s insatiable appetite for bandwidth and the physical limitations of copper are driving demand for CPO. Network bandwidth doubles every two to three years, and the reach of copper shrinks meaningfully as bandwidth increases. Meanwhile, data center operators are clamoring for better performance per watt and per rack.

    CPO ameliorates the problem by moving the conversion of electrical to optical from an external slot on the faceplate to a position as close to the ASIC as possible. This shortens the copper trace, which may improve the link budget enough to remove digital signal processor (DSP) or retimer functionality, thereby reducing the overall power per bit, a key metric in AI data center management. Achieving commercial viability and scalability, however, has taken years of R&D across the ecosystem, and the benefits will likely depend on the use cases and applications where CPO is deployed.

    While analyst firms such as LightCounting predict that optical modules will continue to constitute the majority of optical links inside data centers through the decade,1 CPO will likely become a meaningful segment.
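    Power per bit is the metric that makes the DSP-removal argument concrete. A minimal sketch of the calculation, with deliberately round, assumed wattages for illustration only, not Marvell product specs:

```python
# Illustrative power-per-bit comparison. The wattages below are assumed,
# round figures chosen for the arithmetic, not measured product numbers.

def pj_per_bit(watts: float, gbps: float) -> float:
    """Energy per bit in picojoules: watts / (bits per second), in pJ."""
    return watts / (gbps * 1e9) * 1e12

# A hypothetical 1.6T pluggable module with a DSP vs. a CPO link
# that removes the DSP/retimer stage.
dsp_module = pj_per_bit(watts=25.0, gbps=1600)  # ~15.6 pJ/bit
cpo_link   = pj_per_bit(watts=8.0,  gbps=1600)  # ~5 pJ/bit

print(f"DSP pluggable: {dsp_module:.1f} pJ/bit")
print(f"CPO link:      {cpo_link:.1f} pJ/bit")
```

    Whatever the exact wattages, dividing link power by line rate this way is how operators compare interconnect options at data center scale.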

    The CPO Server Tray

    The image below shows a conceptual AI compute tray with CPO, developed with products from SENKO Advanced Components and Marvell. The design contains room for four XPUs and up to 102.4 Tbps of bandwidth delivered through 1,024 optical fibers, all in a 1U tray. The density and reach enabled by CPO open the door to scale-up domains far beyond what is possible with copper alone.

    Co-packaged optics for AI Scale-up

    When asked at recent trade shows how many fibers the tray contained, most attendees guessed around 250 fibers. The actual number is 1,152 fibers.
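    The fiber count and the bandwidth figure are consistent if each link uses one transmit and one receive fiber at 200 Gbps per lane, an assumption on my part; the article does not break down the 128 fibers beyond the 1,024 carrying data:

```python
# Relating the tray's data fibers to its aggregate bandwidth, assuming
# one tx + one rx fiber per duplex link at 200 Gbps per lane.
DATA_FIBERS = 1024
FIBERS_PER_LINK = 2   # transmit + receive
LANE_GBPS = 200

links = DATA_FIBERS // FIBERS_PER_LINK   # 512 duplex links
total_tbps = links * LANE_GBPS / 1000    # 102.4 Tbps aggregate
print(f"{links} links -> {total_tbps} Tbps")
```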

  • October 14, 2025

    AI Scale Up Goes for Distance with 9-meter 800G AEC from Infraeo and Marvell

    By Winnie Wu, Senior Director, Product Marketing, Marvell

    Welcome to the beginning of row-scale computing.

    At the 2025 OCP Global Summit, Marvell and Infraeo will showcase a breakthrough in high-speed interconnect technology — a 9-meter active electrical cable (AEC) capable of transmitting 800G across standard copper. The demonstration will take place in the Marvell booth #B1.

    This latest innovation brings data center architecture one step closer to full row-scale AI system design, allowing copper connections that stretch across seven racks, nearly the length of a standard 10-rack row. It builds on Marvell's prior 7-meter AEC, demonstrated at OFC 2025, pushing high-speed copper technology even further beyond what was thought possible.

    Pushing the Boundaries of Copper

    Until now, copper connections in large-scale AI systems have been limited by reach. Traditional electrical cables lose signal quality as distance increases, restricting system architects to a few meters between servers or racks. The 9-meter AEC changes that equation.

    By combining high-performance digital signal processing (DSP) with advanced noise reduction and signal integrity engineering, the new design extends copper’s effective range well beyond conventional limits, maintaining clean, low-latency data transfer over distances once thought achievable only with optical fiber.
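    The reach problem the DSP solves can be illustrated with a toy loss model: skin effect makes copper insertion loss grow roughly with length times the square root of frequency, so doubling the data rate shrinks the reach achievable within a fixed loss budget. All numbers here are assumptions for illustration, not measurements of any real cable, and an AEC's retimers are precisely what recover the budget this model exhausts:

```python
import math

# Toy model: copper insertion loss ~ length * sqrt(frequency)
# (skin-effect dominated). Figures below are illustrative assumptions.

LOSS_DB_PER_M_AT_13GHZ = 3.0  # assumed cable loss at a ~13 GHz Nyquist
BUDGET_DB = 28.0              # assumed end-to-end channel loss budget

def reach_m(nyquist_ghz: float) -> float:
    """Passive reach in meters within the loss budget at a given Nyquist."""
    loss_per_m = LOSS_DB_PER_M_AT_13GHZ * math.sqrt(nyquist_ghz / 13.0)
    return BUDGET_DB / loss_per_m

print(f"reach at 13 GHz Nyquist: {reach_m(13):.1f} m")
print(f"reach at 26 GHz Nyquist: {reach_m(26):.1f} m")  # shorter by ~1/sqrt(2)
```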

Archives